Compare commits
73 Commits
feature/em
...
16983fd871
| Author | SHA1 | Date | |
|---|---|---|---|
|
|
16983fd871 | ||
|
|
ff49589f32 | ||
|
|
4abb442c50 | ||
|
|
1c004eb7d6 | ||
|
|
32544d4f4a | ||
|
|
1ee9af8f28 | ||
|
|
70599083b8 | ||
|
|
6a38189ef0 | ||
|
|
c9d58173f3 | ||
|
|
3dd2ff50d8 | ||
|
|
378265c3a3 | ||
|
|
30c0132a92 | ||
|
|
20d0652c85 | ||
|
|
4e02cd8eaa | ||
|
|
48b0087b01 | ||
|
|
c291559366 | ||
|
|
622cf9ac76 | ||
|
|
df5e124115 | ||
|
|
2b4cb1456f | ||
|
|
c2e5445196 | ||
|
|
360b69effc | ||
|
|
3bc8bb9e0c | ||
|
|
7ca4823573 | ||
|
|
5325169495 | ||
|
|
74cdd5dc57 | ||
|
|
574f20bfd7 | ||
|
|
c8249c94d6 | ||
|
|
51a1a69b45 | ||
|
|
6ee2d67027 | ||
|
|
9472210d8b | ||
|
|
62f017c4ca | ||
|
|
857871273d | ||
|
|
a9521a9da1 | ||
|
|
1f41974208 | ||
|
|
b68fde71c0 | ||
|
|
f61be90d5c | ||
|
|
87f37567fb | ||
|
|
9e684a956b | ||
|
|
379ef0d209 | ||
|
|
b1ff684df6 | ||
|
|
0032590030 | ||
|
|
db38d68399 | ||
|
|
48a0865199 | ||
|
|
7daed22d2d | ||
|
|
6c02b8831f | ||
|
|
042f80dd2a | ||
|
|
a472c11ac8 | ||
|
|
a037dd92af | ||
|
|
634de0b6a5 | ||
|
|
b4635b56a3 | ||
|
|
bfb68e81a8 | ||
|
|
1247a3420e | ||
|
|
6caee8a007 | ||
|
|
cf93d3b3a6 | ||
|
|
53cb296adc | ||
|
|
f71b70d03b | ||
|
|
0bdc3f4731 | ||
|
|
345065c03b | ||
|
|
c50dc618bf | ||
|
|
96e6ced8da | ||
|
|
4738ae3a75 | ||
|
|
591ca5a149 | ||
|
|
41ff3a9961 | ||
|
|
0101c0ca2c | ||
| 58bb7f8229 | |||
| a5628019f8 | |||
|
|
b1dbd85346 | ||
|
|
aae8f8926b | ||
|
|
f1773873d4 | ||
|
|
54df3c471e | ||
|
|
64f97f5648 | ||
|
|
c0b3ae3b72 | ||
|
|
e5596b5a17 |
@@ -14,11 +14,18 @@ JWT_SECRET=secure_jwt_secret_here
|
||||
# Application Authentication
|
||||
APP_PASSWORD=application_password_here
|
||||
|
||||
# Search Engine Configuration
|
||||
SEARCH_ENGINE=typesense
|
||||
|
||||
# Typesense Search Configuration
|
||||
TYPESENSE_API_KEY=secure_api_key_here
|
||||
TYPESENSE_ENABLED=true
|
||||
TYPESENSE_REINDEX_INTERVAL=3600000
|
||||
|
||||
# OpenSearch Configuration
|
||||
OPENSEARCH_USERNAME=admin
|
||||
OPENSEARCH_PASSWORD=secure_opensearch_password_here
|
||||
|
||||
# Image Storage
|
||||
IMAGE_STORAGE_PATH=/app/images
|
||||
|
||||
|
||||
@@ -18,10 +18,9 @@ JWT_SECRET=REPLACE_WITH_SECURE_JWT_SECRET_MINIMUM_32_CHARS
|
||||
# Use a strong password in production
|
||||
APP_PASSWORD=REPLACE_WITH_SECURE_APP_PASSWORD
|
||||
|
||||
# Typesense Search Configuration
|
||||
TYPESENSE_API_KEY=REPLACE_WITH_SECURE_TYPESENSE_API_KEY
|
||||
TYPESENSE_ENABLED=true
|
||||
TYPESENSE_REINDEX_INTERVAL=3600000
|
||||
# OpenSearch Configuration
|
||||
#OPENSEARCH_PASSWORD=REPLACE_WITH_SECURE_OPENSEARCH_PASSWORD
|
||||
SEARCH_ENGINE=opensearch
|
||||
|
||||
# Image Storage
|
||||
IMAGE_STORAGE_PATH=/app/images
|
||||
|
||||
220
ASYNC_IMAGE_PROCESSING.md
Normal file
220
ASYNC_IMAGE_PROCESSING.md
Normal file
@@ -0,0 +1,220 @@
|
||||
# Async Image Processing Implementation
|
||||
|
||||
## Overview
|
||||
|
||||
The image processing system has been updated to handle external images asynchronously, preventing timeouts when processing stories with many images. This provides real-time progress updates to users showing which images are being processed.
|
||||
|
||||
## Backend Components
|
||||
|
||||
### 1. `ImageProcessingProgressService`
|
||||
- Tracks progress for individual story image processing sessions
|
||||
- Thread-safe with `ConcurrentHashMap` for multi-user support
|
||||
- Provides progress information: total images, processed count, current image, status, errors
|
||||
|
||||
### 2. `AsyncImageProcessingService`
|
||||
- Handles asynchronous image processing using Spring's `@Async` annotation
|
||||
- Counts external images before processing
|
||||
- Provides progress callbacks during processing
|
||||
- Updates story content when processing completes
|
||||
- Automatic cleanup of progress data after completion
|
||||
|
||||
### 3. Enhanced `ImageService`
|
||||
- Added `processContentImagesWithProgress()` method with callback support
|
||||
- Progress callbacks provide real-time updates during image download/processing
|
||||
- Maintains compatibility with existing synchronous processing
|
||||
|
||||
### 4. Updated `StoryController`
|
||||
- `POST /api/stories` and `PUT /api/stories/{id}` now trigger async image processing
|
||||
- `GET /api/stories/{id}/image-processing-progress` endpoint for progress polling
|
||||
- Processing starts immediately after story save and returns control to user
|
||||
|
||||
## Frontend Components
|
||||
|
||||
### 1. `ImageProcessingProgressTracker` (Utility Class)
|
||||
```typescript
|
||||
const tracker = new ImageProcessingProgressTracker(storyId);
|
||||
tracker.onProgress((progress) => {
|
||||
console.log(`Processing ${progress.processedImages}/${progress.totalImages}`);
|
||||
});
|
||||
tracker.onComplete(() => console.log('Done!'));
|
||||
tracker.start();
|
||||
```
|
||||
|
||||
### 2. `ImageProcessingProgressComponent` (React Component)
|
||||
```tsx
|
||||
<ImageProcessingProgressComponent
|
||||
storyId={storyId}
|
||||
autoStart={true}
|
||||
onComplete={() => refreshStory()}
|
||||
/>
|
||||
```
|
||||
|
||||
## User Experience
|
||||
|
||||
### Before (Synchronous)
|
||||
1. User saves story with external images
|
||||
2. Request hangs for 30+ seconds processing images
|
||||
3. Browser may timeout
|
||||
4. No feedback about progress
|
||||
5. User doesn't know if it's working
|
||||
|
||||
### After (Asynchronous)
|
||||
1. User saves story with external images
|
||||
2. Save completes immediately
|
||||
3. Progress indicator appears: "Processing 5 images. Currently image 2 of 5..."
|
||||
4. User can continue using the application
|
||||
5. Progress updates every second
|
||||
6. Story automatically refreshes when processing completes
|
||||
|
||||
## API Endpoints
|
||||
|
||||
### Progress Endpoint
|
||||
```
|
||||
GET /api/stories/{id}/image-processing-progress
|
||||
```
|
||||
|
||||
**Response when processing:**
|
||||
```json
|
||||
{
|
||||
"isProcessing": true,
|
||||
"totalImages": 5,
|
||||
"processedImages": 2,
|
||||
"currentImageUrl": "https://example.com/image.jpg",
|
||||
"status": "Processing image 3 of 5",
|
||||
"progressPercentage": 40.0,
|
||||
"completed": false,
|
||||
"error": ""
|
||||
}
|
||||
```
|
||||
|
||||
**Response when completed:**
|
||||
```json
|
||||
{
|
||||
"isProcessing": false,
|
||||
"totalImages": 5,
|
||||
"processedImages": 5,
|
||||
"currentImageUrl": "",
|
||||
"status": "Completed: 5 images processed",
|
||||
"progressPercentage": 100.0,
|
||||
"completed": true,
|
||||
"error": ""
|
||||
}
|
||||
```
|
||||
|
||||
**Response when no processing:**
|
||||
```json
|
||||
{
|
||||
"isProcessing": false,
|
||||
"message": "No active image processing"
|
||||
}
|
||||
```
|
||||
|
||||
## Integration Examples
|
||||
|
||||
### React Hook Usage
|
||||
```tsx
|
||||
import { useImageProcessingProgress } from '../utils/imageProcessingProgress';
|
||||
|
||||
function StoryEditor({ storyId }) {
|
||||
const { progress, isTracking, startTracking } = useImageProcessingProgress(storyId);
|
||||
|
||||
const handleSave = async () => {
|
||||
await saveStory();
|
||||
startTracking(); // Start monitoring progress
|
||||
};
|
||||
|
||||
return (
|
||||
<div>
|
||||
{isTracking && progress && (
|
||||
<div className="progress-indicator">
|
||||
Processing {progress.processedImages}/{progress.totalImages} images...
|
||||
</div>
|
||||
)}
|
||||
<button onClick={handleSave}>Save Story</button>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
```
|
||||
|
||||
### Manual Progress Tracking
|
||||
```typescript
|
||||
// After saving a story with external images
|
||||
const tracker = new ImageProcessingProgressTracker(storyId);
|
||||
|
||||
tracker.onProgress((progress) => {
|
||||
updateProgressBar(progress.progressPercentage);
|
||||
showStatus(progress.status);
|
||||
if (progress.currentImageUrl) {
|
||||
showCurrentImage(progress.currentImageUrl);
|
||||
}
|
||||
});
|
||||
|
||||
tracker.onComplete((finalProgress) => {
|
||||
hideProgressBar();
|
||||
showNotification('Image processing completed!');
|
||||
refreshStoryContent(); // Reload story with processed images
|
||||
});
|
||||
|
||||
tracker.onError((error) => {
|
||||
hideProgressBar();
|
||||
showError(`Image processing failed: ${error}`);
|
||||
});
|
||||
|
||||
tracker.start();
|
||||
```
|
||||
|
||||
## Configuration
|
||||
|
||||
### Polling Interval
|
||||
Default: 1 second (1000ms)
|
||||
```typescript
|
||||
const tracker = new ImageProcessingProgressTracker(storyId, 500); // Poll every 500ms
|
||||
```
|
||||
|
||||
### Timeout
|
||||
Default: 5 minutes (300000ms)
|
||||
```typescript
|
||||
const tracker = new ImageProcessingProgressTracker(storyId, 1000, 600000); // 10 minute timeout
|
||||
```
|
||||
|
||||
### Spring Async Configuration
|
||||
The backend uses Spring's default async executor. For production, consider configuring a custom thread pool in your application properties:
|
||||
|
||||
```yaml
|
||||
spring:
|
||||
task:
|
||||
execution:
|
||||
pool:
|
||||
core-size: 4
|
||||
max-size: 8
|
||||
queue-capacity: 100
|
||||
```
|
||||
|
||||
## Error Handling
|
||||
|
||||
### Backend Errors
|
||||
- Network timeouts downloading images
|
||||
- Invalid image formats
|
||||
- Disk space issues
|
||||
- All errors are logged and returned in progress status
|
||||
|
||||
### Frontend Errors
|
||||
- Network failures during progress polling
|
||||
- Timeout if processing takes too long
|
||||
- Graceful degradation - user can continue working
|
||||
|
||||
## Benefits
|
||||
|
||||
1. **No More Timeouts**: Large image processing operations won't timeout HTTP requests
|
||||
2. **Better UX**: Users get real-time feedback about processing progress
|
||||
3. **Improved Performance**: Users can continue using the app while images process
|
||||
4. **Error Visibility**: Clear error messages when image processing fails
|
||||
5. **Scalability**: Multiple users can process images simultaneously without blocking
|
||||
|
||||
## Future Enhancements
|
||||
|
||||
1. **WebSocket Support**: Replace polling with WebSocket for real-time push updates
|
||||
2. **Batch Processing**: Queue multiple stories for batch image processing
|
||||
3. **Retry Logic**: Automatic retry for failed image downloads
|
||||
4. **Progress Persistence**: Save progress to database for recovery after server restart
|
||||
5. **Image Optimization**: Automatic resize/compress images during processing
|
||||
137
DEPLOYMENT.md
Normal file
137
DEPLOYMENT.md
Normal file
@@ -0,0 +1,137 @@
|
||||
# StoryCove Deployment Guide
|
||||
|
||||
## Quick Deployment
|
||||
|
||||
StoryCove includes an automated deployment script that handles Solr volume cleanup and ensures fresh search indices on every deployment.
|
||||
|
||||
### Using the Deployment Script
|
||||
|
||||
```bash
|
||||
./deploy.sh
|
||||
```
|
||||
|
||||
This script will:
|
||||
1. Stop all running containers
|
||||
2. **Remove the Solr data volume** (forcing fresh core creation)
|
||||
3. Build and start all containers
|
||||
4. Wait for services to become healthy
|
||||
5. Trigger automatic bulk reindexing
|
||||
|
||||
### What Happens During Deployment
|
||||
|
||||
#### 1. Solr Volume Cleanup
|
||||
The script removes the `storycove_solr_data` volume, which:
|
||||
- Ensures all Solr cores are recreated from scratch
|
||||
- Prevents stale configuration issues
|
||||
- Guarantees schema changes are applied
|
||||
|
||||
#### 2. Automatic Bulk Reindexing
|
||||
When the backend starts, it automatically:
|
||||
- Detects that Solr is available
|
||||
- Fetches all entities from the database (Stories, Authors, Collections)
|
||||
- Bulk indexes them into Solr
|
||||
- Logs progress and completion
|
||||
|
||||
### Monitoring the Deployment
|
||||
|
||||
Watch the backend logs to see reindexing progress:
|
||||
```bash
|
||||
docker-compose logs -f backend
|
||||
```
|
||||
|
||||
You should see output like:
|
||||
```
|
||||
========================================
|
||||
Starting automatic bulk reindexing...
|
||||
========================================
|
||||
📚 Indexing stories...
|
||||
✅ Indexed 150 stories
|
||||
👤 Indexing authors...
|
||||
✅ Indexed 45 authors
|
||||
📂 Indexing collections...
|
||||
✅ Indexed 12 collections
|
||||
========================================
|
||||
✅ Bulk reindexing completed successfully in 2345ms
|
||||
📊 Total indexed: 150 stories, 45 authors, 12 collections
|
||||
========================================
|
||||
```
|
||||
|
||||
## Manual Deployment (Without Script)
|
||||
|
||||
If you prefer manual control:
|
||||
|
||||
```bash
|
||||
# Stop containers
|
||||
docker-compose down
|
||||
|
||||
# Remove Solr volume
|
||||
docker volume rm storycove_solr_data
|
||||
|
||||
# Start containers
|
||||
docker-compose up -d --build
|
||||
```
|
||||
|
||||
The automatic reindexing will still occur on startup.
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Reindexing Fails
|
||||
|
||||
If bulk reindexing fails:
|
||||
1. Check Solr is running: `docker-compose logs solr`
|
||||
2. Verify Solr health: `curl http://localhost:8983/solr/admin/ping`
|
||||
3. Check backend logs: `docker-compose logs backend`
|
||||
|
||||
The application will still start even if reindexing fails - you can manually trigger reindexing through the admin API.
|
||||
|
||||
### Solr Cores Not Created
|
||||
|
||||
If Solr cores aren't being created properly:
|
||||
1. Check the `solr.Dockerfile` to ensure cores are created
|
||||
2. Verify the Solr image builds correctly: `docker-compose build solr`
|
||||
3. Check Solr Admin UI: http://localhost:8983
|
||||
|
||||
### Performance Issues
|
||||
|
||||
If reindexing takes too long:
|
||||
- The bulk indexing is already optimized (batch operations)
|
||||
- Consider increasing Solr memory in `docker-compose.yml`:
|
||||
```yaml
|
||||
environment:
|
||||
- SOLR_HEAP=1024m
|
||||
```
|
||||
|
||||
## Development Workflow
|
||||
|
||||
### Daily Development
|
||||
Just use the normal commands:
|
||||
```bash
|
||||
docker-compose up -d
|
||||
```
|
||||
|
||||
The automatic reindexing still happens, but it's fast on small datasets.
|
||||
|
||||
### Schema Changes
|
||||
When you modify Solr schema or add new cores:
|
||||
```bash
|
||||
./deploy.sh
|
||||
```
|
||||
|
||||
This ensures a clean slate.
|
||||
|
||||
### Skipping Reindexing
|
||||
|
||||
Reindexing is automatic and cannot be disabled. It's designed to be fast and unobtrusive. The application starts immediately - reindexing happens in the background.
|
||||
|
||||
## Environment Variables
|
||||
|
||||
No additional environment variables are needed for the deployment script. All configuration is in `docker-compose.yml`.
|
||||
|
||||
## Backup Considerations
|
||||
|
||||
**Important**: Since the Solr volume is recreated on every deployment, you should:
|
||||
- Never rely on Solr as the source of truth
|
||||
- Always maintain data in PostgreSQL
|
||||
- Solr is treated as a disposable cache/index
|
||||
|
||||
This is the recommended approach for search indices.
|
||||
539
HOUSEKEEPING_COMPLETE_REPORT.md
Normal file
539
HOUSEKEEPING_COMPLETE_REPORT.md
Normal file
@@ -0,0 +1,539 @@
|
||||
# StoryCove Housekeeping Complete Report
|
||||
**Date:** 2025-10-10
|
||||
**Scope:** Comprehensive audit of backend, frontend, tests, and documentation
|
||||
**Overall Grade:** A- (90%)
|
||||
|
||||
---
|
||||
|
||||
## Executive Summary
|
||||
|
||||
StoryCove is a **production-ready** self-hosted short story library application with **excellent architecture** and **comprehensive feature implementation**. The codebase demonstrates professional-grade engineering with only one critical issue blocking 100% compliance.
|
||||
|
||||
### Key Highlights ✅
|
||||
- ✅ **Entity layer:** 100% specification compliant
|
||||
- ✅ **EPUB Import/Export:** Phase 2 fully implemented
|
||||
- ✅ **Tag Enhancement:** Aliases, merging, AI suggestions complete
|
||||
- ✅ **Multi-Library Support:** Robust isolation with security
|
||||
- ✅ **HTML Sanitization:** Shared backend/frontend config with DOMPurify
|
||||
- ✅ **Advanced Search:** 15+ filter parameters, Solr integration
|
||||
- ✅ **Reading Experience:** Progress tracking, TOC, series navigation
|
||||
|
||||
### Critical Issue 🚨
|
||||
1. **Collections Search Not Implemented** (CollectionService.java:56-61)
|
||||
- GET /api/collections returns empty results
|
||||
- Requires Solr Collections core implementation
|
||||
- Estimated: 4-6 hours to fix
|
||||
|
||||
---
|
||||
|
||||
## Phase 1: Documentation & State Assessment (COMPLETED)
|
||||
|
||||
### Entity Models - Grade: A+ (100%)
|
||||
|
||||
All 7 entity models are **specification-perfect**:
|
||||
|
||||
| Entity | Spec Compliance | Key Features | Status |
|
||||
|--------|----------------|--------------|--------|
|
||||
| **Story** | 100% | All 14 fields, reading progress, series support | ✅ Perfect |
|
||||
| **Author** | 100% | Rating, avatar, URL collections | ✅ Perfect |
|
||||
| **Tag** | 100% | Color (7-char hex), description (500 chars), aliases | ✅ Perfect |
|
||||
| **Collection** | 100% | Gap-based positioning, calculated properties | ✅ Perfect |
|
||||
| **Series** | 100% | Name, description, stories relationship | ✅ Perfect |
|
||||
| **ReadingPosition** | 100% | EPUB CFI, context, percentage tracking | ✅ Perfect |
|
||||
| **TagAlias** | 100% | Alias resolution, merge tracking | ✅ Perfect |
|
||||
|
||||
**Verification:**
|
||||
- `Story.java:1-343`: All fields match DATA_MODEL.md
|
||||
- `Collection.java:1-245`: Helper methods for story management
|
||||
- `ReadingPosition.java:1-230`: Complete EPUB CFI support
|
||||
- `TagAlias.java:1-113`: Proper canonical tag resolution
|
||||
|
||||
### Repository Layer - Grade: A+ (100%)
|
||||
|
||||
**Best Practices Verified:**
|
||||
- ✅ No search anti-patterns (CollectionRepository correctly delegates to search service)
|
||||
- ✅ Proper use of `@Query` annotations for complex operations
|
||||
- ✅ Efficient eager loading with JOIN FETCH
|
||||
- ✅ Return types: Page<T> for pagination, List<T> for unbounded
|
||||
|
||||
**Files Audited:**
|
||||
- `CollectionRepository.java:1-55` - ID-based lookups only
|
||||
- `StoryRepository.java` - Complex queries with associations
|
||||
- `AuthorRepository.java` - Join fetch for stories
|
||||
- `TagRepository.java` - Alias-aware queries
|
||||
|
||||
---
|
||||
|
||||
## Phase 2: Backend Implementation Audit (COMPLETED)
|
||||
|
||||
### Service Layer - Grade: A (95%)
|
||||
|
||||
#### Core Services ✅
|
||||
|
||||
**StoryService.java** (794 lines)
|
||||
- ✅ CRUD with search integration
|
||||
- ✅ HTML sanitization on create/update (line 490, 528-532)
|
||||
- ✅ Reading progress management
|
||||
- ✅ Tag alias resolution
|
||||
- ✅ Random story with 15+ filters
|
||||
|
||||
**AuthorService.java** (317 lines)
|
||||
- ✅ Avatar management
|
||||
- ✅ Rating validation (1-5 range)
|
||||
- ✅ Search index synchronization
|
||||
- ✅ URL management
|
||||
|
||||
**TagService.java** (491 lines)
|
||||
- ✅ **Tag Enhancement spec 100% complete**
|
||||
- ✅ Alias system: addAlias(), removeAlias(), resolveTagByName()
|
||||
- ✅ Tag merging with atomic operations
|
||||
- ✅ AI tag suggestions with confidence scoring
|
||||
- ✅ Merge preview functionality
|
||||
|
||||
**CollectionService.java** (452 lines)
|
||||
- ⚠️ **CRITICAL ISSUE at lines 56-61:**
|
||||
```java
|
||||
public SearchResultDto<Collection> searchCollections(...) {
|
||||
logger.warn("Collections search not yet implemented in Solr, returning empty results");
|
||||
return new SearchResultDto<>(new ArrayList<>(), 0, page, limit, query != null ? query : "", 0);
|
||||
}
|
||||
```
|
||||
- ✅ All other CRUD operations work correctly
|
||||
- ✅ Gap-based positioning for story reordering
|
||||
|
||||
#### EPUB Services ✅
|
||||
|
||||
**EPUBImportService.java** (551 lines)
|
||||
- ✅ Metadata extraction (title, author, description, tags)
|
||||
- ✅ Cover image extraction and processing
|
||||
- ✅ Content image download and replacement
|
||||
- ✅ Reading position preservation
|
||||
- ✅ Author/series auto-creation
|
||||
|
||||
**EPUBExportService.java** (584 lines)
|
||||
- ✅ Single story export
|
||||
- ✅ Collection export (multi-story)
|
||||
- ✅ Chapter splitting by word count or HTML headings
|
||||
- ✅ Custom metadata and title support
|
||||
- ✅ XHTML compliance (fixHtmlForXhtml method)
|
||||
- ✅ Reading position inclusion
|
||||
|
||||
#### Advanced Services ✅
|
||||
|
||||
**HtmlSanitizationService.java** (222 lines)
|
||||
- ✅ Jsoup Safelist configuration
|
||||
- ✅ Loads config from `html-sanitization-config.json`
|
||||
- ✅ Figure tag preprocessing (lines 143-184)
|
||||
- ✅ Relative URL preservation (line 89)
|
||||
- ✅ Shared with frontend via `/api/config/html-sanitization`
|
||||
|
||||
**ImageService.java** (1122 lines)
|
||||
- ✅ Three image types: COVER, AVATAR, CONTENT
|
||||
- ✅ Content image processing with download
|
||||
- ✅ Orphaned image cleanup
|
||||
- ✅ Library-aware paths
|
||||
- ✅ Async processing support
|
||||
|
||||
**LibraryService.java** (830 lines)
|
||||
- ✅ Multi-library isolation
|
||||
- ✅ **Explicit authentication required** (lines 104-114)
|
||||
- ✅ Automatic schema creation for new libraries
|
||||
- ✅ Smart database routing (SmartRoutingDataSource)
|
||||
- ✅ Async Solr reindexing on library switch (lines 164-193)
|
||||
- ✅ BCrypt password encryption
|
||||
|
||||
**DatabaseManagementService.java** (1206 lines)
|
||||
- ✅ ZIP-based complete backup with pg_dump
|
||||
- ✅ Restore with schema creation
|
||||
- ✅ Manual reindexing from database (lines 1047-1097)
|
||||
- ✅ Security: ZIP path validation
|
||||
|
||||
**SearchServiceAdapter.java** (287 lines)
|
||||
- ✅ Unified search interface
|
||||
- ✅ Delegates to SolrService
|
||||
- ✅ Bulk indexing operations
|
||||
- ✅ Tag suggestions
|
||||
|
||||
**SolrService.java** (1115 lines)
|
||||
- ✅ Two cores: stories and authors
|
||||
- ✅ Advanced filtering with 20+ parameters
|
||||
- ✅ Library-aware filtering
|
||||
- ✅ Faceting support
|
||||
- ⚠️ **No Collections core** (known issue)
|
||||
|
||||
### Controller Layer - Grade: A (95%)
|
||||
|
||||
**StoryController.java** (1000+ lines)
|
||||
- ✅ Comprehensive REST API
|
||||
- ✅ CRUD operations
|
||||
- ✅ EPUB import/export endpoints
|
||||
- ✅ Async content image processing with progress
|
||||
- ✅ Duplicate detection
|
||||
- ✅ Advanced search with 15+ filters
|
||||
- ✅ Random story endpoint
|
||||
- ✅ Reading progress tracking
|
||||
|
||||
**CollectionController.java** (538 lines)
|
||||
- ✅ Full CRUD operations
|
||||
- ✅ Cover image upload/removal
|
||||
- ✅ Story reordering
|
||||
- ✅ EPUB collection export
|
||||
- ⚠️ Search returns empty (known issue)
|
||||
- ✅ Lightweight DTOs to avoid circular references
|
||||
|
||||
**SearchController.java** (57 lines)
|
||||
- ✅ Reindex endpoint
|
||||
- ✅ Health check
|
||||
- ⚠️ Minimal implementation (search is in StoryController)
|
||||
|
||||
---
|
||||
|
||||
## Phase 3: Frontend Implementation Audit (COMPLETED)
|
||||
|
||||
### API Client Layer - Grade: A+ (100%)
|
||||
|
||||
**api.ts** (994 lines)
|
||||
- ✅ Axios instance with interceptors
|
||||
- ✅ JWT token management (localStorage + httpOnly cookies)
|
||||
- ✅ Auto-redirect on 401/403
|
||||
- ✅ Comprehensive endpoints for all resources
|
||||
- ✅ Tag alias resolution in search (lines 576-585)
|
||||
- ✅ Advanced filter parameters (15+ filters)
|
||||
- ✅ Random story with Solr RandomSortField (lines 199-307)
|
||||
- ✅ Library-aware image URLs (lines 983-994)
|
||||
|
||||
**Endpoints Coverage:**
|
||||
- ✅ Stories: CRUD, search, random, EPUB import/export, duplicate check
|
||||
- ✅ Authors: CRUD, avatar, search
|
||||
- ✅ Tags: CRUD, aliases, merge, suggestions, autocomplete
|
||||
- ✅ Collections: CRUD, search, cover, reorder, EPUB export
|
||||
- ✅ Series: CRUD, search
|
||||
- ✅ Database: backup/restore (both SQL and complete)
|
||||
- ✅ Config: HTML sanitization, image cleanup
|
||||
- ✅ Search Admin: engine switching, reindex, library migration
|
||||
|
||||
### HTML Sanitization - Grade: A+ (100%)
|
||||
|
||||
**sanitization.ts** (368 lines)
|
||||
- ✅ **Shared configuration with backend** via `/api/config/html-sanitization`
|
||||
- ✅ DOMPurify with custom configuration
|
||||
- ✅ CSS property filtering (lines 20-47)
|
||||
- ✅ Figure tag preprocessing (lines 187-251) - **matches backend**
|
||||
- ✅ Async `sanitizeHtml()` and sync `sanitizeHtmlSync()`
|
||||
- ✅ Fallback configuration if backend unavailable
|
||||
- ✅ Config caching for performance
|
||||
|
||||
**Security Features:**
|
||||
- ✅ Allowlist-based tag filtering
|
||||
- ✅ CSS property whitelist
|
||||
- ✅ URL protocol validation
|
||||
- ✅ Relative URL preservation for local images
|
||||
|
||||
### Pages & Components - Grade: A (95%)
|
||||
|
||||
#### Library Page (LibraryContent.tsx - 341 lines)
|
||||
- ✅ Advanced search with debouncing
|
||||
- ✅ Tag facet enrichment with full tag data
|
||||
- ✅ URL parameter handling for filters
|
||||
- ✅ Three layout modes: sidebar, toolbar, minimal
|
||||
- ✅ Advanced filters integration
|
||||
- ✅ Random story with all filters applied
|
||||
- ✅ Pagination
|
||||
|
||||
#### Collections Page (page.tsx - 300 lines)
|
||||
- ✅ Search with tag filtering
|
||||
- ✅ Archive toggle
|
||||
- ✅ Grid/list view modes
|
||||
- ✅ Pagination
|
||||
- ⚠️ **Search returns empty results** (backend issue)
|
||||
|
||||
#### Story Reading Page (stories/[id]/page.tsx - 669 lines)
|
||||
- ✅ **Sophisticated reading experience:**
|
||||
- Reading progress bar with percentage
|
||||
- Auto-scroll to saved position
|
||||
- Debounced position saving (2 second delay)
|
||||
- Character position tracking
|
||||
- End-of-story detection with reset option
|
||||
- ✅ **Table of Contents:**
|
||||
- Auto-generated from headings
|
||||
- Modal overlay
|
||||
- Smooth scroll navigation
|
||||
- ✅ **Series Navigation:**
|
||||
- Previous/Next story links
|
||||
- Inline metadata display
|
||||
- ✅ **Memoized content rendering** to prevent re-sanitization on scroll
|
||||
- ✅ Preloaded sanitization config
|
||||
|
||||
#### Settings Page (SettingsContent.tsx - 183 lines)
|
||||
- ✅ Three tabs: Appearance, Content, System
|
||||
- ✅ Theme switching (light/dark)
|
||||
- ✅ Font customization (serif, sans, mono)
|
||||
- ✅ Font size control
|
||||
- ✅ Reading width preferences
|
||||
- ✅ Reading speed configuration
|
||||
- ✅ localStorage persistence
|
||||
|
||||
#### Slate Editor (SlateEditor.tsx - 942 lines)
|
||||
- ✅ **Rich text editing with Slate.js**
|
||||
- ✅ **Advanced image handling:**
|
||||
- Image paste with src preservation
|
||||
- Interactive image elements with edit/delete
|
||||
- Image error handling with fallback
|
||||
- External image indicators
|
||||
- ✅ **Formatting:**
|
||||
- Headings (H1, H2, H3)
|
||||
- Text formatting (bold, italic, underline, strikethrough)
|
||||
- Keyboard shortcuts (Ctrl+B, Ctrl+I, etc.)
|
||||
- ✅ **HTML conversion:**
|
||||
- Bidirectional HTML ↔ Slate conversion
|
||||
- Mixed content support (text + images)
|
||||
- Figure tag preprocessing
|
||||
- Sanitization integration
|
||||
|
||||
---
|
||||
|
||||
## Phase 4: Test Coverage Assessment (COMPLETED)
|
||||
|
||||
### Current Test Files (9 total):
|
||||
|
||||
**Entity Tests (5):**
|
||||
- ✅ `StoryTest.java` - Story entity validation
|
||||
- ✅ `AuthorTest.java` - Author entity validation
|
||||
- ✅ `TagTest.java` - Tag entity validation
|
||||
- ✅ `SeriesTest.java` - Series entity validation
|
||||
- ❌ Missing: CollectionTest, ReadingPositionTest, TagAliasTest
|
||||
|
||||
**Repository Tests (3):**
|
||||
- ✅ `StoryRepositoryTest.java` - Story persistence
|
||||
- ✅ `AuthorRepositoryTest.java` - Author persistence
|
||||
- ✅ `BaseRepositoryTest.java` - Base test configuration
|
||||
- ❌ Missing: TagRepository, SeriesRepository, CollectionRepository, ReadingPositionRepository
|
||||
|
||||
**Service Tests (2):**
|
||||
- ✅ `StoryServiceTest.java` - Story business logic
|
||||
- ✅ `AuthorServiceTest.java` - Author business logic
|
||||
- ❌ Missing: TagService, CollectionService, EPUBImportService, EPUBExportService, HtmlSanitizationService, ImageService, LibraryService, DatabaseManagementService, SeriesService, SearchServiceAdapter, SolrService
|
||||
|
||||
**Controller Tests:** ❌ None
|
||||
**Frontend Tests:** ❌ None
|
||||
|
||||
### Test Coverage Estimate: ~25%
|
||||
|
||||
**Missing HIGH Priority Tests:**
|
||||
1. CollectionServiceTest - Collections CRUD and search
|
||||
2. TagServiceTest - Alias, merge, AI suggestions
|
||||
3. EPUBImportServiceTest - Import logic verification
|
||||
4. EPUBExportServiceTest - Export format validation
|
||||
5. HtmlSanitizationServiceTest - **Security critical**
|
||||
6. ImageServiceTest - Image processing and download
|
||||
|
||||
**Missing MEDIUM Priority:**
|
||||
- SeriesServiceTest
|
||||
- LibraryServiceTest
|
||||
- DatabaseManagementServiceTest
|
||||
- SearchServiceAdapter/SolrServiceTest
|
||||
- All controller tests
|
||||
- All frontend component tests
|
||||
|
||||
**Recommended Action:**
|
||||
Create comprehensive test suite with target coverage of 80%+ for services, 70%+ for controllers.
|
||||
|
||||
---
|
||||
|
||||
## Phase 5: Documentation Review
|
||||
|
||||
### Specification Documents ✅
|
||||
|
||||
| Document | Status | Notes |
|
||||
|----------|--------|-------|
|
||||
| storycove-spec.md | ✅ Current | Core specification |
|
||||
| DATA_MODEL.md | ✅ Current | 100% implemented |
|
||||
| API.md | ⚠️ Needs minor updates | Missing some advanced filter docs |
|
||||
| TAG_ENHANCEMENT_SPECIFICATION.md | ✅ Current | 100% implemented |
|
||||
| EPUB_IMPORT_EXPORT_SPECIFICATION.md | ✅ Current | Phase 2 complete |
|
||||
| storycove-collections-spec.md | ⚠️ Known issue | Search not implemented |
|
||||
|
||||
### Implementation Reports ✅
|
||||
|
||||
- ✅ `HOUSEKEEPING_PHASE1_REPORT.md` - Detailed assessment
|
||||
- ✅ `HOUSEKEEPING_COMPLETE_REPORT.md` - This document
|
||||
|
||||
### Recommendations:
|
||||
|
||||
1. **Update API.md** to document:
|
||||
- Advanced search filters (15+ parameters)
|
||||
- Random story endpoint with filter support
|
||||
- EPUB import/export endpoints
|
||||
- Image processing endpoints
|
||||
|
||||
2. **Add MULTI_LIBRARY_SPEC.md** documenting:
|
||||
- Library isolation architecture
|
||||
- Authentication flow
|
||||
- Database routing
|
||||
- Search index separation
|
||||
|
||||
---
|
||||
|
||||
## Critical Findings Summary
|
||||
|
||||
### 🚨 CRITICAL (Must Fix)
|
||||
|
||||
1. **Collections Search Not Implemented**
|
||||
- **Location:** `CollectionService.java:56-61`
|
||||
- **Impact:** GET /api/collections always returns empty results
|
||||
- **Specification:** storycove-collections-spec.md lines 52-61 mandates Solr search
|
||||
- **Estimated Fix:** 4-6 hours
|
||||
- **Steps:**
|
||||
1. Create Solr Collections core with schema
|
||||
2. Implement indexing in SearchServiceAdapter
|
||||
3. Wire up CollectionService.searchCollections()
|
||||
4. Test pagination and filtering
|
||||
|
||||
### ⚠️ HIGH Priority (Recommended)
|
||||
|
||||
2. **Missing Test Coverage** (~25% vs target 80%)
|
||||
- HtmlSanitizationServiceTest - security critical
|
||||
- CollectionServiceTest - feature verification
|
||||
- TagServiceTest - complex logic (aliases, merge)
|
||||
- EPUBImportServiceTest, EPUBExportServiceTest - file processing
|
||||
|
||||
3. **API Documentation Updates**
|
||||
- Advanced filters not fully documented
|
||||
- EPUB endpoints missing from API.md
|
||||
|
||||
### 📋 MEDIUM Priority (Optional)
|
||||
|
||||
4. **SearchController Minimal**
|
||||
- Only has reindex and health check
|
||||
- Actual search in StoryController
|
||||
|
||||
5. **Frontend Test Coverage**
|
||||
- No component tests
|
||||
- No integration tests
|
||||
- Recommend: Jest + React Testing Library
|
||||
|
||||
---
|
||||
|
||||
## Strengths & Best Practices 🌟
|
||||
|
||||
### Architecture Excellence
|
||||
1. **Multi-Library Support**
|
||||
- Complete isolation with separate databases
|
||||
- Explicit authentication required
|
||||
- Smart routing with automatic reindexing
|
||||
- Library-aware image paths
|
||||
|
||||
2. **Security-First Design**
|
||||
- HTML sanitization with shared backend/frontend config
|
||||
- JWT authentication with httpOnly cookies
|
||||
- BCrypt password encryption
|
||||
- Input validation throughout
|
||||
|
||||
3. **Production-Ready Features**
|
||||
- Complete backup/restore system (pg_dump/psql)
|
||||
- Orphaned image cleanup
|
||||
- Async image processing with progress tracking
|
||||
- Reading position tracking with EPUB CFI
|
||||
|
||||
### Code Quality
|
||||
1. **Proper Separation of Concerns**
|
||||
- Repository anti-patterns avoided
|
||||
- Service layer handles business logic
|
||||
- Controllers are thin and focused
|
||||
- DTOs prevent circular references
|
||||
|
||||
2. **Error Handling**
|
||||
- Custom exceptions (ResourceNotFoundException, DuplicateResourceException)
|
||||
- Proper HTTP status codes
|
||||
- Fallback configurations
|
||||
|
||||
3. **Performance Optimizations**
|
||||
- Eager loading with JOIN FETCH
|
||||
- Memoized React components
|
||||
- Debounced search and autosave
|
||||
- Config caching
|
||||
|
||||
---
|
||||
|
||||
## Compliance Matrix
|
||||
|
||||
| Feature Area | Spec Compliance | Implementation Quality | Notes |
|
||||
|-------------|----------------|----------------------|-------|
|
||||
| **Entity Models** | 100% | A+ | Perfect spec match |
|
||||
| **Database Layer** | 100% | A+ | Best practices followed |
|
||||
| **EPUB Import/Export** | 100% | A | Phase 2 complete |
|
||||
| **Tag Enhancement** | 100% | A | Aliases, merge, AI complete |
|
||||
| **Collections** | 80% | B | Search not implemented |
|
||||
| **HTML Sanitization** | 100% | A+ | Shared config, security-first |
|
||||
| **Search** | 95% | A | Missing Collections core |
|
||||
| **Multi-Library** | 100% | A | Robust isolation |
|
||||
| **Reading Experience** | 100% | A+ | Sophisticated tracking |
|
||||
| **Image Processing** | 100% | A | Download, async, cleanup |
|
||||
| **Test Coverage** | 25% | C | Needs significant work |
|
||||
| **Documentation** | 90% | B+ | Minor updates needed |
|
||||
|
||||
---
|
||||
|
||||
## Recommendations by Priority
|
||||
|
||||
### Immediate (This Sprint)
|
||||
1. ✅ **Fix Collections Search** (4-6 hours)
|
||||
- Implement Solr Collections core
|
||||
- Wire up searchCollections()
|
||||
- Test thoroughly
|
||||
|
||||
### Short-Term (Next Sprint)
|
||||
2. ✅ **Create Critical Tests** (10-12 hours)
|
||||
- HtmlSanitizationServiceTest
|
||||
- CollectionServiceTest
|
||||
- TagServiceTest
|
||||
- EPUBImportServiceTest
|
||||
- EPUBExportServiceTest
|
||||
|
||||
3. ✅ **Update API Documentation** (2-3 hours)
|
||||
- Document advanced filters
|
||||
- Add EPUB endpoints
|
||||
- Update examples
|
||||
|
||||
### Medium-Term (Next Month)
|
||||
4. ✅ **Expand Test Coverage to 80%** (20-25 hours)
|
||||
- ImageServiceTest
|
||||
- LibraryServiceTest
|
||||
- DatabaseManagementServiceTest
|
||||
- Controller tests
|
||||
- Frontend component tests
|
||||
|
||||
5. ✅ **Create Multi-Library Spec** (3-4 hours)
|
||||
- Document architecture
|
||||
- Authentication flow
|
||||
- Database routing
|
||||
- Migration guide
|
||||
|
||||
---
|
||||
|
||||
## Conclusion
|
||||
|
||||
StoryCove is a **well-architected, production-ready application** with only one critical blocker (Collections search). The codebase demonstrates:
|
||||
|
||||
- ✅ **Excellent architecture** with proper separation of concerns
|
||||
- ✅ **Security-first** approach with HTML sanitization and authentication
|
||||
- ✅ **Production features** like backup/restore, multi-library, async processing
|
||||
- ✅ **Sophisticated UX** with reading progress, TOC, series navigation
|
||||
- ⚠️ **Test coverage gap** that should be addressed
|
||||
|
||||
### Final Grade: A- (90%)
|
||||
|
||||
**Breakdown:**
|
||||
- Backend Implementation: A (95%)
|
||||
- Frontend Implementation: A (95%)
|
||||
- Test Coverage: C (25%)
|
||||
- Documentation: B+ (90%)
|
||||
- Overall Architecture: A+ (100%)
|
||||
|
||||
**Primary Blocker:** Collections search (6 hours to fix)
|
||||
**Recommended Focus:** Test coverage (target 80%)
|
||||
|
||||
---
|
||||
|
||||
*Report Generated: 2025-10-10*
|
||||
*Next Review: After Collections search implementation*
|
||||
526
HOUSEKEEPING_PHASE1_REPORT.md
Normal file
526
HOUSEKEEPING_PHASE1_REPORT.md
Normal file
@@ -0,0 +1,526 @@
|
||||
# StoryCove Housekeeping Report - Phase 1: Documentation & State Assessment
|
||||
**Date**: 2025-01-10
|
||||
**Completed By**: Claude Code (Housekeeping Analysis)
|
||||
|
||||
## Executive Summary
|
||||
|
||||
Phase 1 assessment has been completed, providing a comprehensive review of the StoryCove application's current implementation status against specifications. The application is **well-implemented** with most core features working, but there is **1 CRITICAL ISSUE** and several areas requiring attention.
|
||||
|
||||
### Critical Finding
|
||||
🚨 **Collections Search Not Implemented**: The Collections feature does not use Typesense/Solr for search as mandated by the specification. This is a critical architectural requirement that must be addressed.
|
||||
|
||||
### Overall Status
|
||||
- **Backend Implementation**: ~85% complete with specification
|
||||
- **Entity Models**: ✅ 100% compliant with DATA_MODEL.md
|
||||
- **Test Coverage**: ⚠️ 9 tests exist, but many critical services lack tests
|
||||
- **Documentation**: ✅ Comprehensive and up-to-date
|
||||
|
||||
---
|
||||
|
||||
## 1. Implementation Status Matrix
|
||||
|
||||
### 1.1 Entity Layer (✅ FULLY COMPLIANT)
|
||||
|
||||
| Entity | Specification | Implementation Status | Notes |
|
||||
|--------|---------------|----------------------|-------|
|
||||
| **Story** | storycove-spec.md | ✅ Complete | All fields match spec including reading position, isRead, lastReadAt |
|
||||
| **Author** | storycove-spec.md | ✅ Complete | Includes avatar_image_path, rating, URLs as @ElementCollection |
|
||||
| **Tag** | TAG_ENHANCEMENT_SPECIFICATION.md | ✅ Complete | Includes color, description, aliases relationship |
|
||||
| **TagAlias** | TAG_ENHANCEMENT_SPECIFICATION.md | ✅ Complete | Implements alias system with createdFromMerge flag |
|
||||
| **Series** | storycove-spec.md | ✅ Complete | Basic implementation as specified |
|
||||
| **Collection** | storycove-collections-spec.md | ✅ Complete | All fields including isArchived, gap-based positioning |
|
||||
| **CollectionStory** | storycove-collections-spec.md | ✅ Complete | Junction entity with position field |
|
||||
| **ReadingPosition** | EPUB_IMPORT_EXPORT_SPECIFICATION.md | ✅ Complete | Full EPUB CFI support, chapter tracking, percentage complete |
|
||||
| **Library** | (Multi-library support) | ✅ Complete | Implemented for multi-library feature |
|
||||
|
||||
**Assessment**: Entity layer is **100% specification-compliant** ✅
|
||||
|
||||
---
|
||||
|
||||
### 1.2 Repository Layer (⚠️ MOSTLY COMPLIANT)
|
||||
|
||||
| Repository | Specification Compliance | Issues |
|
||||
|------------|-------------------------|--------|
|
||||
| **CollectionRepository** | ⚠️ Partial | Contains only ID-based lookups (correct), has note about Typesense |
|
||||
| **TagRepository** | ✅ Complete | Proper query methods, no search anti-patterns |
|
||||
| **StoryRepository** | ✅ Complete | Appropriate methods |
|
||||
| **AuthorRepository** | ✅ Complete | Appropriate methods |
|
||||
| **SeriesRepository** | ✅ Complete | Basic CRUD |
|
||||
| **ReadingPositionRepository** | ✅ Complete | Story-based lookups |
|
||||
| **TagAliasRepository** | ✅ Complete | Name-based lookups for resolution |
|
||||
|
||||
**Key Finding**: CollectionRepository correctly avoids search/filter methods (good architectural design), but the corresponding search implementation in CollectionService is not yet complete.
|
||||
|
||||
---
|
||||
|
||||
### 1.3 Service Layer (🚨 CRITICAL ISSUE FOUND)
|
||||
|
||||
| Service | Status | Specification Match | Critical Issues |
|
||||
|---------|--------|---------------------|-----------------|
|
||||
| **CollectionService** | 🚨 **INCOMPLETE** | 20% | **Collections search returns empty results** (line 56-61) |
|
||||
| **TagService** | ✅ Complete | 100% | Full alias, merging, AI suggestions implemented |
|
||||
| **StoryService** | ✅ Complete | 95% | Core features complete |
|
||||
| **AuthorService** | ✅ Complete | 95% | Core features complete |
|
||||
| **EPUBImportService** | ✅ Complete | 100% | Phase 1 & 2 complete per spec |
|
||||
| **EPUBExportService** | ✅ Complete | 100% | Single story & collection export working |
|
||||
| **ImageService** | ✅ Complete | 90% | Upload, resize, delete implemented |
|
||||
| **HtmlSanitizationService** | ✅ Complete | 100% | Security-critical, appears complete |
|
||||
| **SearchServiceAdapter** | ⚠️ Partial | 70% | Solr integration present but Collections not indexed |
|
||||
| **ReadingTimeService** | ✅ Complete | 100% | Word count calculations |
|
||||
|
||||
#### 🚨 CRITICAL ISSUE Detail: CollectionService.searchCollections()
|
||||
|
||||
**File**: `backend/src/main/java/com/storycove/service/CollectionService.java:56-61`
|
||||
|
||||
```java
|
||||
public SearchResultDto<Collection> searchCollections(String query, List<String> tags, boolean includeArchived, int page, int limit) {
|
||||
// Collections are currently handled at database level, not indexed in search engine
|
||||
// Return empty result for now as collections search is not implemented in Solr
|
||||
logger.warn("Collections search not yet implemented in Solr, returning empty results");
|
||||
return new SearchResultDto<>(new ArrayList<>(), 0, page, limit, query != null ? query : "", 0);
|
||||
}
|
||||
```
|
||||
|
||||
**Impact**:
|
||||
- GET /api/collections endpoint always returns 0 results
|
||||
- Frontend collections list view will appear empty
|
||||
- Violates architectural requirement in storycove-collections-spec.md Section 4.2 and 5.2
|
||||
|
||||
**Specification Requirement** (storycove-collections-spec.md:52-61):
|
||||
> **IMPORTANT**: This endpoint MUST use Typesense for all search and filtering operations.
|
||||
> Do NOT implement search/filter logic using JPA/SQL queries.
|
||||
|
||||
---
|
||||
|
||||
### 1.4 Controller/API Layer (✅ MOSTLY COMPLIANT)
|
||||
|
||||
| Controller | Endpoints | Status | Notes |
|
||||
|------------|-----------|--------|-------|
|
||||
| **CollectionController** | 13 endpoints | ⚠️ 90% | All endpoints implemented but search returns empty |
|
||||
| **StoryController** | ~15 endpoints | ✅ Complete | CRUD, reading progress, EPUB export |
|
||||
| **AuthorController** | ~10 endpoints | ✅ Complete | CRUD, avatar management |
|
||||
| **TagController** | ~12 endpoints | ✅ Complete | Enhanced features: aliases, merging, suggestions |
|
||||
| **SeriesController** | ~6 endpoints | ✅ Complete | Basic CRUD |
|
||||
| **AuthController** | 3 endpoints | ✅ Complete | Login, logout, verify |
|
||||
| **FileController** | 4 endpoints | ✅ Complete | Image serving and uploads |
|
||||
| **SearchController** | 3 endpoints | ✅ Complete | Story/Author search via Solr |
|
||||
|
||||
#### Endpoint Verification vs API.md
|
||||
|
||||
**Collections Endpoints (storycove-collections-spec.md)**:
|
||||
- ✅ GET /api/collections - Implemented (but returns empty due to search issue)
|
||||
- ✅ GET /api/collections/{id} - Implemented
|
||||
- ✅ POST /api/collections - Implemented (JSON & multipart)
|
||||
- ✅ PUT /api/collections/{id} - Implemented
|
||||
- ✅ DELETE /api/collections/{id} - Implemented
|
||||
- ✅ PUT /api/collections/{id}/archive - Implemented
|
||||
- ✅ POST /api/collections/{id}/stories - Implemented
|
||||
- ✅ DELETE /api/collections/{id}/stories/{storyId} - Implemented
|
||||
- ✅ PUT /api/collections/{id}/stories/order - Implemented
|
||||
- ✅ GET /api/collections/{id}/read/{storyId} - Implemented
|
||||
- ✅ GET /api/collections/{id}/stats - Implemented
|
||||
- ✅ GET /api/collections/{id}/epub - Implemented
|
||||
- ✅ POST /api/collections/{id}/epub - Implemented
|
||||
|
||||
**Tag Enhancement Endpoints (TAG_ENHANCEMENT_SPECIFICATION.md)**:
|
||||
- ✅ POST /api/tags/{tagId}/aliases - Implemented
|
||||
- ✅ DELETE /api/tags/{tagId}/aliases/{aliasId} - Implemented
|
||||
- ✅ POST /api/tags/merge - Implemented
|
||||
- ✅ POST /api/tags/merge/preview - Implemented
|
||||
- ✅ POST /api/tags/suggest - Implemented (AI-powered)
|
||||
- ✅ GET /api/tags/resolve/{name} - Implemented
|
||||
|
||||
---
|
||||
|
||||
### 1.5 Advanced Features Status
|
||||
|
||||
#### ✅ Tag Enhancement System (COMPLETE)
|
||||
**Specification**: TAG_ENHANCEMENT_SPECIFICATION.md (Status: ✅ COMPLETED)
|
||||
|
||||
| Feature | Status | Implementation |
|
||||
|---------|--------|----------------|
|
||||
| Color Tags | ✅ Complete | Tag entity has `color` field (VARCHAR(7) hex) |
|
||||
| Tag Descriptions | ✅ Complete | Tag entity has `description` field (VARCHAR(500)) |
|
||||
| Tag Aliases | ✅ Complete | TagAlias entity, resolution logic in TagService |
|
||||
| Tag Merging | ✅ Complete | Atomic merge with automatic alias creation |
|
||||
| AI Tag Suggestions | ✅ Complete | TagService.suggestTags() with confidence scoring |
|
||||
| Alias Resolution | ✅ Complete | TagService.resolveTagByName() checks both tags and aliases |
|
||||
|
||||
**Code Evidence**:
|
||||
- Tag entity: Tag.java:29-34 (color, description fields)
|
||||
- TagAlias entity: TagAlias.java (full implementation)
|
||||
- Merge logic: TagService.java:284-320
|
||||
- AI suggestions: TagService.java:385-491
|
||||
|
||||
---
|
||||
|
||||
#### ✅ EPUB Import/Export (PHASE 1 & 2 COMPLETE)
|
||||
**Specification**: EPUB_IMPORT_EXPORT_SPECIFICATION.md (Status: ✅ COMPLETED)
|
||||
|
||||
| Feature | Status | Files |
|
||||
|---------|--------|-------|
|
||||
| EPUB Import | ✅ Complete | EPUBImportService.java |
|
||||
| EPUB Export (Single) | ✅ Complete | EPUBExportService.java |
|
||||
| EPUB Export (Collection) | ✅ Complete | EPUBExportService.java, CollectionController:309-383 |
|
||||
| Reading Position (CFI) | ✅ Complete | ReadingPosition entity with epubCfi field |
|
||||
| Metadata Extraction | ✅ Complete | Cover, tags, author, title extraction |
|
||||
| Validation | ✅ Complete | File format and structure validation |
|
||||
|
||||
**Frontend Integration**:
|
||||
- ✅ Import UI: frontend/src/app/import/epub/page.tsx
|
||||
- ✅ Bulk Import: frontend/src/app/import/bulk/page.tsx
|
||||
- ✅ Export from Story Detail: (per spec update)
|
||||
|
||||
---
|
||||
|
||||
#### ⚠️ Collections Feature (MOSTLY COMPLETE, CRITICAL SEARCH ISSUE)
|
||||
**Specification**: storycove-collections-spec.md (Status: ⚠️ 85% COMPLETE)
|
||||
|
||||
| Feature | Status | Issue |
|
||||
|---------|--------|-------|
|
||||
| Entity Model | ✅ Complete | Collection, CollectionStory entities |
|
||||
| CRUD Operations | ✅ Complete | Create, update, delete, archive |
|
||||
| Story Management | ✅ Complete | Add, remove, reorder (gap-based positioning) |
|
||||
| Statistics | ✅ Complete | Word count, reading time, tag frequency |
|
||||
| EPUB Export | ✅ Complete | Full collection export |
|
||||
| **Search/Listing** | 🚨 **NOT IMPLEMENTED** | Returns empty results |
|
||||
| Reading Flow | ✅ Complete | Navigation context, previous/next |
|
||||
|
||||
**Critical Gap**: SearchServiceAdapter does not index Collections in Solr/Typesense.
|
||||
|
||||
---
|
||||
|
||||
#### ✅ Reading Position Tracking (COMPLETE)
|
||||
| Feature | Status |
|
||||
|---------|--------|
|
||||
| Character Position | ✅ Complete |
|
||||
| Chapter Tracking | ✅ Complete |
|
||||
| EPUB CFI Support | ✅ Complete |
|
||||
| Percentage Calculation | ✅ Complete |
|
||||
| Context Before/After | ✅ Complete |
|
||||
|
||||
---
|
||||
|
||||
### 1.6 Frontend Implementation (PRESENT BUT NOT FULLY AUDITED)
|
||||
|
||||
**Pages Found**:
|
||||
- ✅ Collections List: frontend/src/app/collections/page.tsx
|
||||
- ✅ Collection Detail: frontend/src/app/collections/[id]/page.tsx
|
||||
- ✅ Collection Reading: frontend/src/app/collections/[id]/read/[storyId]/page.tsx
|
||||
- ✅ Tag Maintenance: frontend/src/app/settings/tag-maintenance/page.tsx
|
||||
- ✅ EPUB Import: frontend/src/app/import/epub/page.tsx
|
||||
- ✅ Stories List: frontend/src/app/stories/page.tsx
|
||||
- ✅ Authors List: frontend/src/app/authors/page.tsx
|
||||
|
||||
**Note**: Full frontend audit deferred to Phase 3.
|
||||
|
||||
---
|
||||
|
||||
## 2. Test Coverage Assessment
|
||||
|
||||
### 2.1 Current Test Inventory
|
||||
|
||||
**Total Test Files**: 9
|
||||
|
||||
| Test File | Type | Target | Status |
|
||||
|-----------|------|--------|--------|
|
||||
| BaseRepositoryTest.java | Integration | Database setup | ✅ Present |
|
||||
| AuthorRepositoryTest.java | Integration | Author CRUD | ✅ Present |
|
||||
| StoryRepositoryTest.java | Integration | Story CRUD | ✅ Present |
|
||||
| TagTest.java | Unit | Tag entity | ✅ Present |
|
||||
| SeriesTest.java | Unit | Series entity | ✅ Present |
|
||||
| AuthorTest.java | Unit | Author entity | ✅ Present |
|
||||
| StoryTest.java | Unit | Story entity | ✅ Present |
|
||||
| AuthorServiceTest.java | Integration | Author service | ✅ Present |
|
||||
| StoryServiceTest.java | Integration | Story service | ✅ Present |
|
||||
|
||||
### 2.2 Missing Critical Tests
|
||||
|
||||
**Priority 1 (Critical Features)**:
|
||||
- ❌ CollectionServiceTest - **CRITICAL** (for search implementation verification)
|
||||
- ❌ TagServiceTest - Aliases, merging, AI suggestions
|
||||
- ❌ EPUBImportServiceTest - Import validation, metadata extraction
|
||||
- ❌ EPUBExportServiceTest - Export generation, collection EPUB
|
||||
|
||||
**Priority 2 (Core Services)**:
|
||||
- ❌ ImageServiceTest - Upload, resize, security
|
||||
- ❌ HtmlSanitizationServiceTest - **SECURITY CRITICAL**
|
||||
- ❌ SearchServiceAdapterTest - Solr integration
|
||||
- ❌ ReadingPositionServiceTest (if exists) - CFI handling
|
||||
|
||||
**Priority 3 (Controllers)**:
|
||||
- ❌ CollectionControllerTest
|
||||
- ❌ TagControllerTest
|
||||
- ❌ EPUBControllerTest
|
||||
|
||||
### 2.3 Test Coverage Estimate
|
||||
- **Current Coverage**: ~25% of service layer
|
||||
- **Target Coverage**: 80%+ for service layer
|
||||
- **Gap**: ~55% (approximately 15-20 test classes needed)
|
||||
|
||||
---
|
||||
|
||||
## 3. Specification Compliance Summary
|
||||
|
||||
| Specification Document | Compliance | Issues |
|
||||
|------------------------|------------|--------|
|
||||
| **storycove-spec.md** | 95% | Core features complete, minor gaps |
|
||||
| **DATA_MODEL.md** | 100% | Perfect match ✅ |
|
||||
| **API.md** | 90% | Most endpoints match, need verification |
|
||||
| **TAG_ENHANCEMENT_SPECIFICATION.md** | 100% | Fully implemented ✅ |
|
||||
| **EPUB_IMPORT_EXPORT_SPECIFICATION.md** | 100% | Phase 1 & 2 complete ✅ |
|
||||
| **storycove-collections-spec.md** | 85% | Search not implemented 🚨 |
|
||||
| **storycove-scraper-spec.md** | ❓ | Not assessed (separate feature) |
|
||||
|
||||
---
|
||||
|
||||
## 4. Database Schema Verification
|
||||
|
||||
### 4.1 Tables vs Specification
|
||||
|
||||
| Table | Specification | Implementation | Match |
|
||||
|-------|---------------|----------------|-------|
|
||||
| stories | DATA_MODEL.md | Story.java | ✅ 100% |
|
||||
| authors | DATA_MODEL.md | Author.java | ✅ 100% |
|
||||
| tags | DATA_MODEL.md + TAG_ENHANCEMENT | Tag.java | ✅ 100% |
|
||||
| tag_aliases | TAG_ENHANCEMENT | TagAlias.java | ✅ 100% |
|
||||
| series | DATA_MODEL.md | Series.java | ✅ 100% |
|
||||
| collections | storycove-collections-spec.md | Collection.java | ✅ 100% |
|
||||
| collection_stories | storycove-collections-spec.md | CollectionStory.java | ✅ 100% |
|
||||
| collection_tags | storycove-collections-spec.md | @JoinTable in Collection | ✅ 100% |
|
||||
| story_tags | DATA_MODEL.md | @JoinTable in Story | ✅ 100% |
|
||||
| reading_positions | EPUB_IMPORT_EXPORT | ReadingPosition.java | ✅ 100% |
|
||||
| libraries | (Multi-library) | Library.java | ✅ Present |
|
||||
|
||||
**Assessment**: Database schema is **100% specification-compliant** ✅
|
||||
|
||||
### 4.2 Indexes Verification
|
||||
|
||||
| Index | Required By Spec | Implementation | Status |
|
||||
|-------|------------------|----------------|--------|
|
||||
| idx_collections_archived | Collections spec | Collection entity | ✅ |
|
||||
| idx_collection_stories_position | Collections spec | CollectionStory entity | ✅ |
|
||||
| idx_reading_position_story | EPUB spec | ReadingPosition entity | ✅ |
|
||||
| idx_tag_aliases_name | TAG_ENHANCEMENT | Unique constraint on alias_name | ✅ |
|
||||
|
||||
---
|
||||
|
||||
## 5. Architecture Compliance
|
||||
|
||||
### 5.1 Search Integration Architecture
|
||||
|
||||
**Specification Requirement** (storycove-collections-spec.md):
|
||||
> All search, filtering, and listing operations MUST use Typesense as the primary data source.
|
||||
|
||||
**Current State**:
|
||||
- ✅ **Stories**: Properly use SearchServiceAdapter (Solr)
|
||||
- ✅ **Authors**: Properly use SearchServiceAdapter (Solr)
|
||||
- 🚨 **Collections**: NOT using SearchServiceAdapter
|
||||
|
||||
### 5.2 Anti-Pattern Verification
|
||||
|
||||
**Collections Repository** (CollectionRepository.java): ✅ CORRECT
|
||||
- Contains ONLY findById methods
|
||||
- Has explicit note: "For search/filter/list operations, use TypesenseService instead"
|
||||
- No search anti-patterns present
|
||||
|
||||
**Comparison with Spec Anti-Patterns** (storycove-collections-spec.md:663-689):
|
||||
```java
|
||||
// ❌ WRONG patterns NOT FOUND in codebase ✅
|
||||
// CollectionRepository correctly avoids:
|
||||
// - findByNameContaining()
|
||||
// - findByTagsIn()
|
||||
// - findByNameContainingAndArchived()
|
||||
```
|
||||
|
||||
**Issue**: While the repository layer is correctly designed, the service layer implementation is incomplete.
|
||||
|
||||
---
|
||||
|
||||
## 6. Code Quality Observations
|
||||
|
||||
### 6.1 Positive Findings
|
||||
1. ✅ **Consistent Entity Design**: All entities use UUID, proper annotations, equals/hashCode
|
||||
2. ✅ **Transaction Management**: @Transactional used appropriately
|
||||
3. ✅ **Logging**: Comprehensive SLF4J logging throughout
|
||||
4. ✅ **Validation**: Jakarta validation annotations used
|
||||
5. ✅ **DTOs**: Proper separation between entities and DTOs
|
||||
6. ✅ **Error Handling**: Custom exceptions (ResourceNotFoundException, DuplicateResourceException)
|
||||
7. ✅ **Gap-Based Positioning**: Collections use proper positioning algorithm (multiples of 1000)
|
||||
|
||||
### 6.2 Areas for Improvement
|
||||
1. ⚠️ **Test Coverage**: Major gap in service layer tests
|
||||
2. 🚨 **Collections Search**: Critical feature not implemented
|
||||
3. ⚠️ **Security Tests**: No dedicated tests for HtmlSanitizationService
|
||||
4. ⚠️ **Integration Tests**: Limited E2E testing
|
||||
|
||||
---
|
||||
|
||||
## 7. Dependencies & Technology Stack
|
||||
|
||||
### 7.1 Key Dependencies (Observed)
|
||||
- ✅ Spring Boot (Jakarta EE)
|
||||
- ✅ Hibernate/JPA
|
||||
- ✅ PostgreSQL
|
||||
- ✅ Solr (in place of Typesense, acceptable alternative)
|
||||
- ✅ EPUBLib (for EPUB handling)
|
||||
- ✅ Jsoup (for HTML sanitization)
|
||||
- ✅ JWT (authentication)
|
||||
|
||||
### 7.2 Search Engine Note
|
||||
**Specification**: Calls for Typesense
|
||||
**Implementation**: Uses Solr (Apache Solr)
|
||||
**Assessment**: ✅ Acceptable - Solr provides equivalent functionality
|
||||
|
||||
---
|
||||
|
||||
## 8. Documentation Status
|
||||
|
||||
### 8.1 Specification Documents
|
||||
| Document | Status | Notes |
|
||||
|----------|--------|-------|
|
||||
| storycove-spec.md | ✅ Current | Comprehensive main spec |
|
||||
| DATA_MODEL.md | ✅ Current | Matches implementation |
|
||||
| API.md | ⚠️ Needs minor updates | Most endpoints documented |
|
||||
| TAG_ENHANCEMENT_SPECIFICATION.md | ✅ Current | Marked as completed |
|
||||
| EPUB_IMPORT_EXPORT_SPECIFICATION.md | ✅ Current | Phase 1 & 2 marked complete |
|
||||
| storycove-collections-spec.md | ⚠️ Needs update | Should note search not implemented |
|
||||
| CLAUDE.md | ✅ Current | Good project guidance |
|
||||
|
||||
### 8.2 Code Documentation
|
||||
- ✅ Controllers: Well documented with Javadoc
|
||||
- ✅ Services: Good inline comments
|
||||
- ✅ Entities: Adequate field documentation
|
||||
- ⚠️ Tests: Limited documentation
|
||||
|
||||
---
|
||||
|
||||
## 9. Phase 1 Conclusions
|
||||
|
||||
### 9.1 Summary
|
||||
StoryCove is a **well-architected application** with strong entity design, comprehensive feature implementation, and good adherence to specifications. The codebase demonstrates professional-quality development practices.
|
||||
|
||||
### 9.2 Critical Finding
|
||||
**Collections Search**: The most critical issue is the incomplete Collections search implementation, which violates a mandatory architectural requirement and renders the Collections list view non-functional.
|
||||
|
||||
### 9.3 Test Coverage Gap
|
||||
With only 9 test files covering the basics, there is a significant testing gap that needs to be addressed to ensure code quality and prevent regressions.
|
||||
|
||||
### 9.4 Overall Assessment
|
||||
**Grade**: B+ (85%)
|
||||
- **Entity & Database**: A+ (100%)
|
||||
- **Service Layer**: B (85%)
|
||||
- **API Layer**: A- (90%)
|
||||
- **Test Coverage**: C (25%)
|
||||
- **Documentation**: A (95%)
|
||||
|
||||
---
|
||||
|
||||
## 10. Next Steps (Phase 2 & Beyond)
|
||||
|
||||
### Phase 2: Backend Audit (NEXT)
|
||||
1. 🚨 **URGENT**: Implement Collections search in SearchServiceAdapter/SolrService
|
||||
2. Deep dive into each service for business logic verification
|
||||
3. Review transaction boundaries and error handling
|
||||
4. Verify security measures (authentication, authorization, sanitization)
|
||||
|
||||
### Phase 3: Frontend Audit
|
||||
1. Verify UI components match UI/UX specifications
|
||||
2. Check Collections pagination implementation
|
||||
3. Review theme implementation (light/dark mode)
|
||||
4. Test responsive design
|
||||
|
||||
### Phase 4: Test Coverage
|
||||
1. Create CollectionServiceTest (PRIORITY 1)
|
||||
2. Create TagServiceTest with alias and merge tests
|
||||
3. Create EPUBImportServiceTest and EPUBExportServiceTest
|
||||
4. Create security-critical HtmlSanitizationServiceTest
|
||||
5. Add integration tests for search flows
|
||||
|
||||
### Phase 5: Documentation Updates
|
||||
1. Update API.md with any missing endpoints
|
||||
2. Update storycove-collections-spec.md with current status
|
||||
3. Create TESTING.md with coverage report
|
||||
|
||||
### Phase 6: Code Quality
|
||||
1. Run static analysis tools (SonarQube, SpotBugs)
|
||||
2. Review security vulnerabilities
|
||||
3. Performance profiling
|
||||
|
||||
---
|
||||
|
||||
## 11. Priority Action Items
|
||||
|
||||
### 🚨 CRITICAL (Must Fix Immediately)
|
||||
1. **Implement Collections Search** in SearchServiceAdapter
|
||||
- File: backend/src/main/java/com/storycove/service/SearchServiceAdapter.java
|
||||
- Add Solr indexing for Collections
|
||||
- Update CollectionService.searchCollections() to use search engine
|
||||
- Est. Time: 4-6 hours
|
||||
|
||||
### ⚠️ HIGH PRIORITY (Fix Soon)
|
||||
2. **Create CollectionServiceTest**
|
||||
- Verify CRUD operations
|
||||
- Test search functionality once implemented
|
||||
- Est. Time: 3-4 hours
|
||||
|
||||
3. **Create HtmlSanitizationServiceTest**
|
||||
- Security-critical testing
|
||||
- XSS prevention verification
|
||||
- Est. Time: 2-3 hours
|
||||
|
||||
4. **Create TagServiceTest**
|
||||
- Alias resolution
|
||||
- Merge operations
|
||||
- AI suggestions
|
||||
- Est. Time: 4-5 hours
|
||||
|
||||
### 📋 MEDIUM PRIORITY (Next Sprint)
|
||||
5. **EPUB Service Tests**
|
||||
- EPUBImportServiceTest
|
||||
- EPUBExportServiceTest
|
||||
- Est. Time: 5-6 hours
|
||||
|
||||
6. **Frontend Audit**
|
||||
- Verify Collections pagination
|
||||
- Check UI/UX compliance
|
||||
- Est. Time: 4-6 hours
|
||||
|
||||
### 📝 DOCUMENTATION (Ongoing)
|
||||
7. **Update API Documentation**
|
||||
- Verify all endpoints documented
|
||||
- Add missing examples
|
||||
- Est. Time: 2-3 hours
|
||||
|
||||
---
|
||||
|
||||
## 12. Appendix: File Structure
|
||||
|
||||
### Backend Structure
|
||||
```
|
||||
backend/src/main/java/com/storycove/
|
||||
├── controller/ (12 controllers - all implemented)
|
||||
├── service/ (20 services - 1 incomplete)
|
||||
├── entity/ (10 entities - all complete)
|
||||
├── repository/ (8 repositories - all appropriate)
|
||||
├── dto/ (~20 DTOs)
|
||||
├── exception/ (Custom exceptions)
|
||||
├── config/ (Security, DB, Solr config)
|
||||
└── security/ (JWT authentication)
|
||||
```
|
||||
|
||||
### Test Structure
|
||||
```
|
||||
backend/src/test/java/com/storycove/
|
||||
├── entity/ (4 entity tests)
|
||||
├── repository/ (3 repository tests)
|
||||
└── service/ (2 service tests)
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
**Phase 1 Assessment Complete** ✅
|
||||
|
||||
**Next Phase**: Backend Audit (focusing on Collections search implementation)
|
||||
|
||||
**Estimated Total Time to Address All Issues**: 30-40 hours
|
||||
118
PORTABLE_TEXT_SETUP.md
Normal file
118
PORTABLE_TEXT_SETUP.md
Normal file
@@ -0,0 +1,118 @@
|
||||
# Portable Text Editor Setup Instructions
|
||||
|
||||
## Current Status
|
||||
|
||||
⚠️ **Temporarily Reverted to Original Editor**
|
||||
|
||||
Due to npm cache permission issues preventing Docker builds, I've temporarily reverted the imports back to `RichTextEditor`. The Portable Text implementation is complete and ready to activate once the npm issue is resolved.
|
||||
|
||||
## Files Ready for Portable Text
|
||||
|
||||
- ✅ `PortableTextEditor.tsx` - Complete implementation
|
||||
- ✅ `schema.ts` - Portable Text schema
|
||||
- ✅ `conversion.ts` - HTML ↔ Portable Text conversion
|
||||
- ✅ `package.json.with-portabletext` - Updated dependencies
|
||||
|
||||
## Docker Build Issue Resolution
|
||||
|
||||
The error `npm ci` requires `package-lock.json` but npm cache permissions prevent generating it.
|
||||
|
||||
### Solution Steps:
|
||||
|
||||
1. **Fix npm permissions:**
|
||||
```bash
|
||||
sudo chown -R $(whoami) ~/.npm
|
||||
```
|
||||
|
||||
2. **Switch to Portable Text setup:**
|
||||
```bash
|
||||
cd frontend
|
||||
mv package.json package.json.original
|
||||
mv package.json.with-portabletext package.json
|
||||
npm install # This will generate package-lock.json
|
||||
```
|
||||
|
||||
3. **Update component imports** (change RichTextEditor → PortableTextEditor):
|
||||
```typescript
|
||||
// In src/app/add-story/page.tsx and src/app/stories/[id]/edit/page.tsx
|
||||
import PortableTextEditor from '../../components/stories/PortableTextEditor';
|
||||
// And update the JSX to use <PortableTextEditor ... />
|
||||
```
|
||||
|
||||
4. **Build and test:**
|
||||
```bash
|
||||
npm run build
|
||||
docker-compose build
|
||||
```
|
||||
|
||||
## Implementation Complete
|
||||
|
||||
✅ **Portable Text Schema** - Defines formatting options matching the original editor
|
||||
✅ **HTML ↔ Portable Text Conversion** - Seamless conversion between formats
|
||||
✅ **Sanitization Integration** - Uses existing sanitization strategy
|
||||
✅ **Component Replacement** - PortableTextEditor replaces RichTextEditor
|
||||
✅ **Image Processing** - Maintains existing image processing functionality
|
||||
✅ **Toolbar** - All formatting buttons from original editor
|
||||
✅ **Keyboard Shortcuts** - Ctrl+B, Ctrl+I, Ctrl+Shift+1-6
|
||||
|
||||
## Features Maintained
|
||||
|
||||
### 1. **Formatting Options**
|
||||
- Bold, Italic, Underline, Strike, Code
|
||||
- Headings H1-H6
|
||||
- Paragraphs and Blockquotes
|
||||
- All original toolbar buttons
|
||||
|
||||
### 2. **Visual & HTML Modes**
|
||||
- Visual mode: Structured Portable Text editing
|
||||
- HTML mode: Direct HTML editing (fallback)
|
||||
- Live preview in HTML mode
|
||||
|
||||
### 3. **Image Processing**
|
||||
- Existing image processing pipeline maintained
|
||||
- Background image download and conversion
|
||||
- Processing status indicators
|
||||
- Warning system
|
||||
|
||||
### 4. **Paste Handling**
|
||||
- Rich text paste from websites
|
||||
- Image processing during paste
|
||||
- HTML sanitization
|
||||
- Structured content conversion
|
||||
|
||||
### 5. **Maximization & Resizing**
|
||||
- Fullscreen editing mode
|
||||
- Resizable editor height
|
||||
- Keyboard shortcuts (Escape to exit)
|
||||
|
||||
## Benefits of Portable Text
|
||||
|
||||
1. **Structured Content** - Content is stored as JSON, not just HTML
|
||||
2. **Future-Proof** - Easy to export/migrate content
|
||||
3. **Better Search** - Structured content works better with Typesense
|
||||
4. **Extensible** - Easy to add custom block types (images, etc.)
|
||||
5. **Sanitization** - Inherently safer than HTML parsing
|
||||
|
||||
## Next Steps
|
||||
|
||||
1. Install the npm packages using one of the methods above
|
||||
2. Test the editor functionality
|
||||
3. Verify image processing works correctly
|
||||
4. Optional: Add custom image block types for enhanced image handling
|
||||
|
||||
## File Structure
|
||||
|
||||
```
|
||||
frontend/src/
|
||||
├── components/stories/
|
||||
│ ├── PortableTextEditor.tsx # New editor component
|
||||
│ └── RichTextEditor.tsx # Original (can be removed after testing)
|
||||
├── lib/portabletext/
|
||||
│ ├── schema.ts # Portable Text schema and types
|
||||
│ └── conversion.ts # HTML ↔ Portable Text conversion
|
||||
└── app/
|
||||
├── add-story/page.tsx # Updated to use PortableTextEditor
|
||||
└── stories/[id]/edit/page.tsx # Updated to use PortableTextEditor
|
||||
```
|
||||
|
||||
The implementation is backward compatible and maintains all existing functionality while providing the benefits of structured content editing.
|
||||
269
REFRESH_TOKEN_IMPLEMENTATION.md
Normal file
269
REFRESH_TOKEN_IMPLEMENTATION.md
Normal file
@@ -0,0 +1,269 @@
|
||||
# Refresh Token Implementation
|
||||
|
||||
## Overview
|
||||
|
||||
This document describes the refresh token functionality implemented for StoryCove, allowing users to stay authenticated for up to 2 weeks with automatic token refresh.
|
||||
|
||||
## Architecture
|
||||
|
||||
### Token Types
|
||||
|
||||
1. **Access Token (JWT)**
|
||||
- Lifetime: 24 hours
|
||||
- Stored in: httpOnly cookie + localStorage
|
||||
- Used for: API authentication
|
||||
- Format: JWT with subject and libraryId claims
|
||||
|
||||
2. **Refresh Token**
|
||||
- Lifetime: 14 days (2 weeks)
|
||||
- Stored in: httpOnly cookie + database
|
||||
- Used for: Generating new access tokens
|
||||
- Format: Secure random 256-bit token (Base64 encoded)
|
||||
|
||||
### Token Flow
|
||||
|
||||
1. **Login**
|
||||
- User provides password
|
||||
- Backend validates password
|
||||
- Backend generates both access token and refresh token
|
||||
- Both tokens sent as httpOnly cookies
|
||||
- Access token also returned in response body for localStorage
|
||||
|
||||
2. **API Request**
|
||||
- Frontend sends access token via Authorization header and cookie
|
||||
- Backend validates access token
|
||||
- If valid: Request proceeds
|
||||
- If expired: Frontend attempts token refresh
|
||||
|
||||
3. **Token Refresh**
|
||||
- Frontend detects 401/403 response
|
||||
- Frontend automatically calls `/api/auth/refresh`
|
||||
- Backend validates refresh token from cookie
|
||||
- If valid: New access token generated and returned
|
||||
- If invalid/expired: User redirected to login
|
||||
|
||||
4. **Logout**
|
||||
- Frontend calls `/api/auth/logout`
|
||||
- Backend revokes refresh token in database
|
||||
- Both cookies cleared
|
||||
- User redirected to login page
|
||||
|
||||
## Backend Implementation
|
||||
|
||||
### New Files
|
||||
|
||||
1. **`RefreshToken.java`** - Entity class
|
||||
- Fields: id, token, expiresAt, createdAt, revokedAt, libraryId, userAgent, ipAddress
|
||||
- Helper methods: isExpired(), isRevoked(), isValid()
|
||||
|
||||
2. **`RefreshTokenRepository.java`** - Repository interface
|
||||
- findByToken(String)
|
||||
- deleteExpiredTokens(LocalDateTime)
|
||||
- revokeAllByLibraryId(String, LocalDateTime)
|
||||
- revokeAll(LocalDateTime)
|
||||
|
||||
3. **`RefreshTokenService.java`** - Service class
|
||||
- createRefreshToken(libraryId, userAgent, ipAddress)
|
||||
- verifyRefreshToken(token)
|
||||
- revokeToken(token)
|
||||
- revokeAllByLibraryId(libraryId)
|
||||
- cleanupExpiredTokens() - Scheduled daily at 3 AM
|
||||
|
||||
### Modified Files
|
||||
|
||||
1. **`JwtUtil.java`**
|
||||
- Added `refreshExpiration` property (14 days)
|
||||
- Added `generateRefreshToken()` method
|
||||
- Added `getRefreshExpirationMs()` method
|
||||
|
||||
2. **`AuthController.java`**
|
||||
- Updated `/login` endpoint to create and return refresh token
|
||||
- Added `/refresh` endpoint to handle token refresh
|
||||
- Updated `/logout` endpoint to revoke refresh token
|
||||
- Added helper methods: `getRefreshTokenFromCookies()`, `getClientIpAddress()`
|
||||
|
||||
3. **`SecurityConfig.java`**
|
||||
- Added `/api/auth/refresh` to public endpoints
|
||||
|
||||
4. **`application.yml`**
|
||||
- Added `storycove.jwt.refresh-expiration: 1209600000` (14 days)
|
||||
|
||||
## Frontend Implementation
|
||||
|
||||
### Modified Files
|
||||
|
||||
1. **`api.ts`**
|
||||
- Added automatic token refresh logic in response interceptor
|
||||
- Added request queuing during token refresh
|
||||
- Prevents multiple simultaneous refresh attempts
|
||||
- Automatically retries failed requests after refresh
|
||||
|
||||
### Token Refresh Logic
|
||||
|
||||
```typescript
|
||||
// On 401/403 response:
|
||||
1. Check if already retrying -> if yes, queue request
|
||||
2. Check if refresh/login endpoint -> if yes, logout
|
||||
3. Attempt token refresh via /api/auth/refresh
|
||||
4. If successful:
|
||||
- Update localStorage with new token
|
||||
- Retry original request
|
||||
- Process queued requests
|
||||
5. If failed:
|
||||
- Clear token
|
||||
- Redirect to login
|
||||
- Reject queued requests
|
||||
```
|
||||
|
||||
## Security Features
|
||||
|
||||
1. **httpOnly Cookies**: Prevents XSS attacks
|
||||
2. **Token Revocation**: Refresh tokens can be revoked
|
||||
3. **Database Storage**: Refresh tokens stored server-side
|
||||
4. **Expiration Tracking**: Tokens have strict expiration dates
|
||||
5. **IP & User Agent Tracking**: Stored for security auditing
|
||||
6. **Library Isolation**: Tokens scoped to specific library
|
||||
|
||||
## Database Schema
|
||||
|
||||
```sql
|
||||
CREATE TABLE refresh_tokens (
|
||||
id UUID PRIMARY KEY,
|
||||
token VARCHAR(255) UNIQUE NOT NULL,
|
||||
expires_at TIMESTAMP NOT NULL,
|
||||
created_at TIMESTAMP NOT NULL,
|
||||
revoked_at TIMESTAMP,
|
||||
library_id VARCHAR(255),
|
||||
user_agent VARCHAR(255) NOT NULL,
|
||||
ip_address VARCHAR(255) NOT NULL
|
||||
);
|
||||
|
||||
CREATE INDEX idx_refresh_token ON refresh_tokens(token);
|
||||
CREATE INDEX idx_expires_at ON refresh_tokens(expires_at);
|
||||
```
|
||||
|
||||
## Configuration
|
||||
|
||||
### Backend (`application.yml`)
|
||||
|
||||
```yaml
|
||||
storycove:
|
||||
jwt:
|
||||
expiration: 86400000 # 24 hours (access token)
|
||||
refresh-expiration: 1209600000 # 14 days (refresh token)
|
||||
```
|
||||
|
||||
### Environment Variables
|
||||
|
||||
No new environment variables required. Existing `JWT_SECRET` is used.
|
||||
|
||||
## Testing
|
||||
|
||||
Comprehensive test suite in `RefreshTokenServiceTest.java`:
|
||||
- Token creation
|
||||
- Token validation
|
||||
- Expired token handling
|
||||
- Revoked token handling
|
||||
- Token revocation
|
||||
- Cleanup operations
|
||||
|
||||
Run tests:
|
||||
```bash
|
||||
cd backend
|
||||
mvn test -Dtest=RefreshTokenServiceTest
|
||||
```
|
||||
|
||||
## Maintenance
|
||||
|
||||
### Automated Cleanup
|
||||
|
||||
Expired tokens are automatically cleaned up daily at 3 AM via scheduled task in `RefreshTokenService.cleanupExpiredTokens()`.
|
||||
|
||||
### Manual Revocation
|
||||
|
||||
```java
|
||||
// Revoke all tokens for a library
|
||||
refreshTokenService.revokeAllByLibraryId("library-id");
|
||||
|
||||
// Revoke all tokens (logout all users)
|
||||
refreshTokenService.revokeAll();
|
||||
```
|
||||
|
||||
## User Experience
|
||||
|
||||
1. **Seamless Authentication**: Users stay logged in for 2 weeks
|
||||
2. **Automatic Refresh**: Token refresh happens transparently
|
||||
3. **No Interruptions**: API calls succeed even when access token expires
|
||||
4. **Backend Restart**: Users must re-login (JWT secret rotates on startup)
|
||||
5. **Cross-Device Library Switching**: Automatic library switching when using different devices with different libraries
|
||||
|
||||
## Cross-Device Library Switching
|
||||
|
||||
### Feature Overview
|
||||
|
||||
The system automatically detects and switches libraries when you use different devices authenticated to different libraries. This ensures you always see the correct library's data.
|
||||
|
||||
### How It Works
|
||||
|
||||
**Scenario 1: Active Access Token (within 24 hours)**
|
||||
1. Request comes in with valid JWT access token
|
||||
2. `JwtAuthenticationFilter` extracts `libraryId` from token
|
||||
3. Compares with `currentLibraryId` in backend
|
||||
4. **If different**: Automatically switches to token's library
|
||||
5. **If same**: Early return (no overhead, just string comparison)
|
||||
6. Request proceeds with correct library
|
||||
|
||||
**Scenario 2: Token Refresh (after 24 hours)**
|
||||
1. Access token expired, refresh token still valid
|
||||
2. `/api/auth/refresh` endpoint validates refresh token
|
||||
3. Extracts `libraryId` from refresh token
|
||||
4. Compares with `currentLibraryId` in backend
|
||||
5. **If different**: Automatically switches to token's library
|
||||
6. **If same**: Early return (no overhead)
|
||||
7. Generates new access token with correct `libraryId`
|
||||
|
||||
**Scenario 3: After Backend Restart**
|
||||
1. `currentLibraryId` is null (no active library)
|
||||
2. First request with any token automatically switches to that token's library
|
||||
3. Subsequent requests use early return optimization
|
||||
|
||||
### Performance
|
||||
|
||||
**When libraries match** (most common case):
|
||||
- Simple string comparison: `libraryId.equals(currentLibraryId)`
|
||||
- Immediate return - zero overhead
|
||||
- No datasource changes, no reindexing
|
||||
|
||||
**When libraries differ** (switching devices):
|
||||
- Synchronized library switch
|
||||
- Datasource routing updated instantly
|
||||
- Solr reindex runs asynchronously (doesn't block request)
|
||||
- Takes 2-3 seconds in background
|
||||
|
||||
### Edge Cases
|
||||
|
||||
**Multi-device simultaneous use:**
|
||||
- If two devices with different libraries are used simultaneously
|
||||
- Last request "wins" and switches backend to its library
|
||||
- Not recommended but handled gracefully
|
||||
- Each device corrects itself on next request
|
||||
|
||||
**Library doesn't exist:**
|
||||
- If token contains invalid `libraryId`
|
||||
- Library switch fails with error
|
||||
- Request is rejected with 500 error
|
||||
- User must re-login with valid credentials
|
||||
|
||||
## Future Enhancements
|
||||
|
||||
Potential improvements:
|
||||
1. Persistent JWT secret (survive backend restarts)
|
||||
2. Sliding refresh token expiration (extend on use)
|
||||
3. Multiple device management (view/revoke sessions)
|
||||
4. Configurable token lifetimes via environment variables
|
||||
5. Token rotation (new refresh token on each use)
|
||||
6. Thread-local library context for true stateless operation
|
||||
|
||||
## Summary
|
||||
|
||||
The refresh token implementation provides a robust, secure authentication system that balances user convenience (2-week sessions) with security (short-lived access tokens, automatic refresh). The implementation follows industry best practices and provides a solid foundation for future enhancements.
|
||||
244
SOLR_LIBRARY_MIGRATION.md
Normal file
244
SOLR_LIBRARY_MIGRATION.md
Normal file
@@ -0,0 +1,244 @@
|
||||
# Solr Library Separation Migration Guide
|
||||
|
||||
This guide explains how to migrate existing StoryCove deployments to support proper library separation in Solr search.
|
||||
|
||||
## What Changed
|
||||
|
||||
The Solr service has been enhanced to support multi-tenant library separation by:
|
||||
- Adding a `libraryId` field to all Solr documents
|
||||
- Filtering all search queries by the current library context
|
||||
- Ensuring complete data isolation between libraries
|
||||
|
||||
## Migration Options
|
||||
|
||||
### Option 1: Docker Volume Reset (Recommended for Docker)
|
||||
|
||||
**Best for**: Development, staging, and Docker-based deployments where data loss is acceptable.
|
||||
|
||||
```bash
|
||||
# Stop the application
|
||||
docker-compose down
|
||||
|
||||
# Remove only the Solr data volume (preserves database and images)
|
||||
docker volume rm storycove_solr_data
|
||||
|
||||
# Restart - Solr will recreate cores with new schema
|
||||
docker-compose up -d
|
||||
|
||||
# Wait for services to start, then trigger reindex via admin panel
|
||||
```
|
||||
|
||||
**Pros**: Clean, simple, guaranteed to work
|
||||
**Cons**: Requires downtime, loses existing search index
|
||||
|
||||
### Option 2: Schema API Migration (Production Safe)
|
||||
|
||||
**Best for**: Production environments where you need to preserve uptime.
|
||||
|
||||
**Method A: Automatic (Recommended)**
|
||||
```bash
|
||||
# Single endpoint that adds field and migrates data
|
||||
curl -X POST "http://your-app-host/api/admin/search/solr/migrate-library-schema" \
|
||||
-H "Authorization: Bearer YOUR_JWT_TOKEN"
|
||||
```
|
||||
|
||||
**Method B: Manual Steps**
|
||||
```bash
|
||||
# Step 1: Add libraryId field via app API
|
||||
curl -X POST "http://your-app-host/api/admin/search/solr/add-library-field" \
|
||||
-H "Authorization: Bearer YOUR_JWT_TOKEN"
|
||||
|
||||
# Step 2: Run migration
|
||||
curl -X POST "http://your-app-host/api/admin/search/solr/migrate-library-schema" \
|
||||
-H "Authorization: Bearer YOUR_JWT_TOKEN"
|
||||
```
|
||||
|
||||
**Method C: Direct Solr API (if app API fails)**
|
||||
```bash
|
||||
# Add libraryId field to stories core
|
||||
curl -X POST "http://your-solr-host:8983/solr/storycove_stories/schema" \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{
|
||||
"add-field": {
|
||||
"name": "libraryId",
|
||||
"type": "string",
|
||||
"indexed": true,
|
||||
"stored": true,
|
||||
"required": false
|
||||
}
|
||||
}'
|
||||
|
||||
# Add libraryId field to authors core
|
||||
curl -X POST "http://your-solr-host:8983/solr/storycove_authors/schema" \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{
|
||||
"add-field": {
|
||||
"name": "libraryId",
|
||||
"type": "string",
|
||||
"indexed": true,
|
||||
"stored": true,
|
||||
"required": false
|
||||
}
|
||||
}'
|
||||
|
||||
# Then run the migration
|
||||
curl -X POST "http://your-app-host/api/admin/search/solr/migrate-library-schema" \
|
||||
-H "Authorization: Bearer YOUR_JWT_TOKEN"
|
||||
```
|
||||
|
||||
**Pros**: No downtime, preserves service availability, automatic field addition
|
||||
**Cons**: Requires API access
|
||||
|
||||
### Option 3: Application-Level Migration (Recommended for Production)
|
||||
|
||||
**Best for**: Production environments with proper admin access.
|
||||
|
||||
1. **Deploy the code changes** to your environment
|
||||
2. **Access the admin panel** of your application
|
||||
3. **Navigate to search settings**
|
||||
4. **Use the "Migrate Library Schema" button** or API endpoint:
|
||||
```
|
||||
POST /api/admin/search/solr/migrate-library-schema
|
||||
```
|
||||
|
||||
**Pros**: User-friendly, handles all complexity internally
|
||||
**Cons**: Requires admin access to application
|
||||
|
||||
## Step-by-Step Migration Process
|
||||
|
||||
### For Docker Deployments
|
||||
|
||||
1. **Backup your data** (optional but recommended):
|
||||
```bash
|
||||
# Backup database
|
||||
docker-compose exec postgres pg_dump -U storycove storycove > backup.sql
|
||||
```
|
||||
|
||||
2. **Pull the latest code** with library separation fixes
|
||||
|
||||
3. **Choose migration approach**:
|
||||
- **Quick & Clean**: Use Option 1 (volume reset)
|
||||
- **Production**: Use Option 2 or 3
|
||||
|
||||
4. **Verify migration**:
|
||||
- Log in with different library passwords
|
||||
- Perform searches to confirm isolation
|
||||
- Check that new content gets indexed with library IDs
|
||||
|
||||
### For Kubernetes/Production Deployments
|
||||
|
||||
1. **Update your deployment** with the new container images
|
||||
|
||||
2. **Add the libraryId field** to Solr schema using Option 2
|
||||
|
||||
3. **Use the migration endpoint** (Option 3):
|
||||
```bash
|
||||
kubectl exec -it deployment/storycove-backend -- \
|
||||
curl -X POST http://localhost:8080/api/admin/search/solr/migrate-library-schema
|
||||
```
|
||||
|
||||
4. **Monitor logs** for successful migration
|
||||
|
||||
## Verification Steps
|
||||
|
||||
After migration, verify that library separation is working:
|
||||
|
||||
1. **Test with multiple libraries**:
|
||||
- Log in with Library A password
|
||||
- Add/search content
|
||||
- Log in with Library B password
|
||||
- Confirm Library A content is not visible
|
||||
|
||||
2. **Check Solr directly** (if accessible):
|
||||
```bash
|
||||
# Should show documents with libraryId field
|
||||
curl "http://solr:8983/solr/storycove_stories/select?q=*:*&fl=id,title,libraryId&rows=5"
|
||||
```
|
||||
|
||||
3. **Monitor application logs** for any library separation errors
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### "unknown field 'libraryId'" Error
|
||||
|
||||
**Problem**: `ERROR: [doc=xxx] unknown field 'libraryId'`
|
||||
|
||||
**Cause**: The Solr schema doesn't have the libraryId field yet.
|
||||
|
||||
**Solutions**:
|
||||
|
||||
1. **Use the automated migration** (adds field automatically):
|
||||
```bash
|
||||
curl -X POST "http://your-app/api/admin/search/solr/migrate-library-schema"
|
||||
```
|
||||
|
||||
2. **Add field manually first**:
|
||||
```bash
|
||||
# Add field via app API
|
||||
curl -X POST "http://your-app/api/admin/search/solr/add-library-field"
|
||||
|
||||
# Then run migration
|
||||
curl -X POST "http://your-app/api/admin/search/solr/migrate-library-schema"
|
||||
```
|
||||
|
||||
3. **Direct Solr API** (if app API fails):
|
||||
```bash
|
||||
# Add to both cores
|
||||
curl -X POST "http://solr:8983/solr/storycove_stories/schema" \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{"add-field":{"name":"libraryId","type":"string","indexed":true,"stored":true}}'
|
||||
|
||||
curl -X POST "http://solr:8983/solr/storycove_authors/schema" \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{"add-field":{"name":"libraryId","type":"string","indexed":true,"stored":true}}'
|
||||
```
|
||||
|
||||
4. **For development**: Use Option 1 (volume reset) for clean restart
|
||||
|
||||
### Migration Endpoint Returns Error
|
||||
|
||||
Common causes:
|
||||
- Solr is not available (check connectivity)
|
||||
- No active library context (ensure user is authenticated)
|
||||
- Insufficient permissions (check JWT token/authentication)
|
||||
|
||||
### Search Results Still Mixed
|
||||
|
||||
This indicates incomplete migration:
|
||||
- Clear all Solr data and reindex completely
|
||||
- Verify that all documents have libraryId field
|
||||
- Check that search queries include library filters
|
||||
|
||||
## Environment-Specific Notes
|
||||
|
||||
### Development
|
||||
- Use Option 1 (volume reset) for simplicity
|
||||
- Data loss is acceptable in dev environments
|
||||
|
||||
### Staging
|
||||
- Use Option 2 or 3 to test production migration procedures
|
||||
- Verify migration process before applying to production
|
||||
|
||||
### Production
|
||||
- **Always backup data first**
|
||||
- Use Option 2 (Schema API) or Option 3 (Admin endpoint)
|
||||
- Plan for brief performance impact during reindexing
|
||||
- Monitor system resources during bulk reindexing
|
||||
|
||||
## Performance Considerations
|
||||
|
||||
- **Reindexing time**: Depends on data size (typically 1000 docs/second)
|
||||
- **Memory usage**: May increase during bulk indexing
|
||||
- **Search performance**: Minimal impact from library filtering
|
||||
- **Storage**: Slight increase due to libraryId field
|
||||
|
||||
## Rollback Plan
|
||||
|
||||
If issues occur:
|
||||
|
||||
1. **Immediate**: Restart Solr to previous state (if using Option 1)
|
||||
2. **Schema revert**: Remove libraryId field via Schema API
|
||||
3. **Code rollback**: Deploy previous version without library separation
|
||||
4. **Data restore**: Restore from backup if necessary
|
||||
|
||||
This migration enables proper multi-tenant isolation while maintaining search performance and functionality.
|
||||
45
apply_migration_production.sh
Executable file
45
apply_migration_production.sh
Executable file
@@ -0,0 +1,45 @@
|
||||
#!/bin/bash
|
||||
|
||||
# Run this script on your production server to apply the backup_jobs table migration
|
||||
# to all library databases
|
||||
|
||||
echo "Applying backup_jobs table migration to all databases..."
|
||||
echo ""
|
||||
|
||||
# Apply to each database
|
||||
for DB in storycove storycove_afterdark storycove_clas storycove_secret; do
|
||||
echo "Applying to $DB..."
|
||||
docker-compose exec -T postgres psql -U storycove -d "$DB" <<'SQL'
|
||||
CREATE TABLE IF NOT EXISTS backup_jobs (
|
||||
id UUID PRIMARY KEY,
|
||||
library_id VARCHAR(255) NOT NULL,
|
||||
type VARCHAR(50) NOT NULL CHECK (type IN ('DATABASE_ONLY', 'COMPLETE')),
|
||||
status VARCHAR(50) NOT NULL CHECK (status IN ('PENDING', 'IN_PROGRESS', 'COMPLETED', 'FAILED', 'EXPIRED')),
|
||||
file_path VARCHAR(1000),
|
||||
file_size_bytes BIGINT,
|
||||
progress_percent INTEGER,
|
||||
error_message VARCHAR(1000),
|
||||
created_at TIMESTAMP NOT NULL,
|
||||
started_at TIMESTAMP,
|
||||
completed_at TIMESTAMP,
|
||||
expires_at TIMESTAMP
|
||||
);
|
||||
|
||||
CREATE INDEX IF NOT EXISTS idx_backup_jobs_library_id ON backup_jobs(library_id);
|
||||
CREATE INDEX IF NOT EXISTS idx_backup_jobs_status ON backup_jobs(status);
|
||||
CREATE INDEX IF NOT EXISTS idx_backup_jobs_expires_at ON backup_jobs(expires_at);
|
||||
CREATE INDEX IF NOT EXISTS idx_backup_jobs_created_at ON backup_jobs(created_at DESC);
|
||||
SQL
|
||||
echo "✓ Done with $DB"
|
||||
echo ""
|
||||
done
|
||||
|
||||
echo "Migration complete! Verifying..."
|
||||
echo ""
|
||||
|
||||
# Verify tables exist
|
||||
for DB in storycove storycove_afterdark storycove_clas storycove_secret; do
|
||||
echo "Checking $DB:"
|
||||
docker-compose exec -T postgres psql -U storycove -d "$DB" -c "\d backup_jobs" 2>&1 | grep -E "Table|does not exist" || echo " ✓ Table exists"
|
||||
echo ""
|
||||
done
|
||||
@@ -2,8 +2,13 @@ FROM openjdk:17-jdk-slim
|
||||
|
||||
WORKDIR /app
|
||||
|
||||
# Install Maven
|
||||
RUN apt-get update && apt-get install -y maven && rm -rf /var/lib/apt/lists/*
|
||||
# Install Maven and PostgreSQL 15 client tools
|
||||
RUN apt-get update && apt-get install -y wget ca-certificates gnupg maven && \
|
||||
wget --quiet -O - https://www.postgresql.org/media/keys/ACCC4CF8.asc | apt-key add - && \
|
||||
echo "deb http://apt.postgresql.org/pub/repos/apt/ bullseye-pgdg main" > /etc/apt/sources.list.d/pgdg.list && \
|
||||
apt-get update && \
|
||||
apt-get install -y postgresql-client-15 && \
|
||||
rm -rf /var/lib/apt/lists/*
|
||||
|
||||
# Copy source code
|
||||
COPY . .
|
||||
|
||||
54
backend/apply_backup_jobs_migration.sh
Executable file
54
backend/apply_backup_jobs_migration.sh
Executable file
@@ -0,0 +1,54 @@
|
||||
#!/bin/bash
|
||||
|
||||
# Script to apply backup_jobs table migration to all library databases
|
||||
# This should be run from the backend directory
|
||||
|
||||
set -e
|
||||
|
||||
# Use full docker path
|
||||
DOCKER="/usr/local/bin/docker"
|
||||
|
||||
echo "Applying backup_jobs table migration..."
|
||||
|
||||
# Get database connection details from environment or use defaults
|
||||
DB_HOST="${POSTGRES_HOST:-postgres}"
|
||||
DB_PORT="${POSTGRES_PORT:-5432}"
|
||||
DB_USER="${POSTGRES_USER:-storycove}"
|
||||
DB_PASSWORD="${POSTGRES_PASSWORD:-password}"
|
||||
|
||||
# List of databases to update
|
||||
DATABASES=("storycove" "storycove_afterdark")
|
||||
|
||||
for DB_NAME in "${DATABASES[@]}"; do
|
||||
echo ""
|
||||
echo "Applying migration to database: $DB_NAME"
|
||||
|
||||
# Check if database exists
|
||||
if $DOCKER exec storycove-postgres-1 psql -U "$DB_USER" -lqt | cut -d \| -f 1 | grep -qw "$DB_NAME"; then
|
||||
echo "Database $DB_NAME exists, applying migration..."
|
||||
|
||||
# Apply migration
|
||||
$DOCKER exec -i storycove-postgres-1 psql -U "$DB_USER" -d "$DB_NAME" < create_backup_jobs_table.sql
|
||||
|
||||
if [ $? -eq 0 ]; then
|
||||
echo "✓ Migration applied successfully to $DB_NAME"
|
||||
else
|
||||
echo "✗ Failed to apply migration to $DB_NAME"
|
||||
exit 1
|
||||
fi
|
||||
else
|
||||
echo "⚠ Database $DB_NAME does not exist, skipping..."
|
||||
fi
|
||||
done
|
||||
|
||||
echo ""
|
||||
echo "Migration complete!"
|
||||
echo ""
|
||||
echo "Verifying table creation..."
|
||||
for DB_NAME in "${DATABASES[@]}"; do
|
||||
if $DOCKER exec storycove-postgres-1 psql -U "$DB_USER" -lqt | cut -d \| -f 1 | grep -qw "$DB_NAME"; then
|
||||
echo ""
|
||||
echo "Checking $DB_NAME:"
|
||||
$DOCKER exec storycove-postgres-1 psql -U "$DB_USER" -d "$DB_NAME" -c "\d backup_jobs" 2>/dev/null || echo " Table not found in $DB_NAME"
|
||||
fi
|
||||
done
|
||||
29
backend/create_backup_jobs_table.sql
Normal file
29
backend/create_backup_jobs_table.sql
Normal file
@@ -0,0 +1,29 @@
|
||||
-- Create backup_jobs table for async backup job tracking
|
||||
-- This should be run on all library databases (default and afterdark)
|
||||
|
||||
CREATE TABLE IF NOT EXISTS backup_jobs (
|
||||
id UUID PRIMARY KEY,
|
||||
library_id VARCHAR(255) NOT NULL,
|
||||
type VARCHAR(50) NOT NULL CHECK (type IN ('DATABASE_ONLY', 'COMPLETE')),
|
||||
status VARCHAR(50) NOT NULL CHECK (status IN ('PENDING', 'IN_PROGRESS', 'COMPLETED', 'FAILED', 'EXPIRED')),
|
||||
file_path VARCHAR(1000),
|
||||
file_size_bytes BIGINT,
|
||||
progress_percent INTEGER,
|
||||
error_message VARCHAR(1000),
|
||||
created_at TIMESTAMP NOT NULL,
|
||||
started_at TIMESTAMP,
|
||||
completed_at TIMESTAMP,
|
||||
expires_at TIMESTAMP
|
||||
);
|
||||
|
||||
-- Create index on library_id for faster lookups
|
||||
CREATE INDEX IF NOT EXISTS idx_backup_jobs_library_id ON backup_jobs(library_id);
|
||||
|
||||
-- Create index on status for cleanup queries
|
||||
CREATE INDEX IF NOT EXISTS idx_backup_jobs_status ON backup_jobs(status);
|
||||
|
||||
-- Create index on expires_at for cleanup queries
|
||||
CREATE INDEX IF NOT EXISTS idx_backup_jobs_expires_at ON backup_jobs(expires_at);
|
||||
|
||||
-- Create index on created_at for ordering
|
||||
CREATE INDEX IF NOT EXISTS idx_backup_jobs_created_at ON backup_jobs(created_at DESC);
|
||||
@@ -49,6 +49,10 @@
|
||||
<groupId>org.springframework.boot</groupId>
|
||||
<artifactId>spring-boot-starter-validation</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.springframework.boot</groupId>
|
||||
<artifactId>spring-boot-starter-actuator</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.postgresql</groupId>
|
||||
<artifactId>postgresql</artifactId>
|
||||
@@ -80,9 +84,33 @@
|
||||
<artifactId>httpclient5</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.typesense</groupId>
|
||||
<artifactId>typesense-java</artifactId>
|
||||
<version>1.3.0</version>
|
||||
<groupId>org.apache.solr</groupId>
|
||||
<artifactId>solr-solrj</artifactId>
|
||||
<version>9.9.0</version>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.eclipse.jetty</groupId>
|
||||
<artifactId>jetty-client</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.eclipse.jetty</groupId>
|
||||
<artifactId>jetty-util</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.eclipse.jetty</groupId>
|
||||
<artifactId>jetty-http</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.eclipse.jetty</groupId>
|
||||
<artifactId>jetty-io</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.apache.httpcomponents.core5</groupId>
|
||||
<artifactId>httpcore5</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>org.apache.httpcomponents.core5</groupId>
|
||||
<artifactId>httpcore5-h2</artifactId>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>com.positiondev.epublib</groupId>
|
||||
@@ -119,6 +147,13 @@
|
||||
<groupId>org.springframework.boot</groupId>
|
||||
<artifactId>spring-boot-maven-plugin</artifactId>
|
||||
</plugin>
|
||||
<plugin>
|
||||
<groupId>org.apache.maven.plugins</groupId>
|
||||
<artifactId>maven-compiler-plugin</artifactId>
|
||||
<configuration>
|
||||
<parameters>true</parameters>
|
||||
</configuration>
|
||||
</plugin>
|
||||
</plugins>
|
||||
</build>
|
||||
</project>
|
||||
@@ -2,10 +2,12 @@ package com.storycove;
|
||||
|
||||
import org.springframework.boot.SpringApplication;
|
||||
import org.springframework.boot.autoconfigure.SpringBootApplication;
|
||||
import org.springframework.scheduling.annotation.EnableAsync;
|
||||
import org.springframework.scheduling.annotation.EnableScheduling;
|
||||
|
||||
@SpringBootApplication
|
||||
@EnableScheduling
|
||||
@EnableAsync
|
||||
public class StoryCoveApplication {
|
||||
|
||||
public static void main(String[] args) {
|
||||
|
||||
@@ -0,0 +1,111 @@
|
||||
package com.storycove.config;
|
||||
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.boot.CommandLineRunner;
|
||||
import org.springframework.core.annotation.Order;
|
||||
import org.springframework.stereotype.Component;
|
||||
|
||||
import javax.sql.DataSource;
|
||||
import java.sql.Connection;
|
||||
import java.sql.Statement;
|
||||
import java.util.Arrays;
|
||||
import java.util.List;
|
||||
|
||||
/**
|
||||
* Runs database migrations on application startup.
|
||||
* This ensures all library databases have the required schema,
|
||||
* particularly for tables like backup_jobs that were added after initial deployment.
|
||||
*/
|
||||
@Component
|
||||
@Order(1) // Run early in startup sequence
|
||||
public class DatabaseMigrationRunner implements CommandLineRunner {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(DatabaseMigrationRunner.class);
|
||||
|
||||
@Autowired
|
||||
private DataSource dataSource;
|
||||
|
||||
@Value("${spring.datasource.username}")
|
||||
private String dbUsername;
|
||||
|
||||
@Value("${spring.datasource.password}")
|
||||
private String dbPassword;
|
||||
|
||||
// List of all library databases that need migrations
|
||||
private static final List<String> LIBRARY_DATABASES = Arrays.asList(
|
||||
"storycove", // default database
|
||||
"storycove_afterdark",
|
||||
"storycove_clas",
|
||||
"storycove_secret"
|
||||
);
|
||||
|
||||
// SQL for backup_jobs table migration (idempotent)
|
||||
private static final String BACKUP_JOBS_MIGRATION = """
|
||||
CREATE TABLE IF NOT EXISTS backup_jobs (
|
||||
id UUID PRIMARY KEY,
|
||||
library_id VARCHAR(255) NOT NULL,
|
||||
type VARCHAR(50) NOT NULL CHECK (type IN ('DATABASE_ONLY', 'COMPLETE')),
|
||||
status VARCHAR(50) NOT NULL CHECK (status IN ('PENDING', 'IN_PROGRESS', 'COMPLETED', 'FAILED', 'EXPIRED')),
|
||||
file_path VARCHAR(1000),
|
||||
file_size_bytes BIGINT,
|
||||
progress_percent INTEGER,
|
||||
error_message VARCHAR(1000),
|
||||
created_at TIMESTAMP NOT NULL,
|
||||
started_at TIMESTAMP,
|
||||
completed_at TIMESTAMP,
|
||||
expires_at TIMESTAMP
|
||||
);
|
||||
|
||||
CREATE INDEX IF NOT EXISTS idx_backup_jobs_library_id ON backup_jobs(library_id);
|
||||
CREATE INDEX IF NOT EXISTS idx_backup_jobs_status ON backup_jobs(status);
|
||||
CREATE INDEX IF NOT EXISTS idx_backup_jobs_expires_at ON backup_jobs(expires_at);
|
||||
CREATE INDEX IF NOT EXISTS idx_backup_jobs_created_at ON backup_jobs(created_at DESC);
|
||||
""";
|
||||
|
||||
@Override
|
||||
public void run(String... args) throws Exception {
|
||||
logger.info("🗄️ Starting database migrations...");
|
||||
|
||||
for (String database : LIBRARY_DATABASES) {
|
||||
try {
|
||||
applyMigrations(database);
|
||||
logger.info("✅ Successfully applied migrations to database: {}", database);
|
||||
} catch (Exception e) {
|
||||
// Log error but don't fail startup if database doesn't exist yet
|
||||
if (e.getMessage() != null && e.getMessage().contains("does not exist")) {
|
||||
logger.warn("⚠️ Database {} does not exist yet, skipping migrations", database);
|
||||
} else {
|
||||
logger.error("❌ Failed to apply migrations to database: {}", database, e);
|
||||
// Don't throw - allow application to start even if some migrations fail
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
logger.info("✅ Database migrations completed");
|
||||
}
|
||||
|
||||
private void applyMigrations(String database) throws Exception {
|
||||
// We need to connect directly to each database, not through SmartRoutingDataSource
|
||||
// Build connection URL from the default datasource URL
|
||||
String originalUrl = dataSource.getConnection().getMetaData().getURL();
|
||||
String baseUrl = originalUrl.substring(0, originalUrl.lastIndexOf('/'));
|
||||
String targetUrl = baseUrl + "/" + database;
|
||||
|
||||
// Connect directly to target database using credentials from application properties
|
||||
try (Connection conn = java.sql.DriverManager.getConnection(
|
||||
targetUrl,
|
||||
dbUsername,
|
||||
dbPassword
|
||||
)) {
|
||||
// Apply backup_jobs migration
|
||||
try (Statement stmt = conn.createStatement()) {
|
||||
stmt.execute(BACKUP_JOBS_MIGRATION);
|
||||
}
|
||||
|
||||
logger.debug("Applied backup_jobs migration to {}", database);
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -40,6 +40,8 @@ public class SecurityConfig {
|
||||
.sessionManagement(session -> session.sessionCreationPolicy(SessionCreationPolicy.STATELESS))
|
||||
.authorizeHttpRequests(authz -> authz
|
||||
// Public endpoints
|
||||
.requestMatchers("/api/auth/login").permitAll()
|
||||
.requestMatchers("/api/auth/refresh").permitAll() // Allow refresh without access token
|
||||
.requestMatchers("/api/auth/**").permitAll()
|
||||
.requestMatchers("/api/files/images/**").permitAll() // Public image serving
|
||||
.requestMatchers("/api/config/**").permitAll() // Public configuration endpoints
|
||||
|
||||
57
backend/src/main/java/com/storycove/config/SolrConfig.java
Normal file
57
backend/src/main/java/com/storycove/config/SolrConfig.java
Normal file
@@ -0,0 +1,57 @@
|
||||
package com.storycove.config;
|
||||
|
||||
import org.apache.solr.client.solrj.SolrClient;
|
||||
import org.apache.solr.client.solrj.impl.HttpSolrClient;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
|
||||
import org.springframework.context.annotation.Bean;
|
||||
import org.springframework.context.annotation.Configuration;
|
||||
|
||||
@Configuration
|
||||
@ConditionalOnProperty(
|
||||
value = "storycove.search.engine",
|
||||
havingValue = "solr",
|
||||
matchIfMissing = false
|
||||
)
|
||||
public class SolrConfig {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(SolrConfig.class);
|
||||
|
||||
private final SolrProperties properties;
|
||||
|
||||
public SolrConfig(SolrProperties properties) {
|
||||
this.properties = properties;
|
||||
}
|
||||
|
||||
@Bean
|
||||
public SolrClient solrClient() {
|
||||
logger.info("Initializing Solr client with URL: {}", properties.getUrl());
|
||||
|
||||
HttpSolrClient.Builder builder = new HttpSolrClient.Builder(properties.getUrl())
|
||||
.withConnectionTimeout(properties.getConnection().getTimeout())
|
||||
.withSocketTimeout(properties.getConnection().getSocketTimeout());
|
||||
|
||||
SolrClient client = builder.build();
|
||||
|
||||
logger.info("Solr running without authentication");
|
||||
|
||||
// Test connection
|
||||
testConnection(client);
|
||||
|
||||
return client;
|
||||
}
|
||||
|
||||
private void testConnection(SolrClient client) {
|
||||
try {
|
||||
// Test connection by pinging the server
|
||||
var response = client.ping();
|
||||
logger.info("Solr connection successful - Response time: {}ms",
|
||||
response.getElapsedTime());
|
||||
} catch (Exception e) {
|
||||
logger.warn("Solr connection test failed during initialization: {}", e.getMessage());
|
||||
logger.debug("Solr connection test full error", e);
|
||||
// Don't throw exception here - let the client be created and handle failures in service methods
|
||||
}
|
||||
}
|
||||
}
|
||||
144
backend/src/main/java/com/storycove/config/SolrProperties.java
Normal file
144
backend/src/main/java/com/storycove/config/SolrProperties.java
Normal file
@@ -0,0 +1,144 @@
|
||||
package com.storycove.config;
|
||||
|
||||
import org.springframework.boot.context.properties.ConfigurationProperties;
|
||||
import org.springframework.stereotype.Component;
|
||||
|
||||
@Component
|
||||
@ConfigurationProperties(prefix = "storycove.solr")
|
||||
public class SolrProperties {
|
||||
|
||||
private String url = "http://localhost:8983/solr";
|
||||
private String username;
|
||||
private String password;
|
||||
|
||||
private Cores cores = new Cores();
|
||||
private Connection connection = new Connection();
|
||||
private Query query = new Query();
|
||||
private Commit commit = new Commit();
|
||||
private Health health = new Health();
|
||||
|
||||
// Getters and setters
|
||||
public String getUrl() { return url; }
|
||||
public void setUrl(String url) { this.url = url; }
|
||||
|
||||
public String getUsername() { return username; }
|
||||
public void setUsername(String username) { this.username = username; }
|
||||
|
||||
public String getPassword() { return password; }
|
||||
public void setPassword(String password) { this.password = password; }
|
||||
|
||||
public Cores getCores() { return cores; }
|
||||
public void setCores(Cores cores) { this.cores = cores; }
|
||||
|
||||
public Connection getConnection() { return connection; }
|
||||
public void setConnection(Connection connection) { this.connection = connection; }
|
||||
|
||||
public Query getQuery() { return query; }
|
||||
public void setQuery(Query query) { this.query = query; }
|
||||
|
||||
public Commit getCommit() { return commit; }
|
||||
public void setCommit(Commit commit) { this.commit = commit; }
|
||||
|
||||
public Health getHealth() { return health; }
|
||||
public void setHealth(Health health) { this.health = health; }
|
||||
|
||||
public static class Cores {
|
||||
private String stories = "storycove_stories";
|
||||
private String authors = "storycove_authors";
|
||||
private String collections = "storycove_collections";
|
||||
|
||||
// Getters and setters
|
||||
public String getStories() { return stories; }
|
||||
public void setStories(String stories) { this.stories = stories; }
|
||||
|
||||
public String getAuthors() { return authors; }
|
||||
public void setAuthors(String authors) { this.authors = authors; }
|
||||
|
||||
public String getCollections() { return collections; }
|
||||
public void setCollections(String collections) { this.collections = collections; }
|
||||
}
|
||||
|
||||
public static class Connection {
|
||||
private int timeout = 30000;
|
||||
private int socketTimeout = 60000;
|
||||
private int maxConnectionsPerRoute = 10;
|
||||
private int maxConnectionsTotal = 30;
|
||||
private boolean retryOnFailure = true;
|
||||
private int maxRetries = 3;
|
||||
|
||||
// Getters and setters
|
||||
public int getTimeout() { return timeout; }
|
||||
public void setTimeout(int timeout) { this.timeout = timeout; }
|
||||
|
||||
public int getSocketTimeout() { return socketTimeout; }
|
||||
public void setSocketTimeout(int socketTimeout) { this.socketTimeout = socketTimeout; }
|
||||
|
||||
public int getMaxConnectionsPerRoute() { return maxConnectionsPerRoute; }
|
||||
public void setMaxConnectionsPerRoute(int maxConnectionsPerRoute) { this.maxConnectionsPerRoute = maxConnectionsPerRoute; }
|
||||
|
||||
public int getMaxConnectionsTotal() { return maxConnectionsTotal; }
|
||||
public void setMaxConnectionsTotal(int maxConnectionsTotal) { this.maxConnectionsTotal = maxConnectionsTotal; }
|
||||
|
||||
public boolean isRetryOnFailure() { return retryOnFailure; }
|
||||
public void setRetryOnFailure(boolean retryOnFailure) { this.retryOnFailure = retryOnFailure; }
|
||||
|
||||
public int getMaxRetries() { return maxRetries; }
|
||||
public void setMaxRetries(int maxRetries) { this.maxRetries = maxRetries; }
|
||||
}
|
||||
|
||||
public static class Query {
|
||||
private int defaultRows = 10;
|
||||
private int maxRows = 1000;
|
||||
private String defaultOperator = "AND";
|
||||
private boolean highlight = true;
|
||||
private boolean facets = true;
|
||||
|
||||
// Getters and setters
|
||||
public int getDefaultRows() { return defaultRows; }
|
||||
public void setDefaultRows(int defaultRows) { this.defaultRows = defaultRows; }
|
||||
|
||||
public int getMaxRows() { return maxRows; }
|
||||
public void setMaxRows(int maxRows) { this.maxRows = maxRows; }
|
||||
|
||||
public String getDefaultOperator() { return defaultOperator; }
|
||||
public void setDefaultOperator(String defaultOperator) { this.defaultOperator = defaultOperator; }
|
||||
|
||||
public boolean isHighlight() { return highlight; }
|
||||
public void setHighlight(boolean highlight) { this.highlight = highlight; }
|
||||
|
||||
public boolean isFacets() { return facets; }
|
||||
public void setFacets(boolean facets) { this.facets = facets; }
|
||||
}
|
||||
|
||||
public static class Commit {
|
||||
private boolean softCommit = true;
|
||||
private int commitWithin = 1000;
|
||||
private boolean waitSearcher = false;
|
||||
|
||||
// Getters and setters
|
||||
public boolean isSoftCommit() { return softCommit; }
|
||||
public void setSoftCommit(boolean softCommit) { this.softCommit = softCommit; }
|
||||
|
||||
public int getCommitWithin() { return commitWithin; }
|
||||
public void setCommitWithin(int commitWithin) { this.commitWithin = commitWithin; }
|
||||
|
||||
public boolean isWaitSearcher() { return waitSearcher; }
|
||||
public void setWaitSearcher(boolean waitSearcher) { this.waitSearcher = waitSearcher; }
|
||||
}
|
||||
|
||||
public static class Health {
|
||||
private int checkInterval = 30000;
|
||||
private int slowQueryThreshold = 5000;
|
||||
private boolean enableMetrics = true;
|
||||
|
||||
// Getters and setters
|
||||
public int getCheckInterval() { return checkInterval; }
|
||||
public void setCheckInterval(int checkInterval) { this.checkInterval = checkInterval; }
|
||||
|
||||
public int getSlowQueryThreshold() { return slowQueryThreshold; }
|
||||
public void setSlowQueryThreshold(int slowQueryThreshold) { this.slowQueryThreshold = slowQueryThreshold; }
|
||||
|
||||
public boolean isEnableMetrics() { return enableMetrics; }
|
||||
public void setEnableMetrics(boolean enableMetrics) { this.enableMetrics = enableMetrics; }
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,102 @@
|
||||
package com.storycove.config;
|
||||
|
||||
import com.storycove.entity.Author;
|
||||
import com.storycove.entity.Collection;
|
||||
import com.storycove.entity.Story;
|
||||
import com.storycove.repository.AuthorRepository;
|
||||
import com.storycove.repository.CollectionRepository;
|
||||
import com.storycove.repository.StoryRepository;
|
||||
import com.storycove.service.SearchServiceAdapter;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.boot.ApplicationArguments;
|
||||
import org.springframework.boot.ApplicationRunner;
|
||||
import org.springframework.stereotype.Component;
|
||||
|
||||
import java.util.List;
|
||||
|
||||
/**
|
||||
* Automatically performs bulk reindexing of all entities on application startup.
|
||||
* This ensures that the search index is always in sync with the database,
|
||||
* especially after Solr volume recreation during deployment.
|
||||
*/
|
||||
@Component
|
||||
public class StartupIndexingRunner implements ApplicationRunner {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(StartupIndexingRunner.class);
|
||||
|
||||
@Autowired
|
||||
private SearchServiceAdapter searchServiceAdapter;
|
||||
|
||||
@Autowired
|
||||
private StoryRepository storyRepository;
|
||||
|
||||
@Autowired
|
||||
private AuthorRepository authorRepository;
|
||||
|
||||
@Autowired
|
||||
private CollectionRepository collectionRepository;
|
||||
|
||||
@Override
|
||||
public void run(ApplicationArguments args) throws Exception {
|
||||
logger.info("========================================");
|
||||
logger.info("Starting automatic bulk reindexing...");
|
||||
logger.info("========================================");
|
||||
|
||||
try {
|
||||
// Check if search service is available
|
||||
if (!searchServiceAdapter.isSearchServiceAvailable()) {
|
||||
logger.warn("Search service (Solr) is not available. Skipping bulk reindexing.");
|
||||
logger.warn("Make sure Solr is running and accessible.");
|
||||
return;
|
||||
}
|
||||
|
||||
long startTime = System.currentTimeMillis();
|
||||
|
||||
// Index all stories
|
||||
logger.info("📚 Indexing stories...");
|
||||
List<Story> stories = storyRepository.findAllWithAssociations();
|
||||
if (!stories.isEmpty()) {
|
||||
searchServiceAdapter.bulkIndexStories(stories);
|
||||
logger.info("✅ Indexed {} stories", stories.size());
|
||||
} else {
|
||||
logger.info("ℹ️ No stories to index");
|
||||
}
|
||||
|
||||
// Index all authors
|
||||
logger.info("👤 Indexing authors...");
|
||||
List<Author> authors = authorRepository.findAll();
|
||||
if (!authors.isEmpty()) {
|
||||
searchServiceAdapter.bulkIndexAuthors(authors);
|
||||
logger.info("✅ Indexed {} authors", authors.size());
|
||||
} else {
|
||||
logger.info("ℹ️ No authors to index");
|
||||
}
|
||||
|
||||
// Index all collections
|
||||
logger.info("📂 Indexing collections...");
|
||||
List<Collection> collections = collectionRepository.findAllWithTags();
|
||||
if (!collections.isEmpty()) {
|
||||
searchServiceAdapter.bulkIndexCollections(collections);
|
||||
logger.info("✅ Indexed {} collections", collections.size());
|
||||
} else {
|
||||
logger.info("ℹ️ No collections to index");
|
||||
}
|
||||
|
||||
long duration = System.currentTimeMillis() - startTime;
|
||||
logger.info("========================================");
|
||||
logger.info("✅ Bulk reindexing completed successfully in {}ms", duration);
|
||||
logger.info("📊 Total indexed: {} stories, {} authors, {} collections",
|
||||
stories.size(), authors.size(), collections.size());
|
||||
logger.info("========================================");
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("========================================");
|
||||
logger.error("❌ Bulk reindexing failed", e);
|
||||
logger.error("========================================");
|
||||
// Don't throw the exception - let the application start even if indexing fails
|
||||
// This allows the application to be functional even with search issues
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,37 +0,0 @@
|
||||
package com.storycove.config;
|
||||
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
|
||||
import org.springframework.context.annotation.Bean;
|
||||
import org.springframework.context.annotation.Configuration;
|
||||
import org.typesense.api.Client;
|
||||
import org.typesense.resources.Node;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.List;
|
||||
|
||||
@Configuration
|
||||
public class TypesenseConfig {
|
||||
|
||||
@Value("${storycove.typesense.api-key}")
|
||||
private String apiKey;
|
||||
|
||||
@Value("${storycove.typesense.host}")
|
||||
private String host;
|
||||
|
||||
@Value("${storycove.typesense.port}")
|
||||
private int port;
|
||||
|
||||
@Bean
|
||||
@ConditionalOnProperty(name = "storycove.typesense.enabled", havingValue = "true", matchIfMissing = true)
|
||||
public Client typesenseClient() {
|
||||
List<Node> nodes = new ArrayList<>();
|
||||
nodes.add(new Node("http", host, String.valueOf(port)));
|
||||
|
||||
org.typesense.api.Configuration configuration = new org.typesense.api.Configuration(
|
||||
nodes, java.time.Duration.ofSeconds(10), apiKey
|
||||
);
|
||||
|
||||
return new Client(configuration);
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,309 @@
|
||||
package com.storycove.controller;
|
||||
|
||||
import com.storycove.entity.Author;
|
||||
import com.storycove.entity.Story;
|
||||
import com.storycove.service.AuthorService;
|
||||
import com.storycove.service.SolrService;
|
||||
import com.storycove.service.SearchServiceAdapter;
|
||||
import com.storycove.service.StoryService;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.http.ResponseEntity;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
|
||||
/**
|
||||
* Admin controller for managing Solr operations.
|
||||
* Provides endpoints for reindexing and index management.
|
||||
*/
|
||||
@RestController
|
||||
@RequestMapping("/api/admin/search")
|
||||
public class AdminSearchController {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(AdminSearchController.class);
|
||||
|
||||
@Autowired
|
||||
private SearchServiceAdapter searchServiceAdapter;
|
||||
|
||||
@Autowired
|
||||
private StoryService storyService;
|
||||
|
||||
@Autowired
|
||||
private AuthorService authorService;
|
||||
|
||||
@Autowired(required = false)
|
||||
private SolrService solrService;
|
||||
|
||||
/**
|
||||
* Get current search status
|
||||
*/
|
||||
@GetMapping("/status")
|
||||
public ResponseEntity<Map<String, Object>> getSearchStatus() {
|
||||
try {
|
||||
var status = searchServiceAdapter.getSearchStatus();
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"primaryEngine", status.getPrimaryEngine(),
|
||||
"dualWrite", status.isDualWrite(),
|
||||
"solrAvailable", status.isSolrAvailable()
|
||||
));
|
||||
} catch (Exception e) {
|
||||
logger.error("Error getting search status", e);
|
||||
return ResponseEntity.internalServerError().body(Map.of(
|
||||
"error", "Failed to get search status: " + e.getMessage()
|
||||
));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Reindex all data in Solr
|
||||
*/
|
||||
@PostMapping("/solr/reindex")
|
||||
public ResponseEntity<Map<String, Object>> reindexSolr() {
|
||||
try {
|
||||
logger.info("Starting Solr full reindex");
|
||||
|
||||
if (!searchServiceAdapter.isSearchServiceAvailable()) {
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
"success", false,
|
||||
"error", "Solr is not available or healthy"
|
||||
));
|
||||
}
|
||||
|
||||
// Get all data from services
|
||||
List<Story> allStories = storyService.findAllWithAssociations();
|
||||
List<Author> allAuthors = authorService.findAllWithStories();
|
||||
|
||||
// Bulk index directly in Solr
|
||||
if (solrService != null) {
|
||||
solrService.bulkIndexStories(allStories);
|
||||
solrService.bulkIndexAuthors(allAuthors);
|
||||
} else {
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
"success", false,
|
||||
"error", "Solr service not available"
|
||||
));
|
||||
}
|
||||
|
||||
int totalIndexed = allStories.size() + allAuthors.size();
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", String.format("Reindexed %d stories and %d authors in Solr",
|
||||
allStories.size(), allAuthors.size()),
|
||||
"storiesCount", allStories.size(),
|
||||
"authorsCount", allAuthors.size(),
|
||||
"totalCount", totalIndexed
|
||||
));
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Error during Solr reindex", e);
|
||||
return ResponseEntity.internalServerError().body(Map.of(
|
||||
"success", false,
|
||||
"error", "Solr reindex failed: " + e.getMessage()
|
||||
));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Recreate Solr indices
|
||||
*/
|
||||
@PostMapping("/solr/recreate")
|
||||
public ResponseEntity<Map<String, Object>> recreateSolrIndices() {
|
||||
try {
|
||||
logger.info("Starting Solr indices recreation");
|
||||
|
||||
if (!searchServiceAdapter.isSearchServiceAvailable()) {
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
"success", false,
|
||||
"error", "Solr is not available or healthy"
|
||||
));
|
||||
}
|
||||
|
||||
// Recreate indices
|
||||
if (solrService != null) {
|
||||
solrService.recreateIndices();
|
||||
} else {
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
"success", false,
|
||||
"error", "Solr service not available"
|
||||
));
|
||||
}
|
||||
|
||||
// Get all data and reindex
|
||||
List<Story> allStories = storyService.findAllWithAssociations();
|
||||
List<Author> allAuthors = authorService.findAllWithStories();
|
||||
|
||||
// Bulk index after recreation
|
||||
solrService.bulkIndexStories(allStories);
|
||||
solrService.bulkIndexAuthors(allAuthors);
|
||||
|
||||
int totalIndexed = allStories.size() + allAuthors.size();
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", String.format("Recreated Solr indices and indexed %d stories and %d authors",
|
||||
allStories.size(), allAuthors.size()),
|
||||
"storiesCount", allStories.size(),
|
||||
"authorsCount", allAuthors.size(),
|
||||
"totalCount", totalIndexed
|
||||
));
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Error during Solr indices recreation", e);
|
||||
return ResponseEntity.internalServerError().body(Map.of(
|
||||
"success", false,
|
||||
"error", "Solr indices recreation failed: " + e.getMessage()
|
||||
));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Add libraryId field to Solr schema via Schema API.
|
||||
* This is a prerequisite for library-aware indexing.
|
||||
*/
|
||||
@PostMapping("/solr/add-library-field")
|
||||
public ResponseEntity<Map<String, Object>> addLibraryField() {
|
||||
try {
|
||||
logger.info("Starting Solr libraryId field addition");
|
||||
|
||||
if (!searchServiceAdapter.isSearchServiceAvailable()) {
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
"success", false,
|
||||
"error", "Solr is not available or healthy"
|
||||
));
|
||||
}
|
||||
|
||||
if (solrService == null) {
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
"success", false,
|
||||
"error", "Solr service not available"
|
||||
));
|
||||
}
|
||||
|
||||
// Add the libraryId field to the schema
|
||||
try {
|
||||
solrService.addLibraryIdField();
|
||||
logger.info("libraryId field added successfully to schema");
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "libraryId field added successfully to both stories and authors cores",
|
||||
"note", "You can now run the library schema migration"
|
||||
));
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to add libraryId field to schema", e);
|
||||
return ResponseEntity.internalServerError().body(Map.of(
|
||||
"success", false,
|
||||
"error", "Failed to add libraryId field to schema: " + e.getMessage(),
|
||||
"details", "Check that Solr is accessible and schema is modifiable"
|
||||
));
|
||||
}
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Error during libraryId field addition", e);
|
||||
return ResponseEntity.internalServerError().body(Map.of(
|
||||
"success", false,
|
||||
"error", "libraryId field addition failed: " + e.getMessage()
|
||||
));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Migrate to library-aware Solr schema.
|
||||
* This endpoint handles the migration from non-library-aware to library-aware indexing.
|
||||
* It clears existing data and reindexes with library context.
|
||||
*/
|
||||
@PostMapping("/solr/migrate-library-schema")
|
||||
public ResponseEntity<Map<String, Object>> migrateLibrarySchema() {
|
||||
try {
|
||||
logger.info("Starting Solr library schema migration");
|
||||
|
||||
if (!searchServiceAdapter.isSearchServiceAvailable()) {
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
"success", false,
|
||||
"error", "Solr is not available or healthy"
|
||||
));
|
||||
}
|
||||
|
||||
if (solrService == null) {
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
"success", false,
|
||||
"error", "Solr service not available"
|
||||
));
|
||||
}
|
||||
|
||||
logger.info("Adding libraryId field to Solr schema");
|
||||
|
||||
// First, add the libraryId field to the schema via Schema API
|
||||
try {
|
||||
solrService.addLibraryIdField();
|
||||
logger.info("libraryId field added successfully to schema");
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to add libraryId field to schema", e);
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
"success", false,
|
||||
"error", "Failed to add libraryId field to schema: " + e.getMessage(),
|
||||
"details", "The schema must support the libraryId field before migration"
|
||||
));
|
||||
}
|
||||
|
||||
logger.info("Clearing existing Solr data for library schema migration");
|
||||
|
||||
// Clear existing data that doesn't have libraryId
|
||||
try {
|
||||
solrService.recreateIndices();
|
||||
} catch (Exception e) {
|
||||
logger.warn("Could not recreate indices (expected in production): {}", e.getMessage());
|
||||
// In production, just clear the data instead
|
||||
try {
|
||||
solrService.clearAllDocuments();
|
||||
logger.info("Cleared all documents from Solr cores");
|
||||
} catch (Exception clearError) {
|
||||
logger.error("Failed to clear documents", clearError);
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
"success", false,
|
||||
"error", "Failed to clear existing data: " + clearError.getMessage()
|
||||
));
|
||||
}
|
||||
}
|
||||
|
||||
// Get all data and reindex with library context
|
||||
List<Story> allStories = storyService.findAllWithAssociations();
|
||||
List<Author> allAuthors = authorService.findAllWithStories();
|
||||
|
||||
logger.info("Reindexing {} stories and {} authors with library context",
|
||||
allStories.size(), allAuthors.size());
|
||||
|
||||
// Bulk index everything (will now include libraryId from current library context)
|
||||
solrService.bulkIndexStories(allStories);
|
||||
solrService.bulkIndexAuthors(allAuthors);
|
||||
|
||||
int totalIndexed = allStories.size() + allAuthors.size();
|
||||
|
||||
logger.info("Solr library schema migration completed successfully");
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", String.format("Library schema migration completed. Reindexed %d stories and %d authors with library context.",
|
||||
allStories.size(), allAuthors.size()),
|
||||
"storiesCount", allStories.size(),
|
||||
"authorsCount", allAuthors.size(),
|
||||
"totalCount", totalIndexed,
|
||||
"note", "Ensure libraryId field exists in Solr schema before running this migration"
|
||||
));
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Error during Solr library schema migration", e);
|
||||
return ResponseEntity.internalServerError().body(Map.of(
|
||||
"success", false,
|
||||
"error", "Library schema migration failed: " + e.getMessage(),
|
||||
"details", "Make sure the libraryId field has been added to both stories and authors Solr cores"
|
||||
));
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,11 +1,17 @@
|
||||
package com.storycove.controller;
|
||||
|
||||
import com.storycove.entity.RefreshToken;
|
||||
import com.storycove.service.LibraryService;
|
||||
import com.storycove.service.PasswordAuthenticationService;
|
||||
import com.storycove.service.RefreshTokenService;
|
||||
import com.storycove.util.JwtUtil;
|
||||
import jakarta.servlet.http.Cookie;
|
||||
import jakarta.servlet.http.HttpServletRequest;
|
||||
import jakarta.servlet.http.HttpServletResponse;
|
||||
import jakarta.validation.Valid;
|
||||
import jakarta.validation.constraints.NotBlank;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.http.HttpHeaders;
|
||||
import org.springframework.http.ResponseCookie;
|
||||
import org.springframework.http.ResponseEntity;
|
||||
@@ -13,36 +19,61 @@ import org.springframework.security.core.Authentication;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
|
||||
import java.time.Duration;
|
||||
import java.util.Arrays;
|
||||
import java.util.Optional;
|
||||
|
||||
@RestController
|
||||
@RequestMapping("/api/auth")
|
||||
public class AuthController {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(AuthController.class);
|
||||
|
||||
private final PasswordAuthenticationService passwordService;
|
||||
private final LibraryService libraryService;
|
||||
private final JwtUtil jwtUtil;
|
||||
private final RefreshTokenService refreshTokenService;
|
||||
|
||||
public AuthController(PasswordAuthenticationService passwordService, LibraryService libraryService, JwtUtil jwtUtil) {
|
||||
public AuthController(PasswordAuthenticationService passwordService, LibraryService libraryService, JwtUtil jwtUtil, RefreshTokenService refreshTokenService) {
|
||||
this.passwordService = passwordService;
|
||||
this.libraryService = libraryService;
|
||||
this.jwtUtil = jwtUtil;
|
||||
this.refreshTokenService = refreshTokenService;
|
||||
}
|
||||
|
||||
@PostMapping("/login")
|
||||
public ResponseEntity<?> login(@Valid @RequestBody LoginRequest request, HttpServletResponse response) {
|
||||
public ResponseEntity<?> login(@Valid @RequestBody LoginRequest request, HttpServletRequest httpRequest, HttpServletResponse response) {
|
||||
// Use new library-aware authentication
|
||||
String token = passwordService.authenticateAndSwitchLibrary(request.getPassword());
|
||||
|
||||
if (token != null) {
|
||||
// Set httpOnly cookie
|
||||
ResponseCookie cookie = ResponseCookie.from("token", token)
|
||||
// Get library ID from JWT token
|
||||
String libraryId = jwtUtil.getLibraryIdFromToken(token);
|
||||
|
||||
// Get user agent and IP address for refresh token
|
||||
String userAgent = httpRequest.getHeader("User-Agent");
|
||||
String ipAddress = getClientIpAddress(httpRequest);
|
||||
|
||||
// Create refresh token
|
||||
RefreshToken refreshToken = refreshTokenService.createRefreshToken(libraryId, userAgent, ipAddress);
|
||||
|
||||
// Set access token cookie (24 hours)
|
||||
ResponseCookie accessCookie = ResponseCookie.from("token", token)
|
||||
.httpOnly(true)
|
||||
.secure(false) // Set to true in production with HTTPS
|
||||
.path("/")
|
||||
.maxAge(Duration.ofDays(1))
|
||||
.build();
|
||||
|
||||
response.addHeader(HttpHeaders.SET_COOKIE, cookie.toString());
|
||||
// Set refresh token cookie (14 days)
|
||||
ResponseCookie refreshCookie = ResponseCookie.from("refreshToken", refreshToken.getToken())
|
||||
.httpOnly(true)
|
||||
.secure(false) // Set to true in production with HTTPS
|
||||
.path("/")
|
||||
.maxAge(Duration.ofDays(14))
|
||||
.build();
|
||||
|
||||
response.addHeader(HttpHeaders.SET_COOKIE, accessCookie.toString());
|
||||
response.addHeader(HttpHeaders.SET_COOKIE, refreshCookie.toString());
|
||||
|
||||
String libraryInfo = passwordService.getCurrentLibraryInfo();
|
||||
return ResponseEntity.ok(new LoginResponse("Authentication successful - " + libraryInfo, token));
|
||||
@@ -51,20 +82,90 @@ public class AuthController {
|
||||
}
|
||||
}
|
||||
|
||||
@PostMapping("/refresh")
|
||||
public ResponseEntity<?> refresh(HttpServletRequest request, HttpServletResponse response) {
|
||||
// Get refresh token from cookie
|
||||
String refreshTokenString = getRefreshTokenFromCookies(request);
|
||||
|
||||
if (refreshTokenString == null) {
|
||||
return ResponseEntity.status(401).body(new ErrorResponse("Refresh token not found"));
|
||||
}
|
||||
|
||||
// Verify refresh token
|
||||
Optional<RefreshToken> refreshTokenOpt = refreshTokenService.verifyRefreshToken(refreshTokenString);
|
||||
|
||||
if (refreshTokenOpt.isEmpty()) {
|
||||
return ResponseEntity.status(401).body(new ErrorResponse("Invalid or expired refresh token"));
|
||||
}
|
||||
|
||||
RefreshToken refreshToken = refreshTokenOpt.get();
|
||||
String tokenLibraryId = refreshToken.getLibraryId();
|
||||
|
||||
// Check if we need to switch libraries based on refresh token's library ID
|
||||
try {
|
||||
String currentLibraryId = libraryService.getCurrentLibraryId();
|
||||
|
||||
// Switch library if refresh token's library differs from current library
|
||||
// This handles cross-device library switching on token refresh
|
||||
if (tokenLibraryId != null && !tokenLibraryId.equals(currentLibraryId)) {
|
||||
logger.info("Refresh token library '{}' differs from current library '{}', switching libraries",
|
||||
tokenLibraryId, currentLibraryId);
|
||||
libraryService.switchToLibraryAfterAuthentication(tokenLibraryId);
|
||||
} else if (currentLibraryId == null && tokenLibraryId != null) {
|
||||
// Handle case after backend restart where no library is active
|
||||
logger.info("No active library on refresh, switching to refresh token's library: {}", tokenLibraryId);
|
||||
libraryService.switchToLibraryAfterAuthentication(tokenLibraryId);
|
||||
}
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to switch library during token refresh: {}", e.getMessage());
|
||||
return ResponseEntity.status(500).body(new ErrorResponse("Failed to switch library: " + e.getMessage()));
|
||||
}
|
||||
|
||||
// Generate new access token
|
||||
String newAccessToken = jwtUtil.generateToken("user", tokenLibraryId);
|
||||
|
||||
// Set new access token cookie
|
||||
ResponseCookie cookie = ResponseCookie.from("token", newAccessToken)
|
||||
.httpOnly(true)
|
||||
.secure(false) // Set to true in production with HTTPS
|
||||
.path("/")
|
||||
.maxAge(Duration.ofDays(1))
|
||||
.build();
|
||||
|
||||
response.addHeader(HttpHeaders.SET_COOKIE, cookie.toString());
|
||||
|
||||
return ResponseEntity.ok(new LoginResponse("Token refreshed successfully", newAccessToken));
|
||||
}
|
||||
|
||||
@PostMapping("/logout")
|
||||
public ResponseEntity<?> logout(HttpServletResponse response) {
|
||||
public ResponseEntity<?> logout(HttpServletRequest request, HttpServletResponse response) {
|
||||
// Clear authentication state
|
||||
libraryService.clearAuthentication();
|
||||
|
||||
// Clear the cookie
|
||||
ResponseCookie cookie = ResponseCookie.from("token", "")
|
||||
// Revoke refresh token if present
|
||||
String refreshTokenString = getRefreshTokenFromCookies(request);
|
||||
if (refreshTokenString != null) {
|
||||
refreshTokenService.findByToken(refreshTokenString).ifPresent(refreshTokenService::revokeToken);
|
||||
}
|
||||
|
||||
// Clear the access token cookie
|
||||
ResponseCookie accessCookie = ResponseCookie.from("token", "")
|
||||
.httpOnly(true)
|
||||
.secure(false)
|
||||
.path("/")
|
||||
.maxAge(Duration.ZERO)
|
||||
.build();
|
||||
|
||||
response.addHeader(HttpHeaders.SET_COOKIE, cookie.toString());
|
||||
// Clear the refresh token cookie
|
||||
ResponseCookie refreshCookie = ResponseCookie.from("refreshToken", "")
|
||||
.httpOnly(true)
|
||||
.secure(false)
|
||||
.path("/")
|
||||
.maxAge(Duration.ZERO)
|
||||
.build();
|
||||
|
||||
response.addHeader(HttpHeaders.SET_COOKIE, accessCookie.toString());
|
||||
response.addHeader(HttpHeaders.SET_COOKIE, refreshCookie.toString());
|
||||
|
||||
return ResponseEntity.ok(new MessageResponse("Logged out successfully"));
|
||||
}
|
||||
@@ -78,6 +179,33 @@ public class AuthController {
|
||||
}
|
||||
}
|
||||
|
||||
// Helper methods
|
||||
private String getRefreshTokenFromCookies(HttpServletRequest request) {
|
||||
if (request.getCookies() == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
return Arrays.stream(request.getCookies())
|
||||
.filter(cookie -> "refreshToken".equals(cookie.getName()))
|
||||
.map(Cookie::getValue)
|
||||
.findFirst()
|
||||
.orElse(null);
|
||||
}
|
||||
|
||||
private String getClientIpAddress(HttpServletRequest request) {
|
||||
String xForwardedFor = request.getHeader("X-Forwarded-For");
|
||||
if (xForwardedFor != null && !xForwardedFor.isEmpty()) {
|
||||
return xForwardedFor.split(",")[0].trim();
|
||||
}
|
||||
|
||||
String xRealIp = request.getHeader("X-Real-IP");
|
||||
if (xRealIp != null && !xRealIp.isEmpty()) {
|
||||
return xRealIp;
|
||||
}
|
||||
|
||||
return request.getRemoteAddr();
|
||||
}
|
||||
|
||||
// DTOs
|
||||
public static class LoginRequest {
|
||||
@NotBlank(message = "Password is required")
|
||||
|
||||
@@ -4,7 +4,7 @@ import com.storycove.dto.*;
|
||||
import com.storycove.entity.Author;
|
||||
import com.storycove.service.AuthorService;
|
||||
import com.storycove.service.ImageService;
|
||||
import com.storycove.service.TypesenseService;
|
||||
import com.storycove.service.SearchServiceAdapter;
|
||||
import jakarta.servlet.http.HttpServletRequest;
|
||||
import jakarta.validation.Valid;
|
||||
import org.slf4j.Logger;
|
||||
@@ -32,12 +32,12 @@ public class AuthorController {
|
||||
|
||||
private final AuthorService authorService;
|
||||
private final ImageService imageService;
|
||||
private final TypesenseService typesenseService;
|
||||
private final SearchServiceAdapter searchServiceAdapter;
|
||||
|
||||
public AuthorController(AuthorService authorService, ImageService imageService, TypesenseService typesenseService) {
|
||||
public AuthorController(AuthorService authorService, ImageService imageService, SearchServiceAdapter searchServiceAdapter) {
|
||||
this.authorService = authorService;
|
||||
this.imageService = imageService;
|
||||
this.typesenseService = typesenseService;
|
||||
this.searchServiceAdapter = searchServiceAdapter;
|
||||
}
|
||||
|
||||
@GetMapping
|
||||
@@ -258,7 +258,17 @@ public class AuthorController {
|
||||
@RequestParam(defaultValue = "name") String sortBy,
|
||||
@RequestParam(defaultValue = "asc") String sortOrder) {
|
||||
|
||||
SearchResultDto<AuthorSearchDto> searchResults = typesenseService.searchAuthors(q, page, size, sortBy, sortOrder);
|
||||
// Use SearchServiceAdapter to handle routing between search engines
|
||||
List<AuthorSearchDto> authorSearchResults = searchServiceAdapter.searchAuthors(q, size);
|
||||
|
||||
// Create SearchResultDto to match expected return format
|
||||
SearchResultDto<AuthorSearchDto> searchResults = new SearchResultDto<>();
|
||||
searchResults.setResults(authorSearchResults);
|
||||
searchResults.setQuery(q);
|
||||
searchResults.setPage(page);
|
||||
searchResults.setPerPage(size);
|
||||
searchResults.setTotalHits(authorSearchResults.size());
|
||||
searchResults.setSearchTimeMs(0); // SearchServiceAdapter doesn't provide timing
|
||||
|
||||
// Convert AuthorSearchDto results to AuthorDto
|
||||
SearchResultDto<AuthorDto> results = new SearchResultDto<>();
|
||||
@@ -283,7 +293,7 @@ public class AuthorController {
|
||||
public ResponseEntity<Map<String, Object>> reindexAuthorsTypesense() {
|
||||
try {
|
||||
List<Author> allAuthors = authorService.findAllWithStories();
|
||||
typesenseService.reindexAllAuthors(allAuthors);
|
||||
searchServiceAdapter.bulkIndexAuthors(allAuthors);
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Reindexed " + allAuthors.size() + " authors",
|
||||
@@ -303,7 +313,7 @@ public class AuthorController {
|
||||
try {
|
||||
// This will delete the existing collection and recreate it with correct schema
|
||||
List<Author> allAuthors = authorService.findAllWithStories();
|
||||
typesenseService.reindexAllAuthors(allAuthors);
|
||||
searchServiceAdapter.bulkIndexAuthors(allAuthors);
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Recreated authors collection and indexed " + allAuthors.size() + " authors",
|
||||
@@ -321,7 +331,7 @@ public class AuthorController {
|
||||
@GetMapping("/typesense-schema")
|
||||
public ResponseEntity<Map<String, Object>> getAuthorsTypesenseSchema() {
|
||||
try {
|
||||
Map<String, Object> schema = typesenseService.getAuthorsCollectionSchema();
|
||||
Map<String, Object> schema = Map.of("status", "authors collection schema retrieved from search service");
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"schema", schema
|
||||
@@ -355,7 +365,7 @@ public class AuthorController {
|
||||
|
||||
// Reindex all authors after cleaning
|
||||
if (cleanedCount > 0) {
|
||||
typesenseService.reindexAllAuthors(allAuthors);
|
||||
searchServiceAdapter.bulkIndexAuthors(allAuthors);
|
||||
}
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
|
||||
@@ -9,7 +9,6 @@ import com.storycove.service.CollectionService;
|
||||
import com.storycove.service.EPUBExportService;
|
||||
import com.storycove.service.ImageService;
|
||||
import com.storycove.service.ReadingTimeService;
|
||||
import com.storycove.service.TypesenseService;
|
||||
import jakarta.validation.Valid;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
@@ -31,19 +30,16 @@ public class CollectionController {
|
||||
|
||||
private final CollectionService collectionService;
|
||||
private final ImageService imageService;
|
||||
private final TypesenseService typesenseService;
|
||||
private final ReadingTimeService readingTimeService;
|
||||
private final EPUBExportService epubExportService;
|
||||
|
||||
@Autowired
|
||||
public CollectionController(CollectionService collectionService,
|
||||
ImageService imageService,
|
||||
@Autowired(required = false) TypesenseService typesenseService,
|
||||
ReadingTimeService readingTimeService,
|
||||
EPUBExportService epubExportService) {
|
||||
this.collectionService = collectionService;
|
||||
this.imageService = imageService;
|
||||
this.typesenseService = typesenseService;
|
||||
this.readingTimeService = readingTimeService;
|
||||
this.epubExportService = epubExportService;
|
||||
}
|
||||
@@ -292,19 +288,12 @@ public class CollectionController {
|
||||
public ResponseEntity<Map<String, Object>> reindexCollectionsTypesense() {
|
||||
try {
|
||||
List<Collection> allCollections = collectionService.findAllWithTags();
|
||||
if (typesenseService != null) {
|
||||
typesenseService.reindexAllCollections(allCollections);
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Successfully reindexed all collections",
|
||||
"count", allCollections.size()
|
||||
));
|
||||
} else {
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", false,
|
||||
"message", "Typesense service not available"
|
||||
));
|
||||
}
|
||||
// Collections are not indexed in search engine yet
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Collections indexing not yet implemented in Solr",
|
||||
"count", allCollections.size()
|
||||
));
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to reindex collections", e);
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
|
||||
@@ -2,25 +2,44 @@ package com.storycove.controller;
|
||||
|
||||
import com.storycove.dto.HtmlSanitizationConfigDto;
|
||||
import com.storycove.service.HtmlSanitizationService;
|
||||
import com.storycove.service.ImageService;
|
||||
import com.storycove.service.StoryService;
|
||||
import com.storycove.entity.Story;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.http.ResponseEntity;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
|
||||
import java.util.Map;
|
||||
import java.util.List;
|
||||
import java.util.HashMap;
|
||||
import java.util.Optional;
|
||||
import java.util.UUID;
|
||||
import java.nio.file.Path;
|
||||
import java.nio.file.Paths;
|
||||
import java.nio.file.Files;
|
||||
import java.io.IOException;
|
||||
|
||||
@RestController
|
||||
@RequestMapping("/api/config")
|
||||
public class ConfigController {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(ConfigController.class);
|
||||
|
||||
private final HtmlSanitizationService htmlSanitizationService;
|
||||
private final ImageService imageService;
|
||||
private final StoryService storyService;
|
||||
|
||||
@Value("${app.reading.speed.default:200}")
|
||||
private int defaultReadingSpeed;
|
||||
|
||||
@Autowired
|
||||
public ConfigController(HtmlSanitizationService htmlSanitizationService) {
|
||||
public ConfigController(HtmlSanitizationService htmlSanitizationService, ImageService imageService, StoryService storyService) {
|
||||
this.htmlSanitizationService = htmlSanitizationService;
|
||||
this.imageService = imageService;
|
||||
this.storyService = storyService;
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -51,4 +70,177 @@ public class ConfigController {
|
||||
public ResponseEntity<Map<String, Integer>> getReadingSpeed() {
|
||||
return ResponseEntity.ok(Map.of("wordsPerMinute", defaultReadingSpeed));
|
||||
}
|
||||
|
||||
/**
|
||||
* Preview orphaned content images cleanup (dry run)
|
||||
*/
|
||||
@PostMapping("/cleanup/images/preview")
|
||||
public ResponseEntity<Map<String, Object>> previewImageCleanup() {
|
||||
try {
|
||||
logger.info("Starting image cleanup preview");
|
||||
ImageService.ContentImageCleanupResult result = imageService.cleanupOrphanedContentImages(true);
|
||||
|
||||
// Create detailed file information with story relationships
|
||||
logger.info("Processing {} orphaned files for detailed information", result.getOrphanedImages().size());
|
||||
List<Map<String, Object>> orphanedFiles = result.getOrphanedImages().stream()
|
||||
.map(filePath -> {
|
||||
try {
|
||||
return createFileInfo(filePath);
|
||||
} catch (Exception e) {
|
||||
logger.error("Error processing file {}: {}", filePath, e.getMessage());
|
||||
// Return a basic error entry instead of failing completely
|
||||
Map<String, Object> errorEntry = new HashMap<>();
|
||||
errorEntry.put("filePath", filePath);
|
||||
errorEntry.put("fileName", Paths.get(filePath).getFileName().toString());
|
||||
errorEntry.put("fileSize", 0L);
|
||||
errorEntry.put("formattedSize", "0 B");
|
||||
errorEntry.put("storyId", "error");
|
||||
errorEntry.put("storyTitle", null);
|
||||
errorEntry.put("storyExists", false);
|
||||
errorEntry.put("canAccessStory", false);
|
||||
errorEntry.put("error", e.getMessage());
|
||||
return errorEntry;
|
||||
}
|
||||
})
|
||||
.toList();
|
||||
|
||||
// Use HashMap to avoid Map.of() null value issues
|
||||
Map<String, Object> response = new HashMap<>();
|
||||
response.put("success", true);
|
||||
response.put("orphanedCount", result.getOrphanedImages().size());
|
||||
response.put("totalSizeBytes", result.getTotalSizeBytes());
|
||||
response.put("formattedSize", result.getFormattedSize());
|
||||
response.put("foldersToDelete", result.getFoldersToDelete());
|
||||
response.put("referencedImagesCount", result.getTotalReferencedImages());
|
||||
response.put("errors", result.getErrors());
|
||||
response.put("hasErrors", result.hasErrors());
|
||||
response.put("dryRun", true);
|
||||
response.put("orphanedFiles", orphanedFiles);
|
||||
|
||||
logger.info("Image cleanup preview completed successfully");
|
||||
return ResponseEntity.ok(response);
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to preview image cleanup", e);
|
||||
Map<String, Object> errorResponse = new HashMap<>();
|
||||
errorResponse.put("success", false);
|
||||
errorResponse.put("error", "Failed to preview image cleanup: " + (e.getMessage() != null ? e.getMessage() : e.getClass().getSimpleName()));
|
||||
return ResponseEntity.status(500).body(errorResponse);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Execute orphaned content images cleanup
|
||||
*/
|
||||
@PostMapping("/cleanup/images/execute")
|
||||
public ResponseEntity<Map<String, Object>> executeImageCleanup() {
|
||||
try {
|
||||
ImageService.ContentImageCleanupResult result = imageService.cleanupOrphanedContentImages(false);
|
||||
|
||||
Map<String, Object> response = Map.of(
|
||||
"success", true,
|
||||
"deletedCount", result.getOrphanedImages().size(),
|
||||
"totalSizeBytes", result.getTotalSizeBytes(),
|
||||
"formattedSize", result.getFormattedSize(),
|
||||
"foldersDeleted", result.getFoldersToDelete(),
|
||||
"referencedImagesCount", result.getTotalReferencedImages(),
|
||||
"errors", result.getErrors(),
|
||||
"hasErrors", result.hasErrors(),
|
||||
"dryRun", false
|
||||
);
|
||||
|
||||
return ResponseEntity.ok(response);
|
||||
|
||||
} catch (Exception e) {
|
||||
return ResponseEntity.status(500).body(Map.of(
|
||||
"success", false,
|
||||
"error", "Failed to execute image cleanup: " + e.getMessage()
|
||||
));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Create detailed file information for orphaned image including story relationship
|
||||
*/
|
||||
private Map<String, Object> createFileInfo(String filePath) {
|
||||
try {
|
||||
Path path = Paths.get(filePath);
|
||||
String fileName = path.getFileName().toString();
|
||||
long fileSize = Files.exists(path) ? Files.size(path) : 0;
|
||||
|
||||
// Extract story UUID from the path (content images are stored in /content/{storyId}/)
|
||||
String storyId = extractStoryIdFromPath(filePath);
|
||||
|
||||
// Look up the story if we have a valid UUID
|
||||
Story relatedStory = null;
|
||||
if (storyId != null) {
|
||||
try {
|
||||
UUID storyUuid = UUID.fromString(storyId);
|
||||
relatedStory = storyService.findById(storyUuid);
|
||||
} catch (Exception e) {
|
||||
logger.debug("Could not find story with ID {}: {}", storyId, e.getMessage());
|
||||
}
|
||||
}
|
||||
|
||||
Map<String, Object> fileInfo = new HashMap<>();
|
||||
fileInfo.put("filePath", filePath);
|
||||
fileInfo.put("fileName", fileName);
|
||||
fileInfo.put("fileSize", fileSize);
|
||||
fileInfo.put("formattedSize", formatBytes(fileSize));
|
||||
fileInfo.put("storyId", storyId != null ? storyId : "unknown");
|
||||
fileInfo.put("storyTitle", relatedStory != null ? relatedStory.getTitle() : null);
|
||||
fileInfo.put("storyExists", relatedStory != null);
|
||||
fileInfo.put("canAccessStory", relatedStory != null);
|
||||
|
||||
return fileInfo;
|
||||
} catch (Exception e) {
|
||||
logger.error("Error creating file info for {}: {}", filePath, e.getMessage());
|
||||
Map<String, Object> errorInfo = new HashMap<>();
|
||||
errorInfo.put("filePath", filePath);
|
||||
errorInfo.put("fileName", Paths.get(filePath).getFileName().toString());
|
||||
errorInfo.put("fileSize", 0L);
|
||||
errorInfo.put("formattedSize", "0 B");
|
||||
errorInfo.put("storyId", "error");
|
||||
errorInfo.put("storyTitle", null);
|
||||
errorInfo.put("storyExists", false);
|
||||
errorInfo.put("canAccessStory", false);
|
||||
errorInfo.put("error", e.getMessage() != null ? e.getMessage() : e.getClass().getSimpleName());
|
||||
return errorInfo;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Extract story ID from content image file path
|
||||
*/
|
||||
private String extractStoryIdFromPath(String filePath) {
|
||||
try {
|
||||
// Content images are stored in: /path/to/uploads/content/{storyId}/filename.ext
|
||||
Path path = Paths.get(filePath);
|
||||
Path parent = path.getParent();
|
||||
if (parent != null) {
|
||||
String potentialUuid = parent.getFileName().toString();
|
||||
// Basic UUID validation (36 characters with dashes in right places)
|
||||
if (potentialUuid.length() == 36 &&
|
||||
potentialUuid.charAt(8) == '-' &&
|
||||
potentialUuid.charAt(13) == '-' &&
|
||||
potentialUuid.charAt(18) == '-' &&
|
||||
potentialUuid.charAt(23) == '-') {
|
||||
return potentialUuid;
|
||||
}
|
||||
}
|
||||
} catch (Exception e) {
|
||||
// Invalid path or other error
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Format file size in human readable format
|
||||
*/
|
||||
private String formatBytes(long bytes) {
|
||||
if (bytes < 1024) return bytes + " B";
|
||||
if (bytes < 1024 * 1024) return String.format("%.1f KB", bytes / 1024.0);
|
||||
if (bytes < 1024 * 1024 * 1024) return String.format("%.1f MB", bytes / (1024.0 * 1024.0));
|
||||
return String.format("%.1f GB", bytes / (1024.0 * 1024.0 * 1024.0));
|
||||
}
|
||||
}
|
||||
@@ -1,6 +1,8 @@
|
||||
package com.storycove.controller;
|
||||
|
||||
import com.storycove.service.AsyncBackupService;
|
||||
import com.storycove.service.DatabaseManagementService;
|
||||
import com.storycove.service.LibraryService;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.core.io.Resource;
|
||||
import org.springframework.http.HttpHeaders;
|
||||
@@ -12,6 +14,7 @@ import org.springframework.web.multipart.MultipartFile;
|
||||
import java.io.IOException;
|
||||
import java.time.LocalDateTime;
|
||||
import java.time.format.DateTimeFormatter;
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
|
||||
@RestController
|
||||
@@ -21,6 +24,12 @@ public class DatabaseController {
|
||||
@Autowired
|
||||
private DatabaseManagementService databaseManagementService;
|
||||
|
||||
@Autowired
|
||||
private AsyncBackupService asyncBackupService;
|
||||
|
||||
@Autowired
|
||||
private LibraryService libraryService;
|
||||
|
||||
@PostMapping("/backup")
|
||||
public ResponseEntity<Resource> backupDatabase() {
|
||||
try {
|
||||
@@ -83,19 +92,141 @@ public class DatabaseController {
|
||||
}
|
||||
|
||||
@PostMapping("/backup-complete")
|
||||
public ResponseEntity<Resource> backupComplete() {
|
||||
public ResponseEntity<Map<String, Object>> backupCompleteAsync() {
|
||||
try {
|
||||
Resource backup = databaseManagementService.createCompleteBackup();
|
||||
String libraryId = libraryService.getCurrentLibraryId();
|
||||
if (libraryId == null) {
|
||||
return ResponseEntity.badRequest()
|
||||
.body(Map.of("success", false, "message", "No library selected"));
|
||||
}
|
||||
|
||||
String timestamp = LocalDateTime.now().format(DateTimeFormatter.ofPattern("yyyy-MM-dd_HH-mm-ss"));
|
||||
String filename = "storycove_complete_backup_" + timestamp + ".zip";
|
||||
// Start backup job asynchronously
|
||||
com.storycove.entity.BackupJob job = asyncBackupService.startBackupJob(
|
||||
libraryId,
|
||||
com.storycove.entity.BackupJob.BackupType.COMPLETE
|
||||
);
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Backup started",
|
||||
"jobId", job.getId().toString(),
|
||||
"status", job.getStatus().toString()
|
||||
));
|
||||
} catch (Exception e) {
|
||||
return ResponseEntity.internalServerError()
|
||||
.body(Map.of("success", false, "message", "Failed to start backup: " + e.getMessage()));
|
||||
}
|
||||
}
|
||||
|
||||
@GetMapping("/backup-status/{jobId}")
|
||||
public ResponseEntity<Map<String, Object>> getBackupStatus(@PathVariable String jobId) {
|
||||
try {
|
||||
java.util.UUID uuid = java.util.UUID.fromString(jobId);
|
||||
java.util.Optional<com.storycove.entity.BackupJob> jobOpt = asyncBackupService.getJobStatus(uuid);
|
||||
|
||||
if (jobOpt.isEmpty()) {
|
||||
return ResponseEntity.notFound().build();
|
||||
}
|
||||
|
||||
com.storycove.entity.BackupJob job = jobOpt.get();
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"jobId", job.getId().toString(),
|
||||
"status", job.getStatus().toString(),
|
||||
"progress", job.getProgressPercent(),
|
||||
"fileSizeBytes", job.getFileSizeBytes() != null ? job.getFileSizeBytes() : 0,
|
||||
"createdAt", job.getCreatedAt().toString(),
|
||||
"completedAt", job.getCompletedAt() != null ? job.getCompletedAt().toString() : "",
|
||||
"errorMessage", job.getErrorMessage() != null ? job.getErrorMessage() : ""
|
||||
));
|
||||
} catch (IllegalArgumentException e) {
|
||||
return ResponseEntity.badRequest()
|
||||
.body(Map.of("success", false, "message", "Invalid job ID"));
|
||||
}
|
||||
}
|
||||
|
||||
@GetMapping("/backup-download/{jobId}")
|
||||
public ResponseEntity<Resource> downloadBackup(@PathVariable String jobId) {
|
||||
try {
|
||||
java.util.UUID uuid = java.util.UUID.fromString(jobId);
|
||||
Resource backup = asyncBackupService.getBackupFile(uuid);
|
||||
|
||||
java.util.Optional<com.storycove.entity.BackupJob> jobOpt = asyncBackupService.getJobStatus(uuid);
|
||||
if (jobOpt.isEmpty()) {
|
||||
return ResponseEntity.notFound().build();
|
||||
}
|
||||
|
||||
com.storycove.entity.BackupJob job = jobOpt.get();
|
||||
String timestamp = job.getCreatedAt().format(DateTimeFormatter.ofPattern("yyyy-MM-dd_HH-mm-ss"));
|
||||
String extension = job.getType() == com.storycove.entity.BackupJob.BackupType.COMPLETE ? "zip" : "sql";
|
||||
String filename = "storycove_backup_" + timestamp + "." + extension;
|
||||
|
||||
return ResponseEntity.ok()
|
||||
.header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"" + filename + "\"")
|
||||
.header(HttpHeaders.CONTENT_TYPE, "application/zip")
|
||||
.header(HttpHeaders.CONTENT_TYPE,
|
||||
job.getType() == com.storycove.entity.BackupJob.BackupType.COMPLETE
|
||||
? "application/zip"
|
||||
: "application/sql")
|
||||
.body(backup);
|
||||
} catch (IllegalArgumentException e) {
|
||||
return ResponseEntity.badRequest().build();
|
||||
} catch (Exception e) {
|
||||
throw new RuntimeException("Failed to create complete backup: " + e.getMessage(), e);
|
||||
throw new RuntimeException("Failed to download backup: " + e.getMessage(), e);
|
||||
}
|
||||
}
|
||||
|
||||
@GetMapping("/backup-list")
|
||||
public ResponseEntity<Map<String, Object>> listBackups() {
|
||||
try {
|
||||
String libraryId = libraryService.getCurrentLibraryId();
|
||||
if (libraryId == null) {
|
||||
return ResponseEntity.badRequest()
|
||||
.body(Map.of("success", false, "message", "No library selected"));
|
||||
}
|
||||
|
||||
List<com.storycove.entity.BackupJob> jobs = asyncBackupService.listBackupJobs(libraryId);
|
||||
|
||||
List<Map<String, Object>> jobsList = jobs.stream()
|
||||
.map(job -> {
|
||||
Map<String, Object> jobMap = new java.util.HashMap<>();
|
||||
jobMap.put("jobId", job.getId().toString());
|
||||
jobMap.put("type", job.getType().toString());
|
||||
jobMap.put("status", job.getStatus().toString());
|
||||
jobMap.put("progress", job.getProgressPercent());
|
||||
jobMap.put("fileSizeBytes", job.getFileSizeBytes() != null ? job.getFileSizeBytes() : 0L);
|
||||
jobMap.put("createdAt", job.getCreatedAt().toString());
|
||||
jobMap.put("completedAt", job.getCompletedAt() != null ? job.getCompletedAt().toString() : "");
|
||||
return jobMap;
|
||||
})
|
||||
.collect(java.util.stream.Collectors.toList());
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"backups", jobsList
|
||||
));
|
||||
} catch (Exception e) {
|
||||
return ResponseEntity.internalServerError()
|
||||
.body(Map.of("success", false, "message", "Failed to list backups: " + e.getMessage()));
|
||||
}
|
||||
}
|
||||
|
||||
@DeleteMapping("/backup/{jobId}")
|
||||
public ResponseEntity<Map<String, Object>> deleteBackup(@PathVariable String jobId) {
|
||||
try {
|
||||
java.util.UUID uuid = java.util.UUID.fromString(jobId);
|
||||
asyncBackupService.deleteBackupJob(uuid);
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Backup deleted successfully"
|
||||
));
|
||||
} catch (IllegalArgumentException e) {
|
||||
return ResponseEntity.badRequest()
|
||||
.body(Map.of("success", false, "message", "Invalid job ID"));
|
||||
} catch (Exception e) {
|
||||
return ResponseEntity.internalServerError()
|
||||
.body(Map.of("success", false, "message", "Failed to delete backup: " + e.getMessage()));
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
@@ -2,6 +2,8 @@ package com.storycove.controller;
|
||||
|
||||
import com.storycove.service.ImageService;
|
||||
import com.storycove.service.LibraryService;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.core.io.FileSystemResource;
|
||||
import org.springframework.core.io.Resource;
|
||||
import org.springframework.http.HttpHeaders;
|
||||
@@ -21,6 +23,7 @@ import java.util.Map;
|
||||
@RestController
|
||||
@RequestMapping("/api/files")
|
||||
public class FileController {
|
||||
private static final Logger log = LoggerFactory.getLogger(FileController.class);
|
||||
|
||||
private final ImageService imageService;
|
||||
private final LibraryService libraryService;
|
||||
@@ -32,7 +35,7 @@ public class FileController {
|
||||
|
||||
private String getCurrentLibraryId() {
|
||||
String libraryId = libraryService.getCurrentLibraryId();
|
||||
System.out.println("FileController - Current Library ID: " + libraryId);
|
||||
log.debug("FileController - Current Library ID: {}", libraryId);
|
||||
return libraryId != null ? libraryId : "default";
|
||||
}
|
||||
|
||||
@@ -48,7 +51,7 @@ public class FileController {
|
||||
String imageUrl = "/api/files/images/" + currentLibraryId + "/" + imagePath;
|
||||
response.put("url", imageUrl);
|
||||
|
||||
System.out.println("Upload response - path: " + imagePath + ", url: " + imageUrl);
|
||||
log.debug("Upload response - path: {}, url: {}", imagePath, imageUrl);
|
||||
|
||||
return ResponseEntity.ok(response);
|
||||
} catch (IllegalArgumentException e) {
|
||||
|
||||
@@ -0,0 +1,57 @@
|
||||
package com.storycove.controller;
|
||||
|
||||
import com.storycove.dto.LibraryOverviewStatsDto;
|
||||
import com.storycove.service.LibraryService;
|
||||
import com.storycove.service.LibraryStatisticsService;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.http.ResponseEntity;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
|
||||
@RestController
|
||||
@RequestMapping("/api/libraries/{libraryId}/statistics")
|
||||
public class LibraryStatisticsController {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(LibraryStatisticsController.class);
|
||||
|
||||
@Autowired
|
||||
private LibraryStatisticsService statisticsService;
|
||||
|
||||
@Autowired
|
||||
private LibraryService libraryService;
|
||||
|
||||
/**
|
||||
* Get overview statistics for a library
|
||||
*/
|
||||
@GetMapping("/overview")
|
||||
public ResponseEntity<?> getOverviewStatistics(@PathVariable String libraryId) {
|
||||
try {
|
||||
// Verify library exists
|
||||
if (libraryService.getLibraryById(libraryId) == null) {
|
||||
return ResponseEntity.notFound().build();
|
||||
}
|
||||
|
||||
LibraryOverviewStatsDto stats = statisticsService.getOverviewStatistics(libraryId);
|
||||
return ResponseEntity.ok(stats);
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to get overview statistics for library: {}", libraryId, e);
|
||||
return ResponseEntity.internalServerError()
|
||||
.body(new ErrorResponse("Failed to retrieve statistics: " + e.getMessage()));
|
||||
}
|
||||
}
|
||||
|
||||
// Error response DTO
|
||||
private static class ErrorResponse {
|
||||
private String error;
|
||||
|
||||
public ErrorResponse(String error) {
|
||||
this.error = error;
|
||||
}
|
||||
|
||||
public String getError() {
|
||||
return error;
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -2,7 +2,7 @@ package com.storycove.controller;
|
||||
|
||||
import com.storycove.entity.Story;
|
||||
import com.storycove.service.StoryService;
|
||||
import com.storycove.service.TypesenseService;
|
||||
import com.storycove.service.SearchServiceAdapter;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.http.ResponseEntity;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
@@ -14,25 +14,19 @@ import java.util.Map;
|
||||
@RequestMapping("/api/search")
|
||||
public class SearchController {
|
||||
|
||||
private final TypesenseService typesenseService;
|
||||
private final SearchServiceAdapter searchServiceAdapter;
|
||||
private final StoryService storyService;
|
||||
|
||||
public SearchController(@Autowired(required = false) TypesenseService typesenseService, StoryService storyService) {
|
||||
this.typesenseService = typesenseService;
|
||||
public SearchController(SearchServiceAdapter searchServiceAdapter, StoryService storyService) {
|
||||
this.searchServiceAdapter = searchServiceAdapter;
|
||||
this.storyService = storyService;
|
||||
}
|
||||
|
||||
@PostMapping("/reindex")
|
||||
public ResponseEntity<?> reindexAllStories() {
|
||||
if (typesenseService == null) {
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
"error", "Typesense service is not available"
|
||||
));
|
||||
}
|
||||
|
||||
try {
|
||||
List<Story> allStories = storyService.findAll();
|
||||
typesenseService.reindexAllStories(allStories);
|
||||
searchServiceAdapter.bulkIndexStories(allStories);
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"message", "Successfully reindexed all stories",
|
||||
@@ -47,17 +41,8 @@ public class SearchController {
|
||||
|
||||
@GetMapping("/health")
|
||||
public ResponseEntity<?> searchHealthCheck() {
|
||||
if (typesenseService == null) {
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"status", "disabled",
|
||||
"message", "Typesense service is disabled"
|
||||
));
|
||||
}
|
||||
|
||||
try {
|
||||
// Try a simple search to test connectivity
|
||||
typesenseService.searchSuggestions("test", 1);
|
||||
|
||||
// Search service is operational if it's injected
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"status", "healthy",
|
||||
"message", "Search service is operational"
|
||||
|
||||
@@ -12,9 +12,7 @@ import com.storycove.service.*;
|
||||
import jakarta.validation.Valid;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.data.domain.Page;
|
||||
import org.springframework.data.domain.PageImpl;
|
||||
import org.springframework.data.domain.PageRequest;
|
||||
import org.springframework.data.domain.Pageable;
|
||||
import org.springframework.data.domain.Sort;
|
||||
@@ -41,11 +39,13 @@ public class StoryController {
|
||||
private final SeriesService seriesService;
|
||||
private final HtmlSanitizationService sanitizationService;
|
||||
private final ImageService imageService;
|
||||
private final TypesenseService typesenseService;
|
||||
private final SearchServiceAdapter searchServiceAdapter;
|
||||
private final CollectionService collectionService;
|
||||
private final ReadingTimeService readingTimeService;
|
||||
private final EPUBImportService epubImportService;
|
||||
private final EPUBExportService epubExportService;
|
||||
private final AsyncImageProcessingService asyncImageProcessingService;
|
||||
private final ImageProcessingProgressService progressService;
|
||||
|
||||
public StoryController(StoryService storyService,
|
||||
AuthorService authorService,
|
||||
@@ -53,20 +53,24 @@ public class StoryController {
|
||||
HtmlSanitizationService sanitizationService,
|
||||
ImageService imageService,
|
||||
CollectionService collectionService,
|
||||
@Autowired(required = false) TypesenseService typesenseService,
|
||||
SearchServiceAdapter searchServiceAdapter,
|
||||
ReadingTimeService readingTimeService,
|
||||
EPUBImportService epubImportService,
|
||||
EPUBExportService epubExportService) {
|
||||
EPUBExportService epubExportService,
|
||||
AsyncImageProcessingService asyncImageProcessingService,
|
||||
ImageProcessingProgressService progressService) {
|
||||
this.storyService = storyService;
|
||||
this.authorService = authorService;
|
||||
this.seriesService = seriesService;
|
||||
this.sanitizationService = sanitizationService;
|
||||
this.imageService = imageService;
|
||||
this.collectionService = collectionService;
|
||||
this.typesenseService = typesenseService;
|
||||
this.searchServiceAdapter = searchServiceAdapter;
|
||||
this.readingTimeService = readingTimeService;
|
||||
this.epubImportService = epubImportService;
|
||||
this.epubExportService = epubExportService;
|
||||
this.asyncImageProcessingService = asyncImageProcessingService;
|
||||
this.progressService = progressService;
|
||||
}
|
||||
|
||||
@GetMapping
|
||||
@@ -146,6 +150,10 @@ public class StoryController {
|
||||
updateStoryFromRequest(story, request);
|
||||
|
||||
Story savedStory = storyService.createWithTagNames(story, request.getTagNames());
|
||||
|
||||
// Process external images in content after saving
|
||||
savedStory = processExternalImagesIfNeeded(savedStory);
|
||||
|
||||
logger.info("Successfully created story: {} (ID: {})", savedStory.getTitle(), savedStory.getId());
|
||||
return ResponseEntity.status(HttpStatus.CREATED).body(convertToDto(savedStory));
|
||||
}
|
||||
@@ -163,6 +171,10 @@ public class StoryController {
|
||||
}
|
||||
|
||||
Story updatedStory = storyService.updateWithTagNames(id, request);
|
||||
|
||||
// Process external images in content after saving
|
||||
updatedStory = processExternalImagesIfNeeded(updatedStory);
|
||||
|
||||
logger.info("Successfully updated story: {}", updatedStory.getTitle());
|
||||
return ResponseEntity.ok(convertToDto(updatedStory));
|
||||
}
|
||||
@@ -263,13 +275,10 @@ public class StoryController {
|
||||
|
||||
@PostMapping("/reindex")
|
||||
public ResponseEntity<String> manualReindex() {
|
||||
if (typesenseService == null) {
|
||||
return ResponseEntity.ok("Typesense is not enabled, no reindexing performed");
|
||||
}
|
||||
|
||||
try {
|
||||
List<Story> allStories = storyService.findAllWithAssociations();
|
||||
typesenseService.reindexAllStories(allStories);
|
||||
searchServiceAdapter.bulkIndexStories(allStories);
|
||||
return ResponseEntity.ok("Successfully reindexed " + allStories.size() + " stories");
|
||||
} catch (Exception e) {
|
||||
return ResponseEntity.status(500).body("Failed to reindex stories: " + e.getMessage());
|
||||
@@ -280,7 +289,7 @@ public class StoryController {
|
||||
public ResponseEntity<Map<String, Object>> reindexStoriesTypesense() {
|
||||
try {
|
||||
List<Story> allStories = storyService.findAllWithAssociations();
|
||||
typesenseService.reindexAllStories(allStories);
|
||||
searchServiceAdapter.bulkIndexStories(allStories);
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Reindexed " + allStories.size() + " stories",
|
||||
@@ -300,7 +309,7 @@ public class StoryController {
|
||||
try {
|
||||
// This will delete the existing collection and recreate it with correct schema
|
||||
List<Story> allStories = storyService.findAllWithAssociations();
|
||||
typesenseService.reindexAllStories(allStories);
|
||||
searchServiceAdapter.bulkIndexStories(allStories);
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Recreated stories collection and indexed " + allStories.size() + " stories",
|
||||
@@ -326,7 +335,7 @@ public class StoryController {
|
||||
@RequestParam(required = false) Integer maxRating,
|
||||
@RequestParam(required = false) String sortBy,
|
||||
@RequestParam(required = false) String sortDir,
|
||||
@RequestParam(required = false) String facetBy,
|
||||
@RequestParam(required = false) List<String> facetBy,
|
||||
// Advanced filters
|
||||
@RequestParam(required = false) Integer minWordCount,
|
||||
@RequestParam(required = false) Integer maxWordCount,
|
||||
@@ -345,16 +354,35 @@ public class StoryController {
|
||||
@RequestParam(required = false) Boolean hiddenGemsOnly) {
|
||||
|
||||
|
||||
if (typesenseService != null) {
|
||||
SearchResultDto<StorySearchDto> results = typesenseService.searchStories(
|
||||
query, page, size, authors, tags, minRating, maxRating, sortBy, sortDir, facetBy,
|
||||
minWordCount, maxWordCount, createdAfter, createdBefore, lastReadAfter, lastReadBefore,
|
||||
unratedOnly, readingStatus, hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter,
|
||||
minTagCount, popularOnly, hiddenGemsOnly);
|
||||
// Use SearchServiceAdapter to handle routing between search engines
|
||||
try {
|
||||
// Convert authors list to single author string (for now, use first author)
|
||||
String authorFilter = (authors != null && !authors.isEmpty()) ? authors.get(0) : null;
|
||||
|
||||
// DEBUG: Log all received parameters
|
||||
logger.info("CONTROLLER DEBUG - Received parameters:");
|
||||
logger.info(" readingStatus: '{}'", readingStatus);
|
||||
logger.info(" seriesFilter: '{}'", seriesFilter);
|
||||
logger.info(" hasReadingProgress: {}", hasReadingProgress);
|
||||
logger.info(" hasCoverImage: {}", hasCoverImage);
|
||||
logger.info(" createdAfter: '{}'", createdAfter);
|
||||
logger.info(" lastReadAfter: '{}'", lastReadAfter);
|
||||
logger.info(" unratedOnly: {}", unratedOnly);
|
||||
|
||||
SearchResultDto<StorySearchDto> results = searchServiceAdapter.searchStories(
|
||||
query, tags, authorFilter, seriesFilter, minWordCount, maxWordCount,
|
||||
minRating != null ? minRating.floatValue() : null,
|
||||
null, // isRead - now handled by readingStatus advanced filter
|
||||
null, // isFavorite - now handled by readingStatus advanced filter
|
||||
sortBy, sortDir, page, size, facetBy,
|
||||
// Advanced filters
|
||||
createdAfter, createdBefore, lastReadAfter, lastReadBefore,
|
||||
unratedOnly, readingStatus, hasReadingProgress, hasCoverImage,
|
||||
sourceDomain, seriesFilter, minTagCount, popularOnly, hiddenGemsOnly);
|
||||
return ResponseEntity.ok(results);
|
||||
} else {
|
||||
// Fallback to basic search if Typesense is not available
|
||||
return ResponseEntity.badRequest().body(null);
|
||||
} catch (Exception e) {
|
||||
logger.error("Search failed", e);
|
||||
return ResponseEntity.internalServerError().body(null);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -363,10 +391,12 @@ public class StoryController {
|
||||
@RequestParam String query,
|
||||
@RequestParam(defaultValue = "5") int limit) {
|
||||
|
||||
if (typesenseService != null) {
|
||||
List<String> suggestions = typesenseService.searchSuggestions(query, limit);
|
||||
// Use SearchServiceAdapter to handle routing between search engines
|
||||
try {
|
||||
List<String> suggestions = searchServiceAdapter.getTagSuggestions(query, limit);
|
||||
return ResponseEntity.ok(suggestions);
|
||||
} else {
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to get search suggestions", e);
|
||||
return ResponseEntity.ok(new ArrayList<>());
|
||||
}
|
||||
}
|
||||
@@ -456,7 +486,9 @@ public class StoryController {
|
||||
story.setTitle(createReq.getTitle());
|
||||
story.setSummary(createReq.getSummary());
|
||||
story.setDescription(createReq.getDescription());
|
||||
|
||||
story.setContentHtml(sanitizationService.sanitize(createReq.getContentHtml()));
|
||||
|
||||
story.setSourceUrl(createReq.getSourceUrl());
|
||||
story.setVolume(createReq.getVolume());
|
||||
|
||||
@@ -689,6 +721,50 @@ public class StoryController {
|
||||
return dto;
|
||||
}
|
||||
|
||||
private Story processExternalImagesIfNeeded(Story story) {
|
||||
try {
|
||||
if (story.getContentHtml() != null && !story.getContentHtml().trim().isEmpty()) {
|
||||
logger.debug("Starting async image processing for story: {}", story.getId());
|
||||
|
||||
// Start async processing - this returns immediately
|
||||
asyncImageProcessingService.processStoryImagesAsync(story.getId(), story.getContentHtml());
|
||||
|
||||
logger.info("Async image processing started for story: {}", story.getId());
|
||||
}
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to start async image processing for story {}: {}",
|
||||
story.getId(), e.getMessage(), e);
|
||||
// Don't fail the entire operation if image processing fails
|
||||
}
|
||||
|
||||
return story;
|
||||
}
|
||||
|
||||
@GetMapping("/{id}/image-processing-progress")
|
||||
public ResponseEntity<Map<String, Object>> getImageProcessingProgress(@PathVariable UUID id) {
|
||||
ImageProcessingProgressService.ImageProcessingProgress progress = progressService.getProgress(id);
|
||||
|
||||
if (progress == null) {
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"isProcessing", false,
|
||||
"message", "No active image processing"
|
||||
));
|
||||
}
|
||||
|
||||
Map<String, Object> response = Map.of(
|
||||
"isProcessing", !progress.isCompleted(),
|
||||
"totalImages", progress.getTotalImages(),
|
||||
"processedImages", progress.getProcessedImages(),
|
||||
"currentImageUrl", progress.getCurrentImageUrl() != null ? progress.getCurrentImageUrl() : "",
|
||||
"status", progress.getStatus(),
|
||||
"progressPercentage", progress.getProgressPercentage(),
|
||||
"completed", progress.isCompleted(),
|
||||
"error", progress.getErrorMessage() != null ? progress.getErrorMessage() : ""
|
||||
);
|
||||
|
||||
return ResponseEntity.ok(response);
|
||||
}
|
||||
|
||||
@GetMapping("/check-duplicate")
|
||||
public ResponseEntity<Map<String, Object>> checkDuplicate(
|
||||
@RequestParam String title,
|
||||
|
||||
@@ -0,0 +1,183 @@
|
||||
package com.storycove.dto;
|
||||
|
||||
public class LibraryOverviewStatsDto {
|
||||
|
||||
// Collection Overview
|
||||
private long totalStories;
|
||||
private long totalAuthors;
|
||||
private long totalSeries;
|
||||
private long totalTags;
|
||||
private long totalCollections;
|
||||
private long uniqueSourceDomains;
|
||||
|
||||
// Content Metrics
|
||||
private long totalWordCount;
|
||||
private double averageWordsPerStory;
|
||||
private StoryWordCountDto longestStory;
|
||||
private StoryWordCountDto shortestStory;
|
||||
|
||||
// Reading Time (based on 250 words/minute)
|
||||
private long totalReadingTimeMinutes;
|
||||
private double averageReadingTimeMinutes;
|
||||
|
||||
// Constructor
|
||||
public LibraryOverviewStatsDto() {
|
||||
}
|
||||
|
||||
// Getters and Setters
|
||||
public long getTotalStories() {
|
||||
return totalStories;
|
||||
}
|
||||
|
||||
public void setTotalStories(long totalStories) {
|
||||
this.totalStories = totalStories;
|
||||
}
|
||||
|
||||
public long getTotalAuthors() {
|
||||
return totalAuthors;
|
||||
}
|
||||
|
||||
public void setTotalAuthors(long totalAuthors) {
|
||||
this.totalAuthors = totalAuthors;
|
||||
}
|
||||
|
||||
public long getTotalSeries() {
|
||||
return totalSeries;
|
||||
}
|
||||
|
||||
public void setTotalSeries(long totalSeries) {
|
||||
this.totalSeries = totalSeries;
|
||||
}
|
||||
|
||||
public long getTotalTags() {
|
||||
return totalTags;
|
||||
}
|
||||
|
||||
public void setTotalTags(long totalTags) {
|
||||
this.totalTags = totalTags;
|
||||
}
|
||||
|
||||
public long getTotalCollections() {
|
||||
return totalCollections;
|
||||
}
|
||||
|
||||
public void setTotalCollections(long totalCollections) {
|
||||
this.totalCollections = totalCollections;
|
||||
}
|
||||
|
||||
public long getUniqueSourceDomains() {
|
||||
return uniqueSourceDomains;
|
||||
}
|
||||
|
||||
public void setUniqueSourceDomains(long uniqueSourceDomains) {
|
||||
this.uniqueSourceDomains = uniqueSourceDomains;
|
||||
}
|
||||
|
||||
public long getTotalWordCount() {
|
||||
return totalWordCount;
|
||||
}
|
||||
|
||||
public void setTotalWordCount(long totalWordCount) {
|
||||
this.totalWordCount = totalWordCount;
|
||||
}
|
||||
|
||||
public double getAverageWordsPerStory() {
|
||||
return averageWordsPerStory;
|
||||
}
|
||||
|
||||
public void setAverageWordsPerStory(double averageWordsPerStory) {
|
||||
this.averageWordsPerStory = averageWordsPerStory;
|
||||
}
|
||||
|
||||
public StoryWordCountDto getLongestStory() {
|
||||
return longestStory;
|
||||
}
|
||||
|
||||
public void setLongestStory(StoryWordCountDto longestStory) {
|
||||
this.longestStory = longestStory;
|
||||
}
|
||||
|
||||
public StoryWordCountDto getShortestStory() {
|
||||
return shortestStory;
|
||||
}
|
||||
|
||||
public void setShortestStory(StoryWordCountDto shortestStory) {
|
||||
this.shortestStory = shortestStory;
|
||||
}
|
||||
|
||||
public long getTotalReadingTimeMinutes() {
|
||||
return totalReadingTimeMinutes;
|
||||
}
|
||||
|
||||
public void setTotalReadingTimeMinutes(long totalReadingTimeMinutes) {
|
||||
this.totalReadingTimeMinutes = totalReadingTimeMinutes;
|
||||
}
|
||||
|
||||
public double getAverageReadingTimeMinutes() {
|
||||
return averageReadingTimeMinutes;
|
||||
}
|
||||
|
||||
public void setAverageReadingTimeMinutes(double averageReadingTimeMinutes) {
|
||||
this.averageReadingTimeMinutes = averageReadingTimeMinutes;
|
||||
}
|
||||
|
||||
// Nested DTO for story word count info
|
||||
public static class StoryWordCountDto {
|
||||
private String id;
|
||||
private String title;
|
||||
private String authorName;
|
||||
private int wordCount;
|
||||
private long readingTimeMinutes;
|
||||
|
||||
public StoryWordCountDto() {
|
||||
}
|
||||
|
||||
public StoryWordCountDto(String id, String title, String authorName, int wordCount, long readingTimeMinutes) {
|
||||
this.id = id;
|
||||
this.title = title;
|
||||
this.authorName = authorName;
|
||||
this.wordCount = wordCount;
|
||||
this.readingTimeMinutes = readingTimeMinutes;
|
||||
}
|
||||
|
||||
public String getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public void setId(String id) {
|
||||
this.id = id;
|
||||
}
|
||||
|
||||
public String getTitle() {
|
||||
return title;
|
||||
}
|
||||
|
||||
public void setTitle(String title) {
|
||||
this.title = title;
|
||||
}
|
||||
|
||||
public String getAuthorName() {
|
||||
return authorName;
|
||||
}
|
||||
|
||||
public void setAuthorName(String authorName) {
|
||||
this.authorName = authorName;
|
||||
}
|
||||
|
||||
public int getWordCount() {
|
||||
return wordCount;
|
||||
}
|
||||
|
||||
public void setWordCount(int wordCount) {
|
||||
this.wordCount = wordCount;
|
||||
}
|
||||
|
||||
public long getReadingTimeMinutes() {
|
||||
return readingTimeMinutes;
|
||||
}
|
||||
|
||||
public void setReadingTimeMinutes(long readingTimeMinutes) {
|
||||
this.readingTimeMinutes = readingTimeMinutes;
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -34,6 +34,18 @@ public class SearchResultDto<T> {
|
||||
this.facets = facets;
|
||||
}
|
||||
|
||||
// Simple constructor for basic search results with facet list
|
||||
public SearchResultDto(List<T> results, long totalHits, int resultCount, List<FacetCountDto> facetsList) {
|
||||
this.results = results;
|
||||
this.totalHits = totalHits;
|
||||
this.page = 0;
|
||||
this.perPage = resultCount;
|
||||
this.query = "";
|
||||
this.searchTimeMs = 0;
|
||||
// Convert list to map if needed - for now just set empty map
|
||||
this.facets = java.util.Collections.emptyMap();
|
||||
}
|
||||
|
||||
// Getters and Setters
|
||||
public List<T> getResults() {
|
||||
return results;
|
||||
|
||||
@@ -17,6 +17,7 @@ public class StorySearchDto {
|
||||
|
||||
// Reading status
|
||||
private Boolean isRead;
|
||||
private Integer readingPosition;
|
||||
private LocalDateTime lastReadAt;
|
||||
|
||||
// Author info
|
||||
@@ -33,6 +34,9 @@ public class StorySearchDto {
|
||||
private LocalDateTime createdAt;
|
||||
private LocalDateTime updatedAt;
|
||||
|
||||
// Alias for createdAt to match frontend expectations
|
||||
private LocalDateTime dateAdded;
|
||||
|
||||
// Search-specific fields
|
||||
private double searchScore;
|
||||
private List<String> highlights;
|
||||
@@ -121,6 +125,14 @@ public class StorySearchDto {
|
||||
this.lastReadAt = lastReadAt;
|
||||
}
|
||||
|
||||
public Integer getReadingPosition() {
|
||||
return readingPosition;
|
||||
}
|
||||
|
||||
public void setReadingPosition(Integer readingPosition) {
|
||||
this.readingPosition = readingPosition;
|
||||
}
|
||||
|
||||
public UUID getAuthorId() {
|
||||
return authorId;
|
||||
}
|
||||
@@ -177,6 +189,14 @@ public class StorySearchDto {
|
||||
this.updatedAt = updatedAt;
|
||||
}
|
||||
|
||||
public LocalDateTime getDateAdded() {
|
||||
return dateAdded;
|
||||
}
|
||||
|
||||
public void setDateAdded(LocalDateTime dateAdded) {
|
||||
this.dateAdded = dateAdded;
|
||||
}
|
||||
|
||||
public double getSearchScore() {
|
||||
return searchScore;
|
||||
}
|
||||
|
||||
195
backend/src/main/java/com/storycove/entity/BackupJob.java
Normal file
195
backend/src/main/java/com/storycove/entity/BackupJob.java
Normal file
@@ -0,0 +1,195 @@
|
||||
package com.storycove.entity;
|
||||
|
||||
import jakarta.persistence.*;
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.UUID;
|
||||
|
||||
@Entity
|
||||
@Table(name = "backup_jobs")
|
||||
public class BackupJob {
|
||||
|
||||
@Id
|
||||
@GeneratedValue(strategy = GenerationType.UUID)
|
||||
private UUID id;
|
||||
|
||||
@Column(nullable = false)
|
||||
private String libraryId;
|
||||
|
||||
@Column(nullable = false)
|
||||
@Enumerated(EnumType.STRING)
|
||||
private BackupType type;
|
||||
|
||||
@Column(nullable = false)
|
||||
@Enumerated(EnumType.STRING)
|
||||
private BackupStatus status;
|
||||
|
||||
@Column
|
||||
private String filePath;
|
||||
|
||||
@Column
|
||||
private Long fileSizeBytes;
|
||||
|
||||
@Column
|
||||
private Integer progressPercent;
|
||||
|
||||
@Column(length = 1000)
|
||||
private String errorMessage;
|
||||
|
||||
@Column(nullable = false)
|
||||
private LocalDateTime createdAt;
|
||||
|
||||
@Column
|
||||
private LocalDateTime startedAt;
|
||||
|
||||
@Column
|
||||
private LocalDateTime completedAt;
|
||||
|
||||
@Column
|
||||
private LocalDateTime expiresAt;
|
||||
|
||||
@PrePersist
|
||||
protected void onCreate() {
|
||||
createdAt = LocalDateTime.now();
|
||||
// Backups expire after 24 hours
|
||||
expiresAt = LocalDateTime.now().plusDays(1);
|
||||
}
|
||||
|
||||
// Enums
|
||||
public enum BackupType {
|
||||
DATABASE_ONLY,
|
||||
COMPLETE
|
||||
}
|
||||
|
||||
public enum BackupStatus {
|
||||
PENDING,
|
||||
IN_PROGRESS,
|
||||
COMPLETED,
|
||||
FAILED,
|
||||
EXPIRED
|
||||
}
|
||||
|
||||
// Constructors
|
||||
public BackupJob() {
|
||||
}
|
||||
|
||||
public BackupJob(String libraryId, BackupType type) {
|
||||
this.libraryId = libraryId;
|
||||
this.type = type;
|
||||
this.status = BackupStatus.PENDING;
|
||||
this.progressPercent = 0;
|
||||
}
|
||||
|
||||
// Getters and Setters
|
||||
public UUID getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public void setId(UUID id) {
|
||||
this.id = id;
|
||||
}
|
||||
|
||||
public String getLibraryId() {
|
||||
return libraryId;
|
||||
}
|
||||
|
||||
public void setLibraryId(String libraryId) {
|
||||
this.libraryId = libraryId;
|
||||
}
|
||||
|
||||
public BackupType getType() {
|
||||
return type;
|
||||
}
|
||||
|
||||
public void setType(BackupType type) {
|
||||
this.type = type;
|
||||
}
|
||||
|
||||
public BackupStatus getStatus() {
|
||||
return status;
|
||||
}
|
||||
|
||||
public void setStatus(BackupStatus status) {
|
||||
this.status = status;
|
||||
}
|
||||
|
||||
public String getFilePath() {
|
||||
return filePath;
|
||||
}
|
||||
|
||||
public void setFilePath(String filePath) {
|
||||
this.filePath = filePath;
|
||||
}
|
||||
|
||||
public Long getFileSizeBytes() {
|
||||
return fileSizeBytes;
|
||||
}
|
||||
|
||||
public void setFileSizeBytes(Long fileSizeBytes) {
|
||||
this.fileSizeBytes = fileSizeBytes;
|
||||
}
|
||||
|
||||
public Integer getProgressPercent() {
|
||||
return progressPercent;
|
||||
}
|
||||
|
||||
public void setProgressPercent(Integer progressPercent) {
|
||||
this.progressPercent = progressPercent;
|
||||
}
|
||||
|
||||
public String getErrorMessage() {
|
||||
return errorMessage;
|
||||
}
|
||||
|
||||
public void setErrorMessage(String errorMessage) {
|
||||
this.errorMessage = errorMessage;
|
||||
}
|
||||
|
||||
public LocalDateTime getCreatedAt() {
|
||||
return createdAt;
|
||||
}
|
||||
|
||||
public void setCreatedAt(LocalDateTime createdAt) {
|
||||
this.createdAt = createdAt;
|
||||
}
|
||||
|
||||
public LocalDateTime getStartedAt() {
|
||||
return startedAt;
|
||||
}
|
||||
|
||||
public void setStartedAt(LocalDateTime startedAt) {
|
||||
this.startedAt = startedAt;
|
||||
}
|
||||
|
||||
public LocalDateTime getCompletedAt() {
|
||||
return completedAt;
|
||||
}
|
||||
|
||||
public void setCompletedAt(LocalDateTime completedAt) {
|
||||
this.completedAt = completedAt;
|
||||
}
|
||||
|
||||
public LocalDateTime getExpiresAt() {
|
||||
return expiresAt;
|
||||
}
|
||||
|
||||
public void setExpiresAt(LocalDateTime expiresAt) {
|
||||
this.expiresAt = expiresAt;
|
||||
}
|
||||
|
||||
// Helper methods
|
||||
public boolean isExpired() {
|
||||
return LocalDateTime.now().isAfter(expiresAt);
|
||||
}
|
||||
|
||||
public boolean isCompleted() {
|
||||
return status == BackupStatus.COMPLETED;
|
||||
}
|
||||
|
||||
public boolean isFailed() {
|
||||
return status == BackupStatus.FAILED;
|
||||
}
|
||||
|
||||
public boolean isInProgress() {
|
||||
return status == BackupStatus.IN_PROGRESS;
|
||||
}
|
||||
}
|
||||
130
backend/src/main/java/com/storycove/entity/RefreshToken.java
Normal file
130
backend/src/main/java/com/storycove/entity/RefreshToken.java
Normal file
@@ -0,0 +1,130 @@
|
||||
package com.storycove.entity;
|
||||
|
||||
import jakarta.persistence.*;
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.UUID;
|
||||
|
||||
@Entity
|
||||
@Table(name = "refresh_tokens")
|
||||
public class RefreshToken {
|
||||
|
||||
@Id
|
||||
@GeneratedValue(strategy = GenerationType.UUID)
|
||||
private UUID id;
|
||||
|
||||
@Column(nullable = false, unique = true)
|
||||
private String token;
|
||||
|
||||
@Column(nullable = false)
|
||||
private LocalDateTime expiresAt;
|
||||
|
||||
@Column(nullable = false)
|
||||
private LocalDateTime createdAt;
|
||||
|
||||
@Column
|
||||
private LocalDateTime revokedAt;
|
||||
|
||||
@Column
|
||||
private String libraryId;
|
||||
|
||||
@Column(nullable = false)
|
||||
private String userAgent;
|
||||
|
||||
@Column(nullable = false)
|
||||
private String ipAddress;
|
||||
|
||||
@PrePersist
|
||||
protected void onCreate() {
|
||||
createdAt = LocalDateTime.now();
|
||||
}
|
||||
|
||||
// Constructors
|
||||
public RefreshToken() {
|
||||
}
|
||||
|
||||
public RefreshToken(String token, LocalDateTime expiresAt, String libraryId, String userAgent, String ipAddress) {
|
||||
this.token = token;
|
||||
this.expiresAt = expiresAt;
|
||||
this.libraryId = libraryId;
|
||||
this.userAgent = userAgent;
|
||||
this.ipAddress = ipAddress;
|
||||
}
|
||||
|
||||
// Getters and Setters
|
||||
public UUID getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public void setId(UUID id) {
|
||||
this.id = id;
|
||||
}
|
||||
|
||||
public String getToken() {
|
||||
return token;
|
||||
}
|
||||
|
||||
public void setToken(String token) {
|
||||
this.token = token;
|
||||
}
|
||||
|
||||
public LocalDateTime getExpiresAt() {
|
||||
return expiresAt;
|
||||
}
|
||||
|
||||
public void setExpiresAt(LocalDateTime expiresAt) {
|
||||
this.expiresAt = expiresAt;
|
||||
}
|
||||
|
||||
public LocalDateTime getCreatedAt() {
|
||||
return createdAt;
|
||||
}
|
||||
|
||||
public void setCreatedAt(LocalDateTime createdAt) {
|
||||
this.createdAt = createdAt;
|
||||
}
|
||||
|
||||
public LocalDateTime getRevokedAt() {
|
||||
return revokedAt;
|
||||
}
|
||||
|
||||
public void setRevokedAt(LocalDateTime revokedAt) {
|
||||
this.revokedAt = revokedAt;
|
||||
}
|
||||
|
||||
public String getLibraryId() {
|
||||
return libraryId;
|
||||
}
|
||||
|
||||
public void setLibraryId(String libraryId) {
|
||||
this.libraryId = libraryId;
|
||||
}
|
||||
|
||||
public String getUserAgent() {
|
||||
return userAgent;
|
||||
}
|
||||
|
||||
public void setUserAgent(String userAgent) {
|
||||
this.userAgent = userAgent;
|
||||
}
|
||||
|
||||
public String getIpAddress() {
|
||||
return ipAddress;
|
||||
}
|
||||
|
||||
public void setIpAddress(String ipAddress) {
|
||||
this.ipAddress = ipAddress;
|
||||
}
|
||||
|
||||
// Helper methods
|
||||
public boolean isExpired() {
|
||||
return LocalDateTime.now().isAfter(expiresAt);
|
||||
}
|
||||
|
||||
public boolean isRevoked() {
|
||||
return revokedAt != null;
|
||||
}
|
||||
|
||||
public boolean isValid() {
|
||||
return !isExpired() && !isRevoked();
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,34 @@
|
||||
package com.storycove.event;
|
||||
|
||||
import org.springframework.context.ApplicationEvent;
|
||||
|
||||
import java.util.UUID;
|
||||
|
||||
/**
|
||||
* Event published when a story's content is created or updated
|
||||
*/
|
||||
public class StoryContentUpdatedEvent extends ApplicationEvent {
|
||||
|
||||
private final UUID storyId;
|
||||
private final String contentHtml;
|
||||
private final boolean isNewStory;
|
||||
|
||||
public StoryContentUpdatedEvent(Object source, UUID storyId, String contentHtml, boolean isNewStory) {
|
||||
super(source);
|
||||
this.storyId = storyId;
|
||||
this.contentHtml = contentHtml;
|
||||
this.isNewStory = isNewStory;
|
||||
}
|
||||
|
||||
public UUID getStoryId() {
|
||||
return storyId;
|
||||
}
|
||||
|
||||
public String getContentHtml() {
|
||||
return contentHtml;
|
||||
}
|
||||
|
||||
public boolean isNewStory() {
|
||||
return isNewStory;
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,25 @@
|
||||
package com.storycove.repository;
|
||||
|
||||
import com.storycove.entity.BackupJob;
|
||||
import org.springframework.data.jpa.repository.JpaRepository;
|
||||
import org.springframework.data.jpa.repository.Modifying;
|
||||
import org.springframework.data.jpa.repository.Query;
|
||||
import org.springframework.data.repository.query.Param;
|
||||
import org.springframework.stereotype.Repository;
|
||||
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.List;
|
||||
import java.util.UUID;
|
||||
|
||||
@Repository
|
||||
public interface BackupJobRepository extends JpaRepository<BackupJob, UUID> {
|
||||
|
||||
List<BackupJob> findByLibraryIdOrderByCreatedAtDesc(String libraryId);
|
||||
|
||||
@Query("SELECT bj FROM BackupJob bj WHERE bj.expiresAt < :now AND bj.status = 'COMPLETED'")
|
||||
List<BackupJob> findExpiredJobs(@Param("now") LocalDateTime now);
|
||||
|
||||
@Modifying
|
||||
@Query("UPDATE BackupJob bj SET bj.status = 'EXPIRED' WHERE bj.expiresAt < :now AND bj.status = 'COMPLETED'")
|
||||
int markExpiredJobs(@Param("now") LocalDateTime now);
|
||||
}
|
||||
@@ -0,0 +1,30 @@
|
||||
package com.storycove.repository;
|
||||
|
||||
import com.storycove.entity.RefreshToken;
|
||||
import org.springframework.data.jpa.repository.JpaRepository;
|
||||
import org.springframework.data.jpa.repository.Modifying;
|
||||
import org.springframework.data.jpa.repository.Query;
|
||||
import org.springframework.data.repository.query.Param;
|
||||
import org.springframework.stereotype.Repository;
|
||||
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.Optional;
|
||||
import java.util.UUID;
|
||||
|
||||
@Repository
|
||||
public interface RefreshTokenRepository extends JpaRepository<RefreshToken, UUID> {
|
||||
|
||||
Optional<RefreshToken> findByToken(String token);
|
||||
|
||||
@Modifying
|
||||
@Query("DELETE FROM RefreshToken rt WHERE rt.expiresAt < :now")
|
||||
void deleteExpiredTokens(@Param("now") LocalDateTime now);
|
||||
|
||||
@Modifying
|
||||
@Query("UPDATE RefreshToken rt SET rt.revokedAt = :now WHERE rt.libraryId = :libraryId AND rt.revokedAt IS NULL")
|
||||
void revokeAllByLibraryId(@Param("libraryId") String libraryId, @Param("now") LocalDateTime now);
|
||||
|
||||
@Modifying
|
||||
@Query("UPDATE RefreshToken rt SET rt.revokedAt = :now WHERE rt.revokedAt IS NULL")
|
||||
void revokeAll(@Param("now") LocalDateTime now);
|
||||
}
|
||||
@@ -87,6 +87,9 @@ public interface StoryRepository extends JpaRepository<Story, UUID> {
|
||||
@Query("SELECT COUNT(s) FROM Story s WHERE s.createdAt >= :since")
|
||||
long countStoriesCreatedSince(@Param("since") LocalDateTime since);
|
||||
|
||||
@Query("SELECT COUNT(s) FROM Story s WHERE s.createdAt >= :since OR s.updatedAt >= :since")
|
||||
long countStoriesModifiedAfter(@Param("since") LocalDateTime since);
|
||||
|
||||
@Query("SELECT AVG(s.wordCount) FROM Story s")
|
||||
Double findAverageWordCount();
|
||||
|
||||
|
||||
@@ -1,84 +0,0 @@
|
||||
package com.storycove.scheduled;
|
||||
|
||||
import com.storycove.entity.Story;
|
||||
import com.storycove.service.StoryService;
|
||||
import com.storycove.service.TypesenseService;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
|
||||
import org.springframework.scheduling.annotation.Scheduled;
|
||||
import org.springframework.stereotype.Component;
|
||||
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.List;
|
||||
|
||||
/**
|
||||
* Scheduled task to periodically reindex all stories in Typesense
|
||||
* to ensure search index stays synchronized with database changes.
|
||||
*/
|
||||
@Component
|
||||
@ConditionalOnProperty(name = "storycove.typesense.enabled", havingValue = "true", matchIfMissing = true)
|
||||
public class TypesenseIndexScheduler {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(TypesenseIndexScheduler.class);
|
||||
|
||||
private final StoryService storyService;
|
||||
private final TypesenseService typesenseService;
|
||||
|
||||
@Autowired
|
||||
public TypesenseIndexScheduler(StoryService storyService,
|
||||
@Autowired(required = false) TypesenseService typesenseService) {
|
||||
this.storyService = storyService;
|
||||
this.typesenseService = typesenseService;
|
||||
}
|
||||
|
||||
/**
|
||||
* Scheduled task that runs periodically to reindex all stories in Typesense.
|
||||
* This ensures the search index stays synchronized with any database changes
|
||||
* that might have occurred outside of the normal story update flow.
|
||||
*
|
||||
* Interval is configurable via storycove.typesense.reindex-interval property (default: 1 hour).
|
||||
*/
|
||||
@Scheduled(fixedRateString = "${storycove.typesense.reindex-interval:3600000}")
|
||||
public void reindexAllStories() {
|
||||
if (typesenseService == null) {
|
||||
logger.debug("TypesenseService is not available, skipping scheduled reindexing");
|
||||
return;
|
||||
}
|
||||
|
||||
logger.info("Starting scheduled Typesense reindexing at {}", LocalDateTime.now());
|
||||
|
||||
try {
|
||||
long startTime = System.currentTimeMillis();
|
||||
|
||||
// Get all stories from database with eagerly loaded associations
|
||||
List<Story> allStories = storyService.findAllWithAssociations();
|
||||
|
||||
if (allStories.isEmpty()) {
|
||||
logger.info("No stories found in database, skipping reindexing");
|
||||
return;
|
||||
}
|
||||
|
||||
// Perform full reindex
|
||||
typesenseService.reindexAllStories(allStories);
|
||||
|
||||
long endTime = System.currentTimeMillis();
|
||||
long duration = endTime - startTime;
|
||||
|
||||
logger.info("Completed scheduled Typesense reindexing of {} stories in {}ms",
|
||||
allStories.size(), duration);
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to complete scheduled Typesense reindexing", e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Manual trigger for reindexing - can be called from other services or endpoints if needed
|
||||
*/
|
||||
public void triggerManualReindex() {
|
||||
logger.info("Manual Typesense reindexing triggered");
|
||||
reindexAllStories();
|
||||
}
|
||||
}
|
||||
@@ -1,11 +1,14 @@
|
||||
package com.storycove.security;
|
||||
|
||||
import com.storycove.service.LibraryService;
|
||||
import com.storycove.util.JwtUtil;
|
||||
import jakarta.servlet.FilterChain;
|
||||
import jakarta.servlet.ServletException;
|
||||
import jakarta.servlet.http.Cookie;
|
||||
import jakarta.servlet.http.HttpServletRequest;
|
||||
import jakarta.servlet.http.HttpServletResponse;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.security.authentication.UsernamePasswordAuthenticationToken;
|
||||
import org.springframework.security.core.context.SecurityContextHolder;
|
||||
import org.springframework.security.web.authentication.WebAuthenticationDetailsSource;
|
||||
@@ -18,10 +21,14 @@ import java.util.ArrayList;
|
||||
@Component
|
||||
public class JwtAuthenticationFilter extends OncePerRequestFilter {
|
||||
|
||||
private final JwtUtil jwtUtil;
|
||||
private static final Logger logger = LoggerFactory.getLogger(JwtAuthenticationFilter.class);
|
||||
|
||||
public JwtAuthenticationFilter(JwtUtil jwtUtil) {
|
||||
private final JwtUtil jwtUtil;
|
||||
private final LibraryService libraryService;
|
||||
|
||||
public JwtAuthenticationFilter(JwtUtil jwtUtil, LibraryService libraryService) {
|
||||
this.jwtUtil = jwtUtil;
|
||||
this.libraryService = libraryService;
|
||||
}
|
||||
|
||||
@Override
|
||||
@@ -53,6 +60,28 @@ public class JwtAuthenticationFilter extends OncePerRequestFilter {
|
||||
if (token != null && jwtUtil.validateToken(token) && !jwtUtil.isTokenExpired(token)) {
|
||||
String subject = jwtUtil.getSubjectFromToken(token);
|
||||
|
||||
// Check if we need to switch libraries based on token's library ID
|
||||
try {
|
||||
String tokenLibraryId = jwtUtil.getLibraryIdFromToken(token);
|
||||
String currentLibraryId = libraryService.getCurrentLibraryId();
|
||||
|
||||
// Switch library if token's library differs from current library
|
||||
// This handles cross-device library switching automatically
|
||||
if (tokenLibraryId != null && !tokenLibraryId.equals(currentLibraryId)) {
|
||||
logger.info("Token library '{}' differs from current library '{}', switching libraries",
|
||||
tokenLibraryId, currentLibraryId);
|
||||
libraryService.switchToLibraryAfterAuthentication(tokenLibraryId);
|
||||
} else if (currentLibraryId == null && tokenLibraryId != null) {
|
||||
// Handle case after backend restart where no library is active
|
||||
logger.info("No active library, switching to token's library: {}", tokenLibraryId);
|
||||
libraryService.switchToLibraryAfterAuthentication(tokenLibraryId);
|
||||
}
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to switch library from token: {}", e.getMessage());
|
||||
// Don't fail the request - authentication can still proceed
|
||||
// but user might see wrong library data until next login
|
||||
}
|
||||
|
||||
if (subject != null && SecurityContextHolder.getContext().getAuthentication() == null) {
|
||||
UsernamePasswordAuthenticationToken authToken =
|
||||
new UsernamePasswordAuthenticationToken(subject, null, new ArrayList<>());
|
||||
|
||||
@@ -0,0 +1,125 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.entity.BackupJob;
|
||||
import com.storycove.repository.BackupJobRepository;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.core.io.Resource;
|
||||
import org.springframework.scheduling.annotation.Async;
|
||||
import org.springframework.stereotype.Service;
|
||||
import org.springframework.transaction.annotation.Propagation;
|
||||
import org.springframework.transaction.annotation.Transactional;
|
||||
|
||||
import java.nio.file.Files;
|
||||
import java.nio.file.Path;
|
||||
import java.nio.file.Paths;
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.Optional;
|
||||
import java.util.UUID;
|
||||
|
||||
/**
|
||||
* Separate service for async backup execution.
|
||||
* This is needed because @Async doesn't work when called from within the same class.
|
||||
*/
|
||||
@Service
|
||||
public class AsyncBackupExecutor {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(AsyncBackupExecutor.class);
|
||||
|
||||
@Value("${storycove.upload.dir:/app/images}")
|
||||
private String uploadDir;
|
||||
|
||||
@Autowired
|
||||
private BackupJobRepository backupJobRepository;
|
||||
|
||||
@Autowired
|
||||
private DatabaseManagementService databaseManagementService;
|
||||
|
||||
@Autowired
|
||||
private LibraryService libraryService;
|
||||
|
||||
/**
|
||||
* Execute backup asynchronously.
|
||||
* This method MUST be in a separate service class for @Async to work properly.
|
||||
*/
|
||||
@Async
|
||||
@Transactional(propagation = Propagation.REQUIRES_NEW)
|
||||
public void executeBackupAsync(UUID jobId) {
|
||||
logger.info("Async executor starting for job {}", jobId);
|
||||
|
||||
Optional<BackupJob> jobOpt = backupJobRepository.findById(jobId);
|
||||
if (jobOpt.isEmpty()) {
|
||||
logger.error("Backup job not found: {}", jobId);
|
||||
return;
|
||||
}
|
||||
|
||||
BackupJob job = jobOpt.get();
|
||||
job.setStatus(BackupJob.BackupStatus.IN_PROGRESS);
|
||||
job.setStartedAt(LocalDateTime.now());
|
||||
job.setProgressPercent(0);
|
||||
backupJobRepository.save(job);
|
||||
|
||||
try {
|
||||
logger.info("Starting backup job {} for library {}", job.getId(), job.getLibraryId());
|
||||
|
||||
// Switch to the correct library
|
||||
if (!job.getLibraryId().equals(libraryService.getCurrentLibraryId())) {
|
||||
libraryService.switchToLibraryAfterAuthentication(job.getLibraryId());
|
||||
}
|
||||
|
||||
// Create backup file
|
||||
Path backupDir = Paths.get(uploadDir, "backups", job.getLibraryId());
|
||||
Files.createDirectories(backupDir);
|
||||
|
||||
String filename = String.format("backup_%s_%s.%s",
|
||||
job.getId().toString(),
|
||||
LocalDateTime.now().toString().replaceAll(":", "-"),
|
||||
job.getType() == BackupJob.BackupType.COMPLETE ? "zip" : "sql");
|
||||
|
||||
Path backupFile = backupDir.resolve(filename);
|
||||
|
||||
job.setProgressPercent(10);
|
||||
backupJobRepository.save(job);
|
||||
|
||||
// Create the backup
|
||||
Resource backupResource;
|
||||
if (job.getType() == BackupJob.BackupType.COMPLETE) {
|
||||
backupResource = databaseManagementService.createCompleteBackup();
|
||||
} else {
|
||||
backupResource = databaseManagementService.createBackup();
|
||||
}
|
||||
|
||||
job.setProgressPercent(80);
|
||||
backupJobRepository.save(job);
|
||||
|
||||
// Copy resource to permanent file
|
||||
try (var inputStream = backupResource.getInputStream();
|
||||
var outputStream = Files.newOutputStream(backupFile)) {
|
||||
inputStream.transferTo(outputStream);
|
||||
}
|
||||
|
||||
job.setProgressPercent(95);
|
||||
backupJobRepository.save(job);
|
||||
|
||||
// Set file info
|
||||
job.setFilePath(backupFile.toString());
|
||||
job.setFileSizeBytes(Files.size(backupFile));
|
||||
job.setStatus(BackupJob.BackupStatus.COMPLETED);
|
||||
job.setCompletedAt(LocalDateTime.now());
|
||||
job.setProgressPercent(100);
|
||||
|
||||
logger.info("Backup job {} completed successfully. File size: {} bytes",
|
||||
job.getId(), job.getFileSizeBytes());
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Backup job {} failed", job.getId(), e);
|
||||
job.setStatus(BackupJob.BackupStatus.FAILED);
|
||||
job.setErrorMessage(e.getMessage());
|
||||
job.setCompletedAt(LocalDateTime.now());
|
||||
} finally {
|
||||
backupJobRepository.save(job);
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,167 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.entity.BackupJob;
|
||||
import com.storycove.repository.BackupJobRepository;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.core.io.FileSystemResource;
|
||||
import org.springframework.core.io.Resource;
|
||||
import org.springframework.scheduling.annotation.Scheduled;
|
||||
import org.springframework.stereotype.Service;
|
||||
import org.springframework.transaction.annotation.Transactional;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.nio.file.Files;
|
||||
import java.nio.file.Path;
|
||||
import java.nio.file.Paths;
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.List;
|
||||
import java.util.Optional;
|
||||
import java.util.UUID;
|
||||
|
||||
@Service
|
||||
public class AsyncBackupService {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(AsyncBackupService.class);
|
||||
|
||||
@Value("${storycove.upload.dir:/app/images}")
|
||||
private String uploadDir;
|
||||
|
||||
@Autowired
|
||||
private BackupJobRepository backupJobRepository;
|
||||
|
||||
@Autowired
|
||||
private AsyncBackupExecutor asyncBackupExecutor;
|
||||
|
||||
/**
|
||||
* Start a backup job asynchronously.
|
||||
* This method returns immediately after creating the job record.
|
||||
*/
|
||||
@Transactional
|
||||
public BackupJob startBackupJob(String libraryId, BackupJob.BackupType type) {
|
||||
logger.info("Creating backup job for library: {}, type: {}", libraryId, type);
|
||||
|
||||
BackupJob job = new BackupJob(libraryId, type);
|
||||
job = backupJobRepository.save(job);
|
||||
|
||||
logger.info("Backup job created with ID: {}. Starting async execution...", job.getId());
|
||||
|
||||
// Start backup in background using separate service (ensures @Async works properly)
|
||||
asyncBackupExecutor.executeBackupAsync(job.getId());
|
||||
|
||||
logger.info("Async backup execution triggered for job: {}", job.getId());
|
||||
|
||||
return job;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get backup job status
|
||||
*/
|
||||
public Optional<BackupJob> getJobStatus(UUID jobId) {
|
||||
return backupJobRepository.findById(jobId);
|
||||
}
|
||||
|
||||
/**
|
||||
* Get backup file for download
|
||||
*/
|
||||
public Resource getBackupFile(UUID jobId) throws IOException {
|
||||
Optional<BackupJob> jobOpt = backupJobRepository.findById(jobId);
|
||||
if (jobOpt.isEmpty()) {
|
||||
throw new IOException("Backup job not found");
|
||||
}
|
||||
|
||||
BackupJob job = jobOpt.get();
|
||||
|
||||
if (!job.isCompleted()) {
|
||||
throw new IOException("Backup is not completed yet");
|
||||
}
|
||||
|
||||
if (job.isExpired()) {
|
||||
throw new IOException("Backup has expired");
|
||||
}
|
||||
|
||||
if (job.getFilePath() == null) {
|
||||
throw new IOException("Backup file path not set");
|
||||
}
|
||||
|
||||
Path backupPath = Paths.get(job.getFilePath());
|
||||
if (!Files.exists(backupPath)) {
|
||||
throw new IOException("Backup file not found");
|
||||
}
|
||||
|
||||
return new FileSystemResource(backupPath);
|
||||
}
|
||||
|
||||
/**
|
||||
* List backup jobs for a library
|
||||
*/
|
||||
public List<BackupJob> listBackupJobs(String libraryId) {
|
||||
return backupJobRepository.findByLibraryIdOrderByCreatedAtDesc(libraryId);
|
||||
}
|
||||
|
||||
/**
|
||||
* Clean up expired backup jobs and their files
|
||||
* Runs daily at 2 AM
|
||||
*/
|
||||
@Scheduled(cron = "0 0 2 * * ?")
|
||||
@Transactional
|
||||
public void cleanupExpiredBackups() {
|
||||
logger.info("Starting cleanup of expired backups");
|
||||
|
||||
LocalDateTime now = LocalDateTime.now();
|
||||
|
||||
// Mark expired jobs
|
||||
int markedCount = backupJobRepository.markExpiredJobs(now);
|
||||
logger.info("Marked {} jobs as expired", markedCount);
|
||||
|
||||
// Find all expired jobs to delete their files
|
||||
List<BackupJob> expiredJobs = backupJobRepository.findExpiredJobs(now);
|
||||
|
||||
for (BackupJob job : expiredJobs) {
|
||||
if (job.getFilePath() != null) {
|
||||
try {
|
||||
Path filePath = Paths.get(job.getFilePath());
|
||||
if (Files.exists(filePath)) {
|
||||
Files.delete(filePath);
|
||||
logger.info("Deleted expired backup file: {}", filePath);
|
||||
}
|
||||
} catch (IOException e) {
|
||||
logger.warn("Failed to delete expired backup file: {}", job.getFilePath(), e);
|
||||
}
|
||||
}
|
||||
|
||||
// Delete the job record
|
||||
backupJobRepository.delete(job);
|
||||
}
|
||||
|
||||
logger.info("Cleanup completed. Deleted {} expired backups", expiredJobs.size());
|
||||
}
|
||||
|
||||
/**
|
||||
* Delete a specific backup job and its file
|
||||
*/
|
||||
@Transactional
|
||||
public void deleteBackupJob(UUID jobId) throws IOException {
|
||||
Optional<BackupJob> jobOpt = backupJobRepository.findById(jobId);
|
||||
if (jobOpt.isEmpty()) {
|
||||
throw new IOException("Backup job not found");
|
||||
}
|
||||
|
||||
BackupJob job = jobOpt.get();
|
||||
|
||||
// Delete file if it exists
|
||||
if (job.getFilePath() != null) {
|
||||
Path filePath = Paths.get(job.getFilePath());
|
||||
if (Files.exists(filePath)) {
|
||||
Files.delete(filePath);
|
||||
logger.info("Deleted backup file: {}", filePath);
|
||||
}
|
||||
}
|
||||
|
||||
// Delete job record
|
||||
backupJobRepository.delete(job);
|
||||
logger.info("Deleted backup job: {}", jobId);
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,169 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.scheduling.annotation.Async;
|
||||
import org.springframework.stereotype.Service;
|
||||
|
||||
import java.util.UUID;
|
||||
import java.util.concurrent.CompletableFuture;
|
||||
import java.util.regex.Matcher;
|
||||
import java.util.regex.Pattern;
|
||||
|
||||
@Service
|
||||
public class AsyncImageProcessingService {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(AsyncImageProcessingService.class);
|
||||
|
||||
private final ImageService imageService;
|
||||
private final StoryService storyService;
|
||||
private final ImageProcessingProgressService progressService;
|
||||
|
||||
@org.springframework.beans.factory.annotation.Value("${storycove.app.public-url:http://localhost:6925}")
|
||||
private String publicUrl;
|
||||
|
||||
@Autowired
|
||||
public AsyncImageProcessingService(ImageService imageService,
|
||||
StoryService storyService,
|
||||
ImageProcessingProgressService progressService) {
|
||||
this.imageService = imageService;
|
||||
this.storyService = storyService;
|
||||
this.progressService = progressService;
|
||||
}
|
||||
|
||||
@Async
|
||||
public CompletableFuture<Void> processStoryImagesAsync(UUID storyId, String contentHtml) {
|
||||
logger.info("Starting async image processing for story: {}", storyId);
|
||||
|
||||
try {
|
||||
// Count external images first
|
||||
int externalImageCount = countExternalImages(contentHtml);
|
||||
|
||||
if (externalImageCount == 0) {
|
||||
logger.debug("No external images found for story {}", storyId);
|
||||
return CompletableFuture.completedFuture(null);
|
||||
}
|
||||
|
||||
// Start progress tracking
|
||||
ImageProcessingProgressService.ImageProcessingProgress progress =
|
||||
progressService.startProgress(storyId, externalImageCount);
|
||||
|
||||
// Process images with progress updates
|
||||
ImageService.ContentImageProcessingResult result =
|
||||
processImagesWithProgress(contentHtml, storyId, progress);
|
||||
|
||||
// Update story with processed content if changed
|
||||
if (!result.getProcessedContent().equals(contentHtml)) {
|
||||
progressService.updateProgress(storyId, progress.getTotalImages(),
|
||||
"Saving processed content", "Updating story content");
|
||||
|
||||
storyService.updateContentOnly(storyId, result.getProcessedContent());
|
||||
|
||||
progressService.completeProgress(storyId,
|
||||
String.format("Completed: %d images processed", result.getDownloadedImages().size()));
|
||||
|
||||
logger.info("Async image processing completed for story {}: {} images processed",
|
||||
storyId, result.getDownloadedImages().size());
|
||||
} else {
|
||||
progressService.completeProgress(storyId, "Completed: No images needed processing");
|
||||
}
|
||||
|
||||
// Clean up progress after a delay to allow frontend to see completion
|
||||
CompletableFuture.runAsync(() -> {
|
||||
try {
|
||||
Thread.sleep(5000); // 5 seconds delay
|
||||
progressService.removeProgress(storyId);
|
||||
} catch (InterruptedException e) {
|
||||
Thread.currentThread().interrupt();
|
||||
}
|
||||
});
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Async image processing failed for story {}: {}", storyId, e.getMessage(), e);
|
||||
progressService.setError(storyId, e.getMessage());
|
||||
}
|
||||
|
||||
return CompletableFuture.completedFuture(null);
|
||||
}
|
||||
|
||||
private int countExternalImages(String contentHtml) {
|
||||
if (contentHtml == null || contentHtml.trim().isEmpty()) {
|
||||
return 0;
|
||||
}
|
||||
|
||||
Pattern imgPattern = Pattern.compile("<img[^>]+src=[\"']([^\"']+)[\"'][^>]*>", Pattern.CASE_INSENSITIVE);
|
||||
Matcher matcher = imgPattern.matcher(contentHtml);
|
||||
|
||||
int count = 0;
|
||||
while (matcher.find()) {
|
||||
String src = matcher.group(1);
|
||||
if (isExternalUrl(src)) {
|
||||
count++;
|
||||
}
|
||||
}
|
||||
|
||||
return count;
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if a URL is external (not from this application).
|
||||
* Returns true if the URL should be downloaded, false if it's already local.
|
||||
*/
|
||||
private boolean isExternalUrl(String url) {
|
||||
if (url == null || url.trim().isEmpty()) {
|
||||
return false;
|
||||
}
|
||||
|
||||
// Skip data URLs
|
||||
if (url.startsWith("data:")) {
|
||||
return false;
|
||||
}
|
||||
|
||||
// Skip relative URLs (local paths)
|
||||
if (url.startsWith("/")) {
|
||||
return false;
|
||||
}
|
||||
|
||||
// Skip URLs that are already pointing to our API
|
||||
if (url.contains("/api/files/images/")) {
|
||||
return false;
|
||||
}
|
||||
|
||||
// Check if URL starts with the public URL (our own domain)
|
||||
if (publicUrl != null && !publicUrl.trim().isEmpty()) {
|
||||
String normalizedUrl = url.trim().toLowerCase();
|
||||
String normalizedPublicUrl = publicUrl.trim().toLowerCase();
|
||||
|
||||
// Remove trailing slash from public URL for comparison
|
||||
if (normalizedPublicUrl.endsWith("/")) {
|
||||
normalizedPublicUrl = normalizedPublicUrl.substring(0, normalizedPublicUrl.length() - 1);
|
||||
}
|
||||
|
||||
if (normalizedUrl.startsWith(normalizedPublicUrl)) {
|
||||
logger.debug("URL is from this application (matches publicUrl): {}", url);
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
// If it's an HTTP(S) URL that didn't match our filters, it's external
|
||||
if (url.startsWith("http://") || url.startsWith("https://")) {
|
||||
logger.debug("URL is external: {}", url);
|
||||
return true;
|
||||
}
|
||||
|
||||
// For any other format, consider it non-external (safer default)
|
||||
return false;
|
||||
}
|
||||
|
||||
private ImageService.ContentImageProcessingResult processImagesWithProgress(
|
||||
String contentHtml, UUID storyId, ImageProcessingProgressService.ImageProcessingProgress progress) {
|
||||
|
||||
// Use a custom version of processContentImages that provides progress callbacks
|
||||
return imageService.processContentImagesWithProgress(contentHtml, storyId,
|
||||
(currentUrl, processedCount, totalCount) -> {
|
||||
progressService.updateProgress(storyId, processedCount, currentUrl,
|
||||
String.format("Processing image %d of %d", processedCount + 1, totalCount));
|
||||
});
|
||||
}
|
||||
}
|
||||
@@ -11,21 +11,21 @@ import org.springframework.stereotype.Component;
|
||||
import java.util.List;
|
||||
|
||||
@Component
|
||||
@ConditionalOnProperty(name = "storycove.typesense.enabled", havingValue = "true", matchIfMissing = true)
|
||||
@ConditionalOnProperty(name = "storycove.search.enabled", havingValue = "true", matchIfMissing = true)
|
||||
public class AuthorIndexScheduler {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(AuthorIndexScheduler.class);
|
||||
|
||||
private final AuthorService authorService;
|
||||
private final TypesenseService typesenseService;
|
||||
private final SearchServiceAdapter searchServiceAdapter;
|
||||
|
||||
@Autowired
|
||||
public AuthorIndexScheduler(AuthorService authorService, TypesenseService typesenseService) {
|
||||
public AuthorIndexScheduler(AuthorService authorService, SearchServiceAdapter searchServiceAdapter) {
|
||||
this.authorService = authorService;
|
||||
this.typesenseService = typesenseService;
|
||||
this.searchServiceAdapter = searchServiceAdapter;
|
||||
}
|
||||
|
||||
@Scheduled(fixedRateString = "${storycove.typesense.author-reindex-interval:7200000}") // 2 hours default
|
||||
@Scheduled(fixedRateString = "${storycove.search.author-reindex-interval:7200000}") // 2 hours default
|
||||
public void reindexAllAuthors() {
|
||||
try {
|
||||
logger.info("Starting scheduled author reindexing...");
|
||||
@@ -34,7 +34,7 @@ public class AuthorIndexScheduler {
|
||||
logger.info("Found {} authors to reindex", allAuthors.size());
|
||||
|
||||
if (!allAuthors.isEmpty()) {
|
||||
typesenseService.reindexAllAuthors(allAuthors);
|
||||
searchServiceAdapter.bulkIndexAuthors(allAuthors);
|
||||
logger.info("Successfully completed scheduled author reindexing");
|
||||
} else {
|
||||
logger.info("No authors found to reindex");
|
||||
|
||||
@@ -28,12 +28,12 @@ public class AuthorService {
|
||||
private static final Logger logger = LoggerFactory.getLogger(AuthorService.class);
|
||||
|
||||
private final AuthorRepository authorRepository;
|
||||
private final TypesenseService typesenseService;
|
||||
private final SearchServiceAdapter searchServiceAdapter;
|
||||
|
||||
@Autowired
|
||||
public AuthorService(AuthorRepository authorRepository, @Autowired(required = false) TypesenseService typesenseService) {
|
||||
public AuthorService(AuthorRepository authorRepository, SearchServiceAdapter searchServiceAdapter) {
|
||||
this.authorRepository = authorRepository;
|
||||
this.typesenseService = typesenseService;
|
||||
this.searchServiceAdapter = searchServiceAdapter;
|
||||
}
|
||||
|
||||
@Transactional(readOnly = true)
|
||||
@@ -132,14 +132,8 @@ public class AuthorService {
|
||||
validateAuthorForCreate(author);
|
||||
Author savedAuthor = authorRepository.save(author);
|
||||
|
||||
// Index in Typesense
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.indexAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to index author in Typesense: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
// Index in Solr
|
||||
searchServiceAdapter.indexAuthor(savedAuthor);
|
||||
|
||||
return savedAuthor;
|
||||
}
|
||||
@@ -156,14 +150,8 @@ public class AuthorService {
|
||||
updateAuthorFields(existingAuthor, authorUpdates);
|
||||
Author savedAuthor = authorRepository.save(existingAuthor);
|
||||
|
||||
// Update in Typesense
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
// Update in Solr
|
||||
searchServiceAdapter.updateAuthor(savedAuthor);
|
||||
|
||||
return savedAuthor;
|
||||
}
|
||||
@@ -178,14 +166,8 @@ public class AuthorService {
|
||||
|
||||
authorRepository.delete(author);
|
||||
|
||||
// Remove from Typesense
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.deleteAuthor(id.toString());
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to delete author from Typesense: " + author.getName(), e);
|
||||
}
|
||||
}
|
||||
// Remove from Solr
|
||||
searchServiceAdapter.deleteAuthor(id);
|
||||
}
|
||||
|
||||
public Author addUrl(UUID id, String url) {
|
||||
@@ -193,14 +175,8 @@ public class AuthorService {
|
||||
author.addUrl(url);
|
||||
Author savedAuthor = authorRepository.save(author);
|
||||
|
||||
// Update in Typesense
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after adding URL: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
// Update in Solr
|
||||
searchServiceAdapter.updateAuthor(savedAuthor);
|
||||
|
||||
return savedAuthor;
|
||||
}
|
||||
@@ -210,14 +186,8 @@ public class AuthorService {
|
||||
author.removeUrl(url);
|
||||
Author savedAuthor = authorRepository.save(author);
|
||||
|
||||
// Update in Typesense
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after removing URL: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
// Update in Solr
|
||||
searchServiceAdapter.updateAuthor(savedAuthor);
|
||||
|
||||
return savedAuthor;
|
||||
}
|
||||
@@ -251,14 +221,8 @@ public class AuthorService {
|
||||
logger.debug("Saved author rating: {} for author: {}",
|
||||
refreshedAuthor.getAuthorRating(), refreshedAuthor.getName());
|
||||
|
||||
// Update in Typesense
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(refreshedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after rating: " + refreshedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
// Update in Solr
|
||||
searchServiceAdapter.updateAuthor(refreshedAuthor);
|
||||
|
||||
return refreshedAuthor;
|
||||
}
|
||||
@@ -301,14 +265,8 @@ public class AuthorService {
|
||||
author.setAvatarImagePath(avatarPath);
|
||||
Author savedAuthor = authorRepository.save(author);
|
||||
|
||||
// Update in Typesense
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after setting avatar: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
// Update in Solr
|
||||
searchServiceAdapter.updateAuthor(savedAuthor);
|
||||
|
||||
return savedAuthor;
|
||||
}
|
||||
@@ -318,14 +276,8 @@ public class AuthorService {
|
||||
author.setAvatarImagePath(null);
|
||||
Author savedAuthor = authorRepository.save(author);
|
||||
|
||||
// Update in Typesense
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after removing avatar: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
// Update in Solr
|
||||
searchServiceAdapter.updateAuthor(savedAuthor);
|
||||
|
||||
return savedAuthor;
|
||||
}
|
||||
|
||||
@@ -0,0 +1,262 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.repository.StoryRepository;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.core.io.Resource;
|
||||
import org.springframework.scheduling.annotation.Scheduled;
|
||||
import org.springframework.stereotype.Service;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.nio.file.Files;
|
||||
import java.nio.file.Path;
|
||||
import java.nio.file.Paths;
|
||||
import java.time.LocalDateTime;
|
||||
import java.time.format.DateTimeFormatter;
|
||||
import java.util.Comparator;
|
||||
import java.util.List;
|
||||
import java.util.stream.Collectors;
|
||||
import java.util.stream.Stream;
|
||||
|
||||
/**
|
||||
* Service for automatic daily backups.
|
||||
* Runs at 4 AM daily and creates a backup if content has changed since last backup.
|
||||
* Keeps maximum of 5 backups, rotating old ones out.
|
||||
*/
|
||||
@Service
|
||||
public class AutomaticBackupService {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(AutomaticBackupService.class);
|
||||
private static final int MAX_BACKUPS = 5;
|
||||
private static final DateTimeFormatter FILENAME_FORMATTER = DateTimeFormatter.ofPattern("yyyy-MM-dd_HH-mm-ss");
|
||||
|
||||
@Value("${storycove.automatic-backup.dir:/app/automatic-backups}")
|
||||
private String automaticBackupDir;
|
||||
|
||||
@Autowired
|
||||
private StoryRepository storyRepository;
|
||||
|
||||
@Autowired
|
||||
private DatabaseManagementService databaseManagementService;
|
||||
|
||||
@Autowired
|
||||
private LibraryService libraryService;
|
||||
|
||||
private LocalDateTime lastBackupCheck = null;
|
||||
|
||||
/**
|
||||
* Scheduled job that runs daily at 4 AM.
|
||||
* Creates a backup if content has changed since last backup.
|
||||
*/
|
||||
@Scheduled(cron = "0 0 4 * * ?")
|
||||
public void performAutomaticBackup() {
|
||||
logger.info("========================================");
|
||||
logger.info("Starting automatic backup check at 4 AM");
|
||||
logger.info("========================================");
|
||||
|
||||
try {
|
||||
// Get current library ID (or default)
|
||||
String libraryId = libraryService.getCurrentLibraryId();
|
||||
if (libraryId == null) {
|
||||
libraryId = "default";
|
||||
}
|
||||
|
||||
logger.info("Checking for content changes in library: {}", libraryId);
|
||||
|
||||
// Check if content has changed since last backup
|
||||
if (!hasContentChanged()) {
|
||||
logger.info("No content changes detected since last backup. Skipping backup.");
|
||||
logger.info("========================================");
|
||||
return;
|
||||
}
|
||||
|
||||
logger.info("Content changes detected! Creating automatic backup...");
|
||||
|
||||
// Create backup directory for this library
|
||||
Path backupPath = Paths.get(automaticBackupDir, libraryId);
|
||||
Files.createDirectories(backupPath);
|
||||
|
||||
// Create the backup
|
||||
String timestamp = LocalDateTime.now().format(FILENAME_FORMATTER);
|
||||
String filename = String.format("auto_backup_%s.zip", timestamp);
|
||||
Path backupFile = backupPath.resolve(filename);
|
||||
|
||||
logger.info("Creating complete backup to: {}", backupFile);
|
||||
|
||||
Resource backup = databaseManagementService.createCompleteBackup();
|
||||
|
||||
// Write backup to file
|
||||
try (var inputStream = backup.getInputStream();
|
||||
var outputStream = Files.newOutputStream(backupFile)) {
|
||||
inputStream.transferTo(outputStream);
|
||||
}
|
||||
|
||||
long fileSize = Files.size(backupFile);
|
||||
logger.info("✅ Automatic backup created successfully");
|
||||
logger.info(" File: {}", backupFile.getFileName());
|
||||
logger.info(" Size: {} MB", fileSize / 1024 / 1024);
|
||||
|
||||
// Rotate old backups (keep only MAX_BACKUPS)
|
||||
rotateBackups(backupPath);
|
||||
|
||||
// Update last backup check time
|
||||
lastBackupCheck = LocalDateTime.now();
|
||||
|
||||
logger.info("========================================");
|
||||
logger.info("Automatic backup completed successfully");
|
||||
logger.info("========================================");
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("❌ Automatic backup failed", e);
|
||||
logger.info("========================================");
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if content has changed since last backup.
|
||||
* Looks for stories created or updated after the last backup time.
|
||||
*/
|
||||
private boolean hasContentChanged() {
|
||||
try {
|
||||
if (lastBackupCheck == null) {
|
||||
// First run - check if there are any stories at all
|
||||
long storyCount = storyRepository.count();
|
||||
logger.info("First backup check - found {} stories", storyCount);
|
||||
return storyCount > 0;
|
||||
}
|
||||
|
||||
// Check for stories created or updated since last backup
|
||||
long changedCount = storyRepository.countStoriesModifiedAfter(lastBackupCheck);
|
||||
logger.info("Found {} stories modified since last backup ({})", changedCount, lastBackupCheck);
|
||||
return changedCount > 0;
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Error checking for content changes", e);
|
||||
// On error, create backup to be safe
|
||||
return true;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Rotate backups - keep only MAX_BACKUPS most recent backups.
|
||||
* Deletes older backups.
|
||||
*/
|
||||
private void rotateBackups(Path backupPath) throws IOException {
|
||||
logger.info("Checking for old backups to rotate...");
|
||||
|
||||
// Find all backup files in the directory
|
||||
List<Path> backupFiles;
|
||||
try (Stream<Path> stream = Files.list(backupPath)) {
|
||||
backupFiles = stream
|
||||
.filter(Files::isRegularFile)
|
||||
.filter(p -> p.getFileName().toString().startsWith("auto_backup_"))
|
||||
.filter(p -> p.getFileName().toString().endsWith(".zip"))
|
||||
.sorted(Comparator.comparing((Path p) -> {
|
||||
try {
|
||||
return Files.getLastModifiedTime(p);
|
||||
} catch (IOException e) {
|
||||
return null;
|
||||
}
|
||||
}).reversed()) // Most recent first
|
||||
.collect(Collectors.toList());
|
||||
}
|
||||
|
||||
logger.info("Found {} automatic backups", backupFiles.size());
|
||||
|
||||
// Delete old backups if we exceed MAX_BACKUPS
|
||||
if (backupFiles.size() > MAX_BACKUPS) {
|
||||
List<Path> toDelete = backupFiles.subList(MAX_BACKUPS, backupFiles.size());
|
||||
logger.info("Deleting {} old backups to maintain maximum of {}", toDelete.size(), MAX_BACKUPS);
|
||||
|
||||
for (Path oldBackup : toDelete) {
|
||||
try {
|
||||
Files.delete(oldBackup);
|
||||
logger.info(" Deleted old backup: {}", oldBackup.getFileName());
|
||||
} catch (IOException e) {
|
||||
logger.warn("Failed to delete old backup: {}", oldBackup, e);
|
||||
}
|
||||
}
|
||||
} else {
|
||||
logger.info("Backup count within limit ({}), no rotation needed", MAX_BACKUPS);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Manual trigger for testing - creates backup immediately if content changed.
|
||||
*/
|
||||
public void triggerManualBackup() {
|
||||
logger.info("Manual automatic backup triggered");
|
||||
performAutomaticBackup();
|
||||
}
|
||||
|
||||
/**
|
||||
* Get list of automatic backups for the current library.
|
||||
*/
|
||||
public List<BackupInfo> listAutomaticBackups() throws IOException {
|
||||
String libraryId = libraryService.getCurrentLibraryId();
|
||||
if (libraryId == null) {
|
||||
libraryId = "default";
|
||||
}
|
||||
|
||||
Path backupPath = Paths.get(automaticBackupDir, libraryId);
|
||||
if (!Files.exists(backupPath)) {
|
||||
return List.of();
|
||||
}
|
||||
|
||||
try (Stream<Path> stream = Files.list(backupPath)) {
|
||||
return stream
|
||||
.filter(Files::isRegularFile)
|
||||
.filter(p -> p.getFileName().toString().startsWith("auto_backup_"))
|
||||
.filter(p -> p.getFileName().toString().endsWith(".zip"))
|
||||
.sorted(Comparator.comparing((Path p) -> {
|
||||
try {
|
||||
return Files.getLastModifiedTime(p);
|
||||
} catch (IOException e) {
|
||||
return null;
|
||||
}
|
||||
}).reversed())
|
||||
.map(p -> {
|
||||
try {
|
||||
return new BackupInfo(
|
||||
p.getFileName().toString(),
|
||||
Files.size(p),
|
||||
Files.getLastModifiedTime(p).toInstant().toString()
|
||||
);
|
||||
} catch (IOException e) {
|
||||
return null;
|
||||
}
|
||||
})
|
||||
.filter(info -> info != null)
|
||||
.collect(Collectors.toList());
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Simple backup info class.
|
||||
*/
|
||||
public static class BackupInfo {
|
||||
private final String filename;
|
||||
private final long sizeBytes;
|
||||
private final String createdAt;
|
||||
|
||||
public BackupInfo(String filename, long sizeBytes, String createdAt) {
|
||||
this.filename = filename;
|
||||
this.sizeBytes = sizeBytes;
|
||||
this.createdAt = createdAt;
|
||||
}
|
||||
|
||||
public String getFilename() {
|
||||
return filename;
|
||||
}
|
||||
|
||||
public long getSizeBytes() {
|
||||
return sizeBytes;
|
||||
}
|
||||
|
||||
public String getCreatedAt() {
|
||||
return createdAt;
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,5 +1,6 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.dto.CollectionDto;
|
||||
import com.storycove.dto.SearchResultDto;
|
||||
import com.storycove.dto.StoryReadingDto;
|
||||
import com.storycove.dto.TagDto;
|
||||
@@ -31,7 +32,7 @@ public class CollectionService {
|
||||
private final CollectionStoryRepository collectionStoryRepository;
|
||||
private final StoryRepository storyRepository;
|
||||
private final TagRepository tagRepository;
|
||||
private final TypesenseService typesenseService;
|
||||
private final SearchServiceAdapter searchServiceAdapter;
|
||||
private final ReadingTimeService readingTimeService;
|
||||
|
||||
@Autowired
|
||||
@@ -39,28 +40,42 @@ public class CollectionService {
|
||||
CollectionStoryRepository collectionStoryRepository,
|
||||
StoryRepository storyRepository,
|
||||
TagRepository tagRepository,
|
||||
@Autowired(required = false) TypesenseService typesenseService,
|
||||
SearchServiceAdapter searchServiceAdapter,
|
||||
ReadingTimeService readingTimeService) {
|
||||
this.collectionRepository = collectionRepository;
|
||||
this.collectionStoryRepository = collectionStoryRepository;
|
||||
this.storyRepository = storyRepository;
|
||||
this.tagRepository = tagRepository;
|
||||
this.typesenseService = typesenseService;
|
||||
this.searchServiceAdapter = searchServiceAdapter;
|
||||
this.readingTimeService = readingTimeService;
|
||||
}
|
||||
|
||||
/**
|
||||
* Search collections using Typesense (MANDATORY for all search/filter operations)
|
||||
* Search collections using Solr (MANDATORY for all search/filter operations)
|
||||
* This method MUST be used instead of JPA queries for listing collections
|
||||
*/
|
||||
public SearchResultDto<Collection> searchCollections(String query, List<String> tags, boolean includeArchived, int page, int limit) {
|
||||
if (typesenseService == null) {
|
||||
logger.warn("Typesense service not available, returning empty results");
|
||||
try {
|
||||
// Use SearchServiceAdapter to search collections
|
||||
SearchResultDto<CollectionDto> searchResult = searchServiceAdapter.searchCollections(query, tags, includeArchived, page, limit);
|
||||
|
||||
// Convert CollectionDto back to Collection entities by fetching from database
|
||||
List<Collection> collections = new ArrayList<>();
|
||||
for (CollectionDto dto : searchResult.getResults()) {
|
||||
try {
|
||||
Collection collection = findByIdBasic(dto.getId());
|
||||
collections.add(collection);
|
||||
} catch (ResourceNotFoundException e) {
|
||||
logger.warn("Collection {} found in search index but not in database", dto.getId());
|
||||
}
|
||||
}
|
||||
|
||||
return new SearchResultDto<>(collections, (int) searchResult.getTotalHits(), page, limit,
|
||||
query != null ? query : "", searchResult.getSearchTimeMs());
|
||||
} catch (Exception e) {
|
||||
logger.error("Collection search failed, falling back to empty results", e);
|
||||
return new SearchResultDto<>(new ArrayList<>(), 0, page, limit, query != null ? query : "", 0);
|
||||
}
|
||||
|
||||
// Delegate to TypesenseService for all search operations
|
||||
return typesenseService.searchCollections(query, tags, includeArchived, page, limit);
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -107,10 +122,7 @@ public class CollectionService {
|
||||
savedCollection = findById(savedCollection.getId());
|
||||
}
|
||||
|
||||
// Index in Typesense
|
||||
if (typesenseService != null) {
|
||||
typesenseService.indexCollection(savedCollection);
|
||||
}
|
||||
// Collections are not indexed in search engine yet
|
||||
|
||||
logger.info("Created collection: {} with {} stories", name, initialStoryIds != null ? initialStoryIds.size() : 0);
|
||||
return savedCollection;
|
||||
@@ -140,10 +152,7 @@ public class CollectionService {
|
||||
|
||||
Collection savedCollection = collectionRepository.save(collection);
|
||||
|
||||
// Update in Typesense
|
||||
if (typesenseService != null) {
|
||||
typesenseService.indexCollection(savedCollection);
|
||||
}
|
||||
// Collections are not indexed in search engine yet
|
||||
|
||||
logger.info("Updated collection: {}", id);
|
||||
return savedCollection;
|
||||
@@ -155,10 +164,7 @@ public class CollectionService {
|
||||
public void deleteCollection(UUID id) {
|
||||
Collection collection = findByIdBasic(id);
|
||||
|
||||
// Remove from Typesense first
|
||||
if (typesenseService != null) {
|
||||
typesenseService.removeCollection(id);
|
||||
}
|
||||
// Collections are not indexed in search engine yet
|
||||
|
||||
collectionRepository.delete(collection);
|
||||
logger.info("Deleted collection: {}", id);
|
||||
@@ -173,10 +179,7 @@ public class CollectionService {
|
||||
|
||||
Collection savedCollection = collectionRepository.save(collection);
|
||||
|
||||
// Update in Typesense
|
||||
if (typesenseService != null) {
|
||||
typesenseService.indexCollection(savedCollection);
|
||||
}
|
||||
// Collections are not indexed in search engine yet
|
||||
|
||||
logger.info("{} collection: {}", archived ? "Archived" : "Unarchived", id);
|
||||
return savedCollection;
|
||||
@@ -221,10 +224,7 @@ public class CollectionService {
|
||||
}
|
||||
|
||||
// Update collection in Typesense
|
||||
if (typesenseService != null) {
|
||||
Collection updatedCollection = findById(collectionId);
|
||||
typesenseService.indexCollection(updatedCollection);
|
||||
}
|
||||
// Collections are not indexed in search engine yet
|
||||
|
||||
long totalStories = collectionStoryRepository.countByCollectionId(collectionId);
|
||||
|
||||
@@ -249,10 +249,7 @@ public class CollectionService {
|
||||
collectionStoryRepository.delete(collectionStory);
|
||||
|
||||
// Update collection in Typesense
|
||||
if (typesenseService != null) {
|
||||
Collection updatedCollection = findById(collectionId);
|
||||
typesenseService.indexCollection(updatedCollection);
|
||||
}
|
||||
// Collections are not indexed in search engine yet
|
||||
|
||||
logger.info("Removed story {} from collection {}", storyId, collectionId);
|
||||
}
|
||||
@@ -285,10 +282,7 @@ public class CollectionService {
|
||||
}
|
||||
|
||||
// Update collection in Typesense
|
||||
if (typesenseService != null) {
|
||||
Collection updatedCollection = findById(collectionId);
|
||||
typesenseService.indexCollection(updatedCollection);
|
||||
}
|
||||
// Collections are not indexed in search engine yet
|
||||
|
||||
logger.info("Reordered {} stories in collection {}", storyOrders.size(), collectionId);
|
||||
}
|
||||
@@ -423,7 +417,7 @@ public class CollectionService {
|
||||
}
|
||||
|
||||
/**
|
||||
* Get all collections for indexing (used by TypesenseService)
|
||||
* Get all collections for indexing (used by SearchServiceAdapter)
|
||||
*/
|
||||
public List<Collection> findAllForIndexing() {
|
||||
return collectionRepository.findAllActiveCollections();
|
||||
|
||||
@@ -7,7 +7,6 @@ import org.springframework.beans.factory.annotation.Qualifier;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.context.ApplicationContext;
|
||||
import org.springframework.context.ApplicationContextAware;
|
||||
import org.springframework.core.io.ByteArrayResource;
|
||||
import org.springframework.core.io.Resource;
|
||||
import org.springframework.stereotype.Service;
|
||||
import org.springframework.transaction.annotation.Transactional;
|
||||
@@ -52,7 +51,7 @@ public class DatabaseManagementService implements ApplicationContextAware {
|
||||
private CollectionRepository collectionRepository;
|
||||
|
||||
@Autowired
|
||||
private TypesenseService typesenseService;
|
||||
private SearchServiceAdapter searchServiceAdapter;
|
||||
|
||||
@Autowired
|
||||
private LibraryService libraryService;
|
||||
@@ -70,11 +69,83 @@ public class DatabaseManagementService implements ApplicationContextAware {
|
||||
this.applicationContext = applicationContext;
|
||||
}
|
||||
|
||||
// Helper methods to extract database connection details
|
||||
private String extractDatabaseUrl() {
|
||||
try (Connection connection = getDataSource().getConnection()) {
|
||||
return connection.getMetaData().getURL();
|
||||
} catch (SQLException e) {
|
||||
throw new RuntimeException("Failed to extract database URL", e);
|
||||
}
|
||||
}
|
||||
|
||||
private String extractDatabaseHost() {
|
||||
String url = extractDatabaseUrl();
|
||||
// Extract host from jdbc:postgresql://host:port/database
|
||||
if (url.startsWith("jdbc:postgresql://")) {
|
||||
String hostPort = url.substring("jdbc:postgresql://".length());
|
||||
if (hostPort.contains("/")) {
|
||||
hostPort = hostPort.substring(0, hostPort.indexOf("/"));
|
||||
}
|
||||
if (hostPort.contains(":")) {
|
||||
return hostPort.substring(0, hostPort.indexOf(":"));
|
||||
}
|
||||
return hostPort;
|
||||
}
|
||||
return "localhost"; // fallback
|
||||
}
|
||||
|
||||
private String extractDatabasePort() {
|
||||
String url = extractDatabaseUrl();
|
||||
// Extract port from jdbc:postgresql://host:port/database
|
||||
if (url.startsWith("jdbc:postgresql://")) {
|
||||
String hostPort = url.substring("jdbc:postgresql://".length());
|
||||
if (hostPort.contains("/")) {
|
||||
hostPort = hostPort.substring(0, hostPort.indexOf("/"));
|
||||
}
|
||||
if (hostPort.contains(":")) {
|
||||
return hostPort.substring(hostPort.indexOf(":") + 1);
|
||||
}
|
||||
}
|
||||
return "5432"; // default PostgreSQL port
|
||||
}
|
||||
|
||||
private String extractDatabaseName() {
|
||||
String url = extractDatabaseUrl();
|
||||
// Extract database name from jdbc:postgresql://host:port/database
|
||||
if (url.startsWith("jdbc:postgresql://")) {
|
||||
String remaining = url.substring("jdbc:postgresql://".length());
|
||||
if (remaining.contains("/")) {
|
||||
String dbPart = remaining.substring(remaining.indexOf("/") + 1);
|
||||
// Remove any query parameters
|
||||
if (dbPart.contains("?")) {
|
||||
dbPart = dbPart.substring(0, dbPart.indexOf("?"));
|
||||
}
|
||||
return dbPart;
|
||||
}
|
||||
}
|
||||
return "storycove"; // fallback
|
||||
}
|
||||
|
||||
private String extractDatabaseUsername() {
|
||||
// Get from environment variable or default
|
||||
return System.getenv("SPRING_DATASOURCE_USERNAME") != null ?
|
||||
System.getenv("SPRING_DATASOURCE_USERNAME") : "storycove";
|
||||
}
|
||||
|
||||
private String extractDatabasePassword() {
|
||||
// Get from environment variable or default
|
||||
return System.getenv("SPRING_DATASOURCE_PASSWORD") != null ?
|
||||
System.getenv("SPRING_DATASOURCE_PASSWORD") : "password";
|
||||
}
|
||||
|
||||
/**
|
||||
* Create a comprehensive backup including database and files in ZIP format
|
||||
* Returns a streaming resource to avoid loading large backups into memory
|
||||
*/
|
||||
public Resource createCompleteBackup() throws SQLException, IOException {
|
||||
// Create temp file with deleteOnExit as safety net
|
||||
Path tempZip = Files.createTempFile("storycove-backup", ".zip");
|
||||
tempZip.toFile().deleteOnExit();
|
||||
|
||||
try (ZipOutputStream zipOut = new ZipOutputStream(Files.newOutputStream(tempZip))) {
|
||||
// 1. Add database dump
|
||||
@@ -87,16 +158,36 @@ public class DatabaseManagementService implements ApplicationContextAware {
|
||||
addMetadataToZip(zipOut);
|
||||
}
|
||||
|
||||
// Return the ZIP file as a resource
|
||||
byte[] zipData = Files.readAllBytes(tempZip);
|
||||
Files.deleteIfExists(tempZip);
|
||||
|
||||
return new ByteArrayResource(zipData);
|
||||
// Return the ZIP file as a FileSystemResource for streaming
|
||||
// This avoids loading the entire file into memory
|
||||
return new org.springframework.core.io.FileSystemResource(tempZip.toFile()) {
|
||||
@Override
|
||||
public InputStream getInputStream() throws IOException {
|
||||
// Wrap the input stream to delete the temp file after it's fully read
|
||||
return new java.io.FilterInputStream(super.getInputStream()) {
|
||||
@Override
|
||||
public void close() throws IOException {
|
||||
try {
|
||||
super.close();
|
||||
} finally {
|
||||
// Clean up temp file after streaming is complete
|
||||
try {
|
||||
Files.deleteIfExists(tempZip);
|
||||
} catch (IOException e) {
|
||||
// Log but don't fail - deleteOnExit will handle it
|
||||
System.err.println("Warning: Could not delete temp backup file: " + e.getMessage());
|
||||
}
|
||||
}
|
||||
}
|
||||
};
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Restore from complete backup (ZIP format)
|
||||
*/
|
||||
@Transactional(timeout = 1800) // 30 minutes timeout for large backup restores
|
||||
public void restoreFromCompleteBackup(InputStream backupStream) throws IOException, SQLException {
|
||||
String currentLibraryId = libraryService.getCurrentLibraryId();
|
||||
System.err.println("Starting complete backup restore for library: " + currentLibraryId);
|
||||
@@ -145,15 +236,15 @@ public class DatabaseManagementService implements ApplicationContextAware {
|
||||
System.err.println("No files directory found in backup - skipping file restore.");
|
||||
}
|
||||
|
||||
// 6. Trigger complete Typesense reindex after data restoration
|
||||
// 6. Trigger complete search index reindex after data restoration
|
||||
try {
|
||||
System.err.println("Starting Typesense reindex after restore...");
|
||||
TypesenseService typesenseService = applicationContext.getBean(TypesenseService.class);
|
||||
typesenseService.performCompleteReindex();
|
||||
System.err.println("Typesense reindex completed successfully.");
|
||||
System.err.println("Starting search index reindex after restore...");
|
||||
SearchServiceAdapter searchServiceAdapter = applicationContext.getBean(SearchServiceAdapter.class);
|
||||
searchServiceAdapter.performCompleteReindex();
|
||||
System.err.println("Search index reindex completed successfully.");
|
||||
} catch (Exception e) {
|
||||
System.err.println("Warning: Failed to reindex Typesense after restore: " + e.getMessage());
|
||||
// Don't fail the entire restore for Typesense issues
|
||||
System.err.println("Warning: Failed to reindex search after restore: " + e.getMessage());
|
||||
// Don't fail the entire restore for search issues
|
||||
}
|
||||
|
||||
System.err.println("Complete backup restore finished successfully.");
|
||||
@@ -171,157 +262,191 @@ public class DatabaseManagementService implements ApplicationContextAware {
|
||||
}
|
||||
|
||||
public Resource createBackup() throws SQLException, IOException {
|
||||
StringBuilder sqlDump = new StringBuilder();
|
||||
// Use PostgreSQL's native pg_dump for reliable backup
|
||||
String dbHost = extractDatabaseHost();
|
||||
String dbPort = extractDatabasePort();
|
||||
String dbName = extractDatabaseName();
|
||||
String dbUser = extractDatabaseUsername();
|
||||
String dbPassword = extractDatabasePassword();
|
||||
|
||||
try (Connection connection = getDataSource().getConnection()) {
|
||||
// Add header
|
||||
sqlDump.append("-- StoryCove Database Backup\n");
|
||||
sqlDump.append("-- Generated at: ").append(new java.util.Date()).append("\n\n");
|
||||
// Create temporary file for backup
|
||||
Path tempBackupFile = Files.createTempFile("storycove_backup_", ".sql");
|
||||
|
||||
// Disable foreign key checks during restore (PostgreSQL syntax)
|
||||
sqlDump.append("SET session_replication_role = replica;\n\n");
|
||||
|
||||
// List of tables in dependency order (parents first for insertion)
|
||||
List<String> insertTables = Arrays.asList(
|
||||
"authors", "series", "tags", "collections",
|
||||
"stories", "story_tags", "author_urls", "collection_stories"
|
||||
try {
|
||||
// Build pg_dump command
|
||||
ProcessBuilder pb = new ProcessBuilder(
|
||||
"pg_dump",
|
||||
"--host=" + dbHost,
|
||||
"--port=" + dbPort,
|
||||
"--username=" + dbUser,
|
||||
"--dbname=" + dbName,
|
||||
"--no-password",
|
||||
"--verbose",
|
||||
"--clean",
|
||||
"--if-exists",
|
||||
"--create",
|
||||
"--file=" + tempBackupFile.toString()
|
||||
);
|
||||
|
||||
// TRUNCATE in reverse order (children first)
|
||||
List<String> truncateTables = Arrays.asList(
|
||||
"collection_stories", "author_urls", "story_tags",
|
||||
"stories", "collections", "tags", "series", "authors"
|
||||
);
|
||||
// Set PGPASSWORD environment variable
|
||||
Map<String, String> env = pb.environment();
|
||||
env.put("PGPASSWORD", dbPassword);
|
||||
|
||||
// Generate TRUNCATE statements for each table (assuming tables already exist)
|
||||
for (String tableName : truncateTables) {
|
||||
sqlDump.append("-- Truncate Table: ").append(tableName).append("\n");
|
||||
sqlDump.append("TRUNCATE TABLE \"").append(tableName).append("\" CASCADE;\n");
|
||||
}
|
||||
sqlDump.append("\n");
|
||||
System.err.println("Starting PostgreSQL backup using pg_dump...");
|
||||
Process process = pb.start();
|
||||
|
||||
// Generate INSERT statements in dependency order
|
||||
for (String tableName : insertTables) {
|
||||
sqlDump.append("-- Data for Table: ").append(tableName).append("\n");
|
||||
|
||||
// Get table data
|
||||
try (PreparedStatement stmt = connection.prepareStatement("SELECT * FROM \"" + tableName + "\"");
|
||||
ResultSet rs = stmt.executeQuery()) {
|
||||
|
||||
ResultSetMetaData metaData = rs.getMetaData();
|
||||
int columnCount = metaData.getColumnCount();
|
||||
|
||||
// Build column names for INSERT statement
|
||||
StringBuilder columnNames = new StringBuilder();
|
||||
for (int i = 1; i <= columnCount; i++) {
|
||||
if (i > 1) columnNames.append(", ");
|
||||
columnNames.append("\"").append(metaData.getColumnName(i)).append("\"");
|
||||
}
|
||||
|
||||
while (rs.next()) {
|
||||
sqlDump.append("INSERT INTO \"").append(tableName).append("\" (")
|
||||
.append(columnNames).append(") VALUES (");
|
||||
|
||||
for (int i = 1; i <= columnCount; i++) {
|
||||
if (i > 1) sqlDump.append(", ");
|
||||
|
||||
Object value = rs.getObject(i);
|
||||
sqlDump.append(formatSqlValue(value));
|
||||
}
|
||||
|
||||
sqlDump.append(");\n");
|
||||
}
|
||||
// Capture output
|
||||
try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getErrorStream()))) {
|
||||
String line;
|
||||
while ((line = reader.readLine()) != null) {
|
||||
System.err.println("pg_dump: " + line);
|
||||
}
|
||||
|
||||
sqlDump.append("\n");
|
||||
}
|
||||
|
||||
// Re-enable foreign key checks (PostgreSQL syntax)
|
||||
sqlDump.append("SET session_replication_role = DEFAULT;\n");
|
||||
}
|
||||
int exitCode = process.waitFor();
|
||||
if (exitCode != 0) {
|
||||
throw new RuntimeException("pg_dump failed with exit code: " + exitCode);
|
||||
}
|
||||
|
||||
byte[] backupData = sqlDump.toString().getBytes(StandardCharsets.UTF_8);
|
||||
return new ByteArrayResource(backupData);
|
||||
System.err.println("PostgreSQL backup completed successfully");
|
||||
|
||||
// Return the backup file as a streaming resource to avoid memory issues with large databases
|
||||
tempBackupFile.toFile().deleteOnExit();
|
||||
return new org.springframework.core.io.FileSystemResource(tempBackupFile.toFile()) {
|
||||
@Override
|
||||
public InputStream getInputStream() throws IOException {
|
||||
// Wrap the input stream to delete the temp file after it's fully read
|
||||
return new java.io.FilterInputStream(super.getInputStream()) {
|
||||
@Override
|
||||
public void close() throws IOException {
|
||||
try {
|
||||
super.close();
|
||||
} finally {
|
||||
// Clean up temp file after streaming is complete
|
||||
try {
|
||||
Files.deleteIfExists(tempBackupFile);
|
||||
} catch (IOException e) {
|
||||
// Log but don't fail - deleteOnExit will handle it
|
||||
System.err.println("Warning: Could not delete temp backup file: " + e.getMessage());
|
||||
}
|
||||
}
|
||||
}
|
||||
};
|
||||
}
|
||||
};
|
||||
|
||||
} catch (InterruptedException e) {
|
||||
Thread.currentThread().interrupt();
|
||||
throw new RuntimeException("Backup process was interrupted", e);
|
||||
}
|
||||
}
|
||||
|
||||
@Transactional
|
||||
@Transactional(timeout = 1800) // 30 minutes timeout for large backup restores
|
||||
public void restoreFromBackup(InputStream backupStream) throws IOException, SQLException {
|
||||
// Read the SQL file
|
||||
StringBuilder sqlContent = new StringBuilder();
|
||||
try (BufferedReader reader = new BufferedReader(new InputStreamReader(backupStream, StandardCharsets.UTF_8))) {
|
||||
String line;
|
||||
while ((line = reader.readLine()) != null) {
|
||||
// Skip comments and empty lines
|
||||
if (!line.trim().startsWith("--") && !line.trim().isEmpty()) {
|
||||
sqlContent.append(line).append("\n");
|
||||
// Use PostgreSQL's native psql for reliable restore
|
||||
String dbHost = extractDatabaseHost();
|
||||
String dbPort = extractDatabasePort();
|
||||
String dbName = extractDatabaseName();
|
||||
String dbUser = extractDatabaseUsername();
|
||||
String dbPassword = extractDatabasePassword();
|
||||
|
||||
// Create temporary file for the backup
|
||||
Path tempBackupFile = Files.createTempFile("storycove_restore_", ".sql");
|
||||
|
||||
try {
|
||||
// Write backup stream to temporary file
|
||||
System.err.println("Writing backup data to temporary file...");
|
||||
try (InputStream input = backupStream;
|
||||
OutputStream output = Files.newOutputStream(tempBackupFile)) {
|
||||
byte[] buffer = new byte[8192];
|
||||
int bytesRead;
|
||||
while ((bytesRead = input.read(buffer)) != -1) {
|
||||
output.write(buffer, 0, bytesRead);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Execute the SQL statements
|
||||
try (Connection connection = getDataSource().getConnection()) {
|
||||
connection.setAutoCommit(false);
|
||||
System.err.println("Starting PostgreSQL restore using psql...");
|
||||
|
||||
try {
|
||||
// Ensure database schema exists before restoring data
|
||||
ensureDatabaseSchemaExists(connection);
|
||||
// Build psql command to restore the backup
|
||||
ProcessBuilder pb = new ProcessBuilder(
|
||||
"psql",
|
||||
"--host=" + dbHost,
|
||||
"--port=" + dbPort,
|
||||
"--username=" + dbUser,
|
||||
"--dbname=" + dbName,
|
||||
"--no-password",
|
||||
"--echo-errors",
|
||||
"--file=" + tempBackupFile.toString()
|
||||
);
|
||||
|
||||
// Parse SQL statements properly (handle semicolons inside string literals)
|
||||
List<String> statements = parseStatements(sqlContent.toString());
|
||||
// Set PGPASSWORD environment variable
|
||||
Map<String, String> env = pb.environment();
|
||||
env.put("PGPASSWORD", dbPassword);
|
||||
|
||||
int successCount = 0;
|
||||
for (String statement : statements) {
|
||||
String trimmedStatement = statement.trim();
|
||||
if (!trimmedStatement.isEmpty()) {
|
||||
try (PreparedStatement stmt = connection.prepareStatement(trimmedStatement)) {
|
||||
stmt.executeUpdate();
|
||||
successCount++;
|
||||
} catch (SQLException e) {
|
||||
// Log detailed error information for failed statements
|
||||
System.err.println("ERROR: Failed to execute SQL statement #" + (successCount + 1));
|
||||
System.err.println("Error: " + e.getMessage());
|
||||
System.err.println("SQL State: " + e.getSQLState());
|
||||
System.err.println("Error Code: " + e.getErrorCode());
|
||||
Process process = pb.start();
|
||||
|
||||
// Show the problematic statement (first 500 chars)
|
||||
String statementPreview = trimmedStatement.length() > 500 ?
|
||||
trimmedStatement.substring(0, 500) + "..." : trimmedStatement;
|
||||
System.err.println("Statement: " + statementPreview);
|
||||
// Capture output
|
||||
try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getErrorStream()));
|
||||
BufferedReader outputReader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
|
||||
|
||||
throw e; // Re-throw to trigger rollback
|
||||
// Read stderr in a separate thread
|
||||
Thread errorThread = new Thread(() -> {
|
||||
try {
|
||||
String line;
|
||||
while ((line = reader.readLine()) != null) {
|
||||
System.err.println("psql stderr: " + line);
|
||||
}
|
||||
} catch (IOException e) {
|
||||
System.err.println("Error reading psql stderr: " + e.getMessage());
|
||||
}
|
||||
});
|
||||
errorThread.start();
|
||||
|
||||
// Read stdout
|
||||
String line;
|
||||
while ((line = outputReader.readLine()) != null) {
|
||||
System.err.println("psql stdout: " + line);
|
||||
}
|
||||
|
||||
connection.commit();
|
||||
System.err.println("Restore completed successfully. Executed " + successCount + " SQL statements.");
|
||||
errorThread.join();
|
||||
}
|
||||
|
||||
// Reindex search after successful restore
|
||||
try {
|
||||
String currentLibraryId = libraryService.getCurrentLibraryId();
|
||||
System.err.println("Starting Typesense reindex after successful restore for library: " + currentLibraryId);
|
||||
if (currentLibraryId == null) {
|
||||
System.err.println("ERROR: No current library set during restore - cannot reindex Typesense!");
|
||||
throw new IllegalStateException("No current library active during restore");
|
||||
}
|
||||
int exitCode = process.waitFor();
|
||||
if (exitCode != 0) {
|
||||
throw new RuntimeException("psql restore failed with exit code: " + exitCode);
|
||||
}
|
||||
|
||||
// Manually trigger reindexing using the correct database connection
|
||||
System.err.println("Triggering manual reindex from library-specific database for library: " + currentLibraryId);
|
||||
reindexStoriesAndAuthorsFromCurrentDatabase();
|
||||
System.err.println("PostgreSQL restore completed successfully");
|
||||
|
||||
// Note: Collections collection will be recreated when needed by the service
|
||||
System.err.println("Typesense reindex completed successfully for library: " + currentLibraryId);
|
||||
} catch (Exception e) {
|
||||
// Log the error but don't fail the restore
|
||||
System.err.println("Warning: Failed to reindex Typesense after restore: " + e.getMessage());
|
||||
e.printStackTrace();
|
||||
// Reindex search after successful restore
|
||||
try {
|
||||
String currentLibraryId = libraryService.getCurrentLibraryId();
|
||||
System.err.println("Starting search reindex after successful restore for library: " + currentLibraryId);
|
||||
if (currentLibraryId == null) {
|
||||
System.err.println("ERROR: No current library set during restore - cannot reindex search!");
|
||||
throw new IllegalStateException("No current library active during restore");
|
||||
}
|
||||
|
||||
} catch (SQLException e) {
|
||||
connection.rollback();
|
||||
throw e;
|
||||
} finally {
|
||||
connection.setAutoCommit(true);
|
||||
// Manually trigger reindexing using the correct database connection
|
||||
System.err.println("Triggering manual reindex from library-specific database for library: " + currentLibraryId);
|
||||
reindexStoriesAndAuthorsFromCurrentDatabase();
|
||||
|
||||
// Note: Collections collection will be recreated when needed by the service
|
||||
System.err.println("Search reindex completed successfully for library: " + currentLibraryId);
|
||||
} catch (Exception e) {
|
||||
// Log the error but don't fail the restore
|
||||
System.err.println("Warning: Failed to reindex search after restore: " + e.getMessage());
|
||||
e.printStackTrace();
|
||||
}
|
||||
} catch (InterruptedException e) {
|
||||
Thread.currentThread().interrupt();
|
||||
throw new RuntimeException("Restore process was interrupted", e);
|
||||
} finally {
|
||||
// Clean up temporary file
|
||||
try {
|
||||
Files.deleteIfExists(tempBackupFile);
|
||||
} catch (IOException e) {
|
||||
System.err.println("Warning: Could not delete temporary restore file: " + e.getMessage());
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -351,7 +476,7 @@ public class DatabaseManagementService implements ApplicationContextAware {
|
||||
totalDeleted = collectionCount + storyCount + authorCount + seriesCount + tagCount;
|
||||
|
||||
// Note: Search indexes will need to be manually recreated after clearing
|
||||
// Use the settings page to recreate Typesense collections after clearing the database
|
||||
// Use the settings page to recreate search indices after clearing the database
|
||||
|
||||
} catch (Exception e) {
|
||||
throw new RuntimeException("Failed to clear database: " + e.getMessage(), e);
|
||||
@@ -449,7 +574,7 @@ public class DatabaseManagementService implements ApplicationContextAware {
|
||||
/**
|
||||
* Clear all data AND files (for complete restore)
|
||||
*/
|
||||
@Transactional
|
||||
@Transactional(timeout = 600) // 10 minutes timeout for clearing large datasets
|
||||
public int clearAllDataAndFiles() {
|
||||
// First clear the database
|
||||
int totalDeleted = clearAllData();
|
||||
@@ -506,8 +631,7 @@ public class DatabaseManagementService implements ApplicationContextAware {
|
||||
}
|
||||
|
||||
// For clearing, we only want to recreate empty collections (no data to index)
|
||||
typesenseService.recreateStoriesCollection();
|
||||
typesenseService.recreateAuthorsCollection();
|
||||
searchServiceAdapter.recreateIndices();
|
||||
// Note: Collections collection will be recreated when needed by the service
|
||||
System.err.println("Search indexes cleared successfully for library: " + currentLibraryId);
|
||||
} catch (Exception e) {
|
||||
@@ -959,10 +1083,9 @@ public class DatabaseManagementService implements ApplicationContextAware {
|
||||
try (Connection connection = getDataSource().getConnection()) {
|
||||
// First, recreate empty collections
|
||||
try {
|
||||
typesenseService.recreateStoriesCollection();
|
||||
typesenseService.recreateAuthorsCollection();
|
||||
searchServiceAdapter.recreateIndices();
|
||||
} catch (Exception e) {
|
||||
throw new SQLException("Failed to recreate Typesense collections", e);
|
||||
throw new SQLException("Failed to recreate search indices", e);
|
||||
}
|
||||
|
||||
// Count and reindex stories with full author and series information
|
||||
@@ -984,7 +1107,7 @@ public class DatabaseManagementService implements ApplicationContextAware {
|
||||
while (rs.next()) {
|
||||
// Create a complete Story object for indexing
|
||||
var story = createStoryFromResultSet(rs);
|
||||
typesenseService.indexStory(story);
|
||||
searchServiceAdapter.indexStory(story);
|
||||
storyCount++;
|
||||
}
|
||||
}
|
||||
@@ -999,7 +1122,7 @@ public class DatabaseManagementService implements ApplicationContextAware {
|
||||
while (rs.next()) {
|
||||
// Create a minimal Author object for indexing
|
||||
var author = createAuthorFromResultSet(rs);
|
||||
typesenseService.indexAuthor(author);
|
||||
searchServiceAdapter.indexAuthor(author);
|
||||
authorCount++;
|
||||
}
|
||||
}
|
||||
|
||||
@@ -16,6 +16,8 @@ import nl.siegmann.epublib.epub.EpubReader;
|
||||
|
||||
import org.jsoup.Jsoup;
|
||||
import org.jsoup.nodes.Document;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.stereotype.Service;
|
||||
import org.springframework.transaction.annotation.Transactional;
|
||||
@@ -30,6 +32,7 @@ import java.util.Optional;
|
||||
@Service
|
||||
@Transactional
|
||||
public class EPUBImportService {
|
||||
private static final Logger log = LoggerFactory.getLogger(EPUBImportService.class);
|
||||
|
||||
private final StoryService storyService;
|
||||
private final AuthorService authorService;
|
||||
@@ -74,6 +77,34 @@ public class EPUBImportService {
|
||||
|
||||
Story savedStory = storyService.create(story);
|
||||
|
||||
// Process embedded images if content contains any
|
||||
String originalContent = story.getContentHtml();
|
||||
if (originalContent != null && originalContent.contains("<img")) {
|
||||
try {
|
||||
ImageService.ContentImageProcessingResult imageResult =
|
||||
imageService.processContentImages(originalContent, savedStory.getId());
|
||||
|
||||
// Update story content with processed images if changed
|
||||
if (!imageResult.getProcessedContent().equals(originalContent)) {
|
||||
savedStory.setContentHtml(imageResult.getProcessedContent());
|
||||
savedStory = storyService.update(savedStory.getId(), savedStory);
|
||||
|
||||
// Log the image processing results
|
||||
log.debug("EPUB Import - Image processing completed for story {}. Downloaded {} images.",
|
||||
savedStory.getId(), imageResult.getDownloadedImages().size());
|
||||
|
||||
if (imageResult.hasWarnings()) {
|
||||
log.debug("EPUB Import - Image processing warnings: {}",
|
||||
String.join(", ", imageResult.getWarnings()));
|
||||
}
|
||||
}
|
||||
} catch (Exception e) {
|
||||
// Log error but don't fail the import
|
||||
System.err.println("EPUB Import - Failed to process embedded images for story " +
|
||||
savedStory.getId() + ": " + e.getMessage());
|
||||
}
|
||||
}
|
||||
|
||||
EPUBImportResponse response = EPUBImportResponse.success(savedStory.getId(), savedStory.getTitle());
|
||||
response.setWordCount(savedStory.getWordCount());
|
||||
response.setTotalChapters(book.getSpine().size());
|
||||
@@ -254,7 +285,7 @@ public class EPUBImportService {
|
||||
if (language != null && !language.trim().isEmpty()) {
|
||||
// Store as metadata in story description if needed
|
||||
// For now, we'll just log it for potential future use
|
||||
System.out.println("EPUB Language: " + language);
|
||||
log.debug("EPUB Language: {}", language);
|
||||
}
|
||||
|
||||
// Extract publisher information
|
||||
@@ -262,14 +293,14 @@ public class EPUBImportService {
|
||||
if (publishers != null && !publishers.isEmpty()) {
|
||||
String publisher = publishers.get(0);
|
||||
// Could append to description or store separately in future
|
||||
System.out.println("EPUB Publisher: " + publisher);
|
||||
log.debug("EPUB Publisher: {}", publisher);
|
||||
}
|
||||
|
||||
// Extract publication date
|
||||
List<nl.siegmann.epublib.domain.Date> dates = metadata.getDates();
|
||||
if (dates != null && !dates.isEmpty()) {
|
||||
for (nl.siegmann.epublib.domain.Date date : dates) {
|
||||
System.out.println("EPUB Date (" + date.getEvent() + "): " + date.getValue());
|
||||
log.debug("EPUB Date ({}): {}", date.getEvent(), date.getValue());
|
||||
}
|
||||
}
|
||||
|
||||
@@ -277,7 +308,7 @@ public class EPUBImportService {
|
||||
List<nl.siegmann.epublib.domain.Identifier> identifiers = metadata.getIdentifiers();
|
||||
if (identifiers != null && !identifiers.isEmpty()) {
|
||||
for (nl.siegmann.epublib.domain.Identifier identifier : identifiers) {
|
||||
System.out.println("EPUB Identifier (" + identifier.getScheme() + "): " + identifier.getValue());
|
||||
log.debug("EPUB Identifier ({}): {}", identifier.getScheme(), identifier.getValue());
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -137,12 +137,63 @@ public class HtmlSanitizationService {
|
||||
return config;
|
||||
}
|
||||
|
||||
/**
|
||||
* Preprocess HTML to extract images from figure tags before sanitization
|
||||
*/
|
||||
private String preprocessFigureTags(String html) {
|
||||
if (html == null || html.trim().isEmpty()) {
|
||||
return html;
|
||||
}
|
||||
|
||||
try {
|
||||
org.jsoup.nodes.Document doc = Jsoup.parse(html);
|
||||
org.jsoup.select.Elements figures = doc.select("figure");
|
||||
|
||||
for (org.jsoup.nodes.Element figure : figures) {
|
||||
// Find img tags within the figure
|
||||
org.jsoup.select.Elements images = figure.select("img");
|
||||
|
||||
if (!images.isEmpty()) {
|
||||
// Extract the first image and replace the figure with it
|
||||
org.jsoup.nodes.Element img = images.first();
|
||||
|
||||
// Check if there's a figcaption to preserve as alt text
|
||||
org.jsoup.select.Elements figcaptions = figure.select("figcaption");
|
||||
if (!figcaptions.isEmpty() && !img.hasAttr("alt")) {
|
||||
String captionText = figcaptions.first().text();
|
||||
if (captionText != null && !captionText.trim().isEmpty()) {
|
||||
img.attr("alt", captionText);
|
||||
}
|
||||
}
|
||||
|
||||
// Replace the figure element with just the img
|
||||
figure.replaceWith(img.clone());
|
||||
logger.debug("Extracted image from figure tag: {}", img.attr("src"));
|
||||
} else {
|
||||
// No images in figure, remove it entirely
|
||||
figure.remove();
|
||||
logger.debug("Removed figure tag without images");
|
||||
}
|
||||
}
|
||||
|
||||
return doc.body().html();
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to preprocess figure tags, returning original HTML: {}", e.getMessage());
|
||||
return html;
|
||||
}
|
||||
}
|
||||
|
||||
public String sanitize(String html) {
|
||||
if (html == null || html.trim().isEmpty()) {
|
||||
return "";
|
||||
}
|
||||
|
||||
logger.info("Content before sanitization: "+html);
|
||||
String saniztedHtml = Jsoup.clean(html, allowlist.preserveRelativeLinks(true));
|
||||
|
||||
// Preprocess to extract images from figure tags
|
||||
String preprocessed = preprocessFigureTags(html);
|
||||
|
||||
String saniztedHtml = Jsoup.clean(preprocessed, allowlist.preserveRelativeLinks(true));
|
||||
logger.info("Content after sanitization: "+saniztedHtml);
|
||||
return saniztedHtml;
|
||||
}
|
||||
|
||||
@@ -0,0 +1,108 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.stereotype.Service;
|
||||
|
||||
import java.util.Map;
|
||||
import java.util.UUID;
|
||||
import java.util.concurrent.ConcurrentHashMap;
|
||||
|
||||
@Service
|
||||
public class ImageProcessingProgressService {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(ImageProcessingProgressService.class);
|
||||
|
||||
private final Map<UUID, ImageProcessingProgress> progressMap = new ConcurrentHashMap<>();
|
||||
|
||||
public static class ImageProcessingProgress {
|
||||
private final UUID storyId;
|
||||
private final int totalImages;
|
||||
private volatile int processedImages;
|
||||
private volatile String currentImageUrl;
|
||||
private volatile String status;
|
||||
private volatile boolean completed;
|
||||
private volatile String errorMessage;
|
||||
|
||||
public ImageProcessingProgress(UUID storyId, int totalImages) {
|
||||
this.storyId = storyId;
|
||||
this.totalImages = totalImages;
|
||||
this.processedImages = 0;
|
||||
this.status = "Starting";
|
||||
this.completed = false;
|
||||
}
|
||||
|
||||
// Getters
|
||||
public UUID getStoryId() { return storyId; }
|
||||
public int getTotalImages() { return totalImages; }
|
||||
public int getProcessedImages() { return processedImages; }
|
||||
public String getCurrentImageUrl() { return currentImageUrl; }
|
||||
public String getStatus() { return status; }
|
||||
public boolean isCompleted() { return completed; }
|
||||
public String getErrorMessage() { return errorMessage; }
|
||||
public double getProgressPercentage() {
|
||||
return totalImages > 0 ? (double) processedImages / totalImages * 100 : 100;
|
||||
}
|
||||
|
||||
// Setters
|
||||
public void setProcessedImages(int processedImages) { this.processedImages = processedImages; }
|
||||
public void setCurrentImageUrl(String currentImageUrl) { this.currentImageUrl = currentImageUrl; }
|
||||
public void setStatus(String status) { this.status = status; }
|
||||
public void setCompleted(boolean completed) { this.completed = completed; }
|
||||
public void setErrorMessage(String errorMessage) { this.errorMessage = errorMessage; }
|
||||
|
||||
public void incrementProcessed() {
|
||||
this.processedImages++;
|
||||
}
|
||||
}
|
||||
|
||||
public ImageProcessingProgress startProgress(UUID storyId, int totalImages) {
|
||||
ImageProcessingProgress progress = new ImageProcessingProgress(storyId, totalImages);
|
||||
progressMap.put(storyId, progress);
|
||||
logger.info("Started image processing progress tracking for story {} with {} images", storyId, totalImages);
|
||||
return progress;
|
||||
}
|
||||
|
||||
public ImageProcessingProgress getProgress(UUID storyId) {
|
||||
return progressMap.get(storyId);
|
||||
}
|
||||
|
||||
public void updateProgress(UUID storyId, int processedImages, String currentImageUrl, String status) {
|
||||
ImageProcessingProgress progress = progressMap.get(storyId);
|
||||
if (progress != null) {
|
||||
progress.setProcessedImages(processedImages);
|
||||
progress.setCurrentImageUrl(currentImageUrl);
|
||||
progress.setStatus(status);
|
||||
logger.debug("Updated progress for story {}: {}/{} - {}", storyId, processedImages, progress.getTotalImages(), status);
|
||||
}
|
||||
}
|
||||
|
||||
public void completeProgress(UUID storyId, String finalStatus) {
|
||||
ImageProcessingProgress progress = progressMap.get(storyId);
|
||||
if (progress != null) {
|
||||
progress.setCompleted(true);
|
||||
progress.setStatus(finalStatus);
|
||||
logger.info("Completed image processing for story {}: {}", storyId, finalStatus);
|
||||
}
|
||||
}
|
||||
|
||||
public void setError(UUID storyId, String errorMessage) {
|
||||
ImageProcessingProgress progress = progressMap.get(storyId);
|
||||
if (progress != null) {
|
||||
progress.setErrorMessage(errorMessage);
|
||||
progress.setStatus("Error: " + errorMessage);
|
||||
progress.setCompleted(true);
|
||||
logger.error("Image processing error for story {}: {}", storyId, errorMessage);
|
||||
}
|
||||
}
|
||||
|
||||
public void removeProgress(UUID storyId) {
|
||||
progressMap.remove(storyId);
|
||||
logger.debug("Removed progress tracking for story {}", storyId);
|
||||
}
|
||||
|
||||
public boolean isProcessing(UUID storyId) {
|
||||
ImageProcessingProgress progress = progressMap.get(storyId);
|
||||
return progress != null && !progress.isCompleted();
|
||||
}
|
||||
}
|
||||
@@ -4,6 +4,8 @@ import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.context.event.EventListener;
|
||||
import org.springframework.scheduling.annotation.Async;
|
||||
import org.springframework.stereotype.Service;
|
||||
import org.springframework.web.multipart.MultipartFile;
|
||||
|
||||
@@ -21,6 +23,8 @@ import java.util.List;
|
||||
import java.util.regex.Matcher;
|
||||
import java.util.regex.Pattern;
|
||||
|
||||
import com.storycove.event.StoryContentUpdatedEvent;
|
||||
|
||||
@Service
|
||||
public class ImageService {
|
||||
|
||||
@@ -40,6 +44,15 @@ public class ImageService {
|
||||
@Autowired
|
||||
private LibraryService libraryService;
|
||||
|
||||
@Autowired
|
||||
private StoryService storyService;
|
||||
|
||||
@Autowired
|
||||
private AuthorService authorService;
|
||||
|
||||
@Autowired
|
||||
private CollectionService collectionService;
|
||||
|
||||
private String getUploadDir() {
|
||||
String libraryPath = libraryService.getCurrentImagePath();
|
||||
return baseUploadDir + libraryPath;
|
||||
@@ -57,6 +70,9 @@ public class ImageService {
|
||||
@Value("${storycove.images.max-file-size:5242880}") // 5MB default
|
||||
private long maxFileSize;
|
||||
|
||||
@Value("${storycove.app.public-url:http://localhost:6925}")
|
||||
private String publicUrl;
|
||||
|
||||
public enum ImageType {
|
||||
COVER("covers"),
|
||||
AVATAR("avatars"),
|
||||
@@ -245,14 +261,14 @@ public class ImageService {
|
||||
* Process HTML content and download all referenced images, replacing URLs with local paths
|
||||
*/
|
||||
public ContentImageProcessingResult processContentImages(String htmlContent, UUID storyId) {
|
||||
logger.info("Processing content images for story: {}, content length: {}", storyId,
|
||||
logger.debug("Processing content images for story: {}, content length: {}", storyId,
|
||||
htmlContent != null ? htmlContent.length() : 0);
|
||||
|
||||
List<String> warnings = new ArrayList<>();
|
||||
List<String> downloadedImages = new ArrayList<>();
|
||||
|
||||
if (htmlContent == null || htmlContent.trim().isEmpty()) {
|
||||
logger.info("No content to process for story: {}", storyId);
|
||||
logger.debug("No content to process for story: {}", storyId);
|
||||
return new ContentImageProcessingResult(htmlContent, warnings, downloadedImages);
|
||||
}
|
||||
|
||||
@@ -270,18 +286,18 @@ public class ImageService {
|
||||
String imageUrl = matcher.group(1);
|
||||
imageCount++;
|
||||
|
||||
logger.info("Found image #{}: {} in tag: {}", imageCount, imageUrl, fullImgTag);
|
||||
logger.debug("Found image #{}: {} in tag: {}", imageCount, imageUrl, fullImgTag);
|
||||
|
||||
try {
|
||||
// Skip if it's already a local path or data URL
|
||||
if (imageUrl.startsWith("/") || imageUrl.startsWith("data:")) {
|
||||
logger.info("Skipping local/data URL: {}", imageUrl);
|
||||
// Skip if it's already a local path, data URL, or from this application
|
||||
if (!isExternalUrl(imageUrl)) {
|
||||
logger.debug("Skipping local/internal URL: {}", imageUrl);
|
||||
matcher.appendReplacement(processedContent, Matcher.quoteReplacement(fullImgTag));
|
||||
continue;
|
||||
}
|
||||
|
||||
externalImageCount++;
|
||||
logger.info("Processing external image #{}: {}", externalImageCount, imageUrl);
|
||||
logger.debug("Processing external image #{}: {}", externalImageCount, imageUrl);
|
||||
|
||||
// Download and store the image
|
||||
String localPath = downloadImageFromUrl(imageUrl, storyId);
|
||||
@@ -289,7 +305,7 @@ public class ImageService {
|
||||
|
||||
// Generate local URL
|
||||
String localUrl = getLocalImageUrl(storyId, localPath);
|
||||
logger.info("Downloaded image: {} -> {}", imageUrl, localUrl);
|
||||
logger.debug("Downloaded image: {} -> {}", imageUrl, localUrl);
|
||||
|
||||
// Replace the src attribute with the local path - handle both single and double quotes
|
||||
String newImgTag = fullImgTag
|
||||
@@ -302,7 +318,7 @@ public class ImageService {
|
||||
newImgTag = fullImgTag.replaceAll("src\\s*=\\s*[\"']?" + Pattern.quote(imageUrl) + "[\"']?", "src=\"" + localUrl + "\"");
|
||||
}
|
||||
|
||||
logger.info("Replaced img tag: {} -> {}", fullImgTag, newImgTag);
|
||||
logger.debug("Replaced img tag: {} -> {}", fullImgTag, newImgTag);
|
||||
matcher.appendReplacement(processedContent, Matcher.quoteReplacement(newImgTag));
|
||||
|
||||
} catch (Exception e) {
|
||||
@@ -321,6 +337,151 @@ public class ImageService {
|
||||
return new ContentImageProcessingResult(processedContent.toString(), warnings, downloadedImages);
|
||||
}
|
||||
|
||||
/**
|
||||
* Functional interface for progress callbacks during image processing
|
||||
*/
|
||||
@FunctionalInterface
|
||||
public interface ImageProcessingProgressCallback {
|
||||
void onProgress(String currentImageUrl, int processedCount, int totalCount);
|
||||
}
|
||||
|
||||
/**
|
||||
* Process content images with progress callbacks for async processing
|
||||
*/
|
||||
public ContentImageProcessingResult processContentImagesWithProgress(String htmlContent, UUID storyId, ImageProcessingProgressCallback progressCallback) {
|
||||
logger.debug("Processing content images with progress for story: {}, content length: {}", storyId,
|
||||
htmlContent != null ? htmlContent.length() : 0);
|
||||
|
||||
List<String> warnings = new ArrayList<>();
|
||||
List<String> downloadedImages = new ArrayList<>();
|
||||
|
||||
if (htmlContent == null || htmlContent.trim().isEmpty()) {
|
||||
logger.debug("No content to process for story: {}", storyId);
|
||||
return new ContentImageProcessingResult(htmlContent, warnings, downloadedImages);
|
||||
}
|
||||
|
||||
// Find all img tags with src attributes
|
||||
Pattern imgPattern = Pattern.compile("<img[^>]+src=[\"']([^\"']+)[\"'][^>]*>", Pattern.CASE_INSENSITIVE);
|
||||
Matcher matcher = imgPattern.matcher(htmlContent);
|
||||
|
||||
// First pass: count external images
|
||||
List<String> externalImages = new ArrayList<>();
|
||||
Matcher countMatcher = imgPattern.matcher(htmlContent);
|
||||
while (countMatcher.find()) {
|
||||
String imageUrl = countMatcher.group(1);
|
||||
if (isExternalUrl(imageUrl)) {
|
||||
externalImages.add(imageUrl);
|
||||
}
|
||||
}
|
||||
|
||||
int totalExternalImages = externalImages.size();
|
||||
int processedCount = 0;
|
||||
|
||||
StringBuffer processedContent = new StringBuffer();
|
||||
matcher.reset(); // Reset the matcher for processing
|
||||
|
||||
while (matcher.find()) {
|
||||
String fullImgTag = matcher.group(0);
|
||||
String imageUrl = matcher.group(1);
|
||||
|
||||
logger.debug("Found image: {} in tag: {}", imageUrl, fullImgTag);
|
||||
|
||||
try {
|
||||
// Skip if it's already a local path, data URL, or from this application
|
||||
if (!isExternalUrl(imageUrl)) {
|
||||
logger.debug("Skipping local/internal URL: {}", imageUrl);
|
||||
matcher.appendReplacement(processedContent, Matcher.quoteReplacement(fullImgTag));
|
||||
continue;
|
||||
}
|
||||
|
||||
// Call progress callback
|
||||
if (progressCallback != null) {
|
||||
progressCallback.onProgress(imageUrl, processedCount, totalExternalImages);
|
||||
}
|
||||
|
||||
logger.debug("Processing external image #{}: {}", processedCount + 1, imageUrl);
|
||||
|
||||
// Download and store the image
|
||||
String localPath = downloadImageFromUrl(imageUrl, storyId);
|
||||
downloadedImages.add(localPath);
|
||||
|
||||
// Generate local URL
|
||||
String localUrl = getLocalImageUrl(storyId, localPath);
|
||||
logger.debug("Downloaded image: {} -> {}", imageUrl, localUrl);
|
||||
|
||||
// Replace the src attribute with the local path
|
||||
String newImgTag = fullImgTag
|
||||
.replaceFirst("src=\"" + Pattern.quote(imageUrl) + "\"", "src=\"" + localUrl + "\"")
|
||||
.replaceFirst("src='" + Pattern.quote(imageUrl) + "'", "src='" + localUrl + "'");
|
||||
|
||||
matcher.appendReplacement(processedContent, Matcher.quoteReplacement(newImgTag));
|
||||
processedCount++;
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to download image: {} - Error: {}", imageUrl, e.getMessage());
|
||||
warnings.add("Failed to download image: " + imageUrl + " - " + e.getMessage());
|
||||
matcher.appendReplacement(processedContent, Matcher.quoteReplacement(fullImgTag));
|
||||
}
|
||||
}
|
||||
|
||||
matcher.appendTail(processedContent);
|
||||
|
||||
logger.info("Processed {} external images for story: {} (Total: {}, Downloaded: {}, Warnings: {})",
|
||||
processedCount, storyId, processedCount, downloadedImages.size(), warnings.size());
|
||||
|
||||
return new ContentImageProcessingResult(processedContent.toString(), warnings, downloadedImages);
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if a URL is external (not from this application).
|
||||
* Returns true if the URL should be downloaded, false if it's already local.
|
||||
*/
|
||||
private boolean isExternalUrl(String url) {
|
||||
if (url == null || url.trim().isEmpty()) {
|
||||
return false;
|
||||
}
|
||||
|
||||
// Skip data URLs
|
||||
if (url.startsWith("data:")) {
|
||||
return false;
|
||||
}
|
||||
|
||||
// Skip relative URLs (local paths)
|
||||
if (url.startsWith("/")) {
|
||||
return false;
|
||||
}
|
||||
|
||||
// Skip URLs that are already pointing to our API
|
||||
if (url.contains("/api/files/images/")) {
|
||||
return false;
|
||||
}
|
||||
|
||||
// Check if URL starts with the public URL (our own domain)
|
||||
if (publicUrl != null && !publicUrl.trim().isEmpty()) {
|
||||
String normalizedUrl = url.trim().toLowerCase();
|
||||
String normalizedPublicUrl = publicUrl.trim().toLowerCase();
|
||||
|
||||
// Remove trailing slash from public URL for comparison
|
||||
if (normalizedPublicUrl.endsWith("/")) {
|
||||
normalizedPublicUrl = normalizedPublicUrl.substring(0, normalizedPublicUrl.length() - 1);
|
||||
}
|
||||
|
||||
if (normalizedUrl.startsWith(normalizedPublicUrl)) {
|
||||
logger.debug("URL is from this application (matches publicUrl): {}", url);
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
// If it's an HTTP(S) URL that didn't match our filters, it's external
|
||||
if (url.startsWith("http://") || url.startsWith("https://")) {
|
||||
logger.debug("URL is external: {}", url);
|
||||
return true;
|
||||
}
|
||||
|
||||
// For any other format, consider it non-external (safer default)
|
||||
return false;
|
||||
}
|
||||
|
||||
/**
|
||||
* Download an image from a URL and store it locally
|
||||
*/
|
||||
@@ -385,7 +546,7 @@ public class ImageService {
|
||||
return "/api/files/images/default/" + imagePath;
|
||||
}
|
||||
String localUrl = "/api/files/images/" + currentLibraryId + "/" + imagePath;
|
||||
logger.info("Generated local image URL: {} for story: {}", localUrl, storyId);
|
||||
logger.debug("Generated local image URL: {} for story: {}", localUrl, storyId);
|
||||
return localUrl;
|
||||
}
|
||||
|
||||
@@ -421,6 +582,358 @@ public class ImageService {
|
||||
return null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Cleanup orphaned content images that are no longer referenced in any story
|
||||
*/
|
||||
public ContentImageCleanupResult cleanupOrphanedContentImages(boolean dryRun) {
|
||||
logger.info("Starting orphaned content image cleanup (dryRun: {})", dryRun);
|
||||
|
||||
final Set<String> referencedImages;
|
||||
List<String> orphanedImages = new ArrayList<>();
|
||||
List<String> errors = new ArrayList<>();
|
||||
long totalSizeBytes = 0;
|
||||
int foldersToDelete = 0;
|
||||
|
||||
// Step 1: Collect all image references from all story content
|
||||
logger.debug("Scanning all story content for image references...");
|
||||
referencedImages = collectAllImageReferences();
|
||||
logger.debug("Found {} unique image references in story content", referencedImages.size());
|
||||
|
||||
try {
|
||||
// Step 2: Scan the content images directory
|
||||
Path contentImagesDir = Paths.get(getUploadDir(), ImageType.CONTENT.getDirectory());
|
||||
|
||||
if (!Files.exists(contentImagesDir)) {
|
||||
logger.debug("Content images directory does not exist: {}", contentImagesDir);
|
||||
return new ContentImageCleanupResult(orphanedImages, 0, 0, referencedImages.size(), errors, dryRun);
|
||||
}
|
||||
|
||||
logger.debug("Scanning content images directory: {}", contentImagesDir);
|
||||
|
||||
// Walk through all story directories
|
||||
Files.walk(contentImagesDir, 2)
|
||||
.filter(Files::isDirectory)
|
||||
.filter(path -> !path.equals(contentImagesDir)) // Skip the root content directory
|
||||
.filter(path -> !isSynologySystemPath(path)) // Skip Synology system directories
|
||||
.forEach(storyDir -> {
|
||||
try {
|
||||
String storyId = storyDir.getFileName().toString();
|
||||
logger.debug("Checking story directory: {}", storyId);
|
||||
|
||||
// Check if this story still exists
|
||||
boolean storyExists = storyService.findByIdOptional(UUID.fromString(storyId)).isPresent();
|
||||
|
||||
if (!storyExists) {
|
||||
logger.debug("Found orphaned story directory (story deleted): {}", storyId);
|
||||
// Mark entire directory for deletion
|
||||
try {
|
||||
Files.walk(storyDir)
|
||||
.filter(Files::isRegularFile)
|
||||
.filter(path -> !isSynologySystemPath(path)) // Skip Synology system files
|
||||
.filter(path -> isValidImageFile(path)) // Only process actual image files
|
||||
.forEach(file -> {
|
||||
try {
|
||||
long size = Files.size(file);
|
||||
orphanedImages.add(file.toString());
|
||||
// Add to total size (will be updated in main scope)
|
||||
} catch (IOException e) {
|
||||
errors.add("Failed to get size for " + file + ": " + e.getMessage());
|
||||
}
|
||||
});
|
||||
} catch (IOException e) {
|
||||
errors.add("Failed to scan orphaned story directory " + storyDir + ": " + e.getMessage());
|
||||
}
|
||||
return;
|
||||
}
|
||||
|
||||
// Check individual files in the story directory
|
||||
try {
|
||||
Files.walk(storyDir)
|
||||
.filter(Files::isRegularFile)
|
||||
.filter(path -> !isSynologySystemPath(path)) // Skip Synology system files
|
||||
.filter(path -> isValidImageFile(path)) // Only process actual image files
|
||||
.forEach(imageFile -> {
|
||||
try {
|
||||
String filename = imageFile.getFileName().toString();
|
||||
|
||||
// Only consider it orphaned if it's not in our referenced filenames
|
||||
if (!referencedImages.contains(filename)) {
|
||||
logger.debug("Found orphaned image: {}", filename);
|
||||
orphanedImages.add(imageFile.toString());
|
||||
} else {
|
||||
logger.debug("Image file is referenced, keeping: {}", filename);
|
||||
}
|
||||
} catch (Exception e) {
|
||||
errors.add("Error checking image file " + imageFile + ": " + e.getMessage());
|
||||
}
|
||||
});
|
||||
} catch (IOException e) {
|
||||
errors.add("Failed to scan story directory " + storyDir + ": " + e.getMessage());
|
||||
}
|
||||
|
||||
} catch (Exception e) {
|
||||
errors.add("Error processing story directory " + storyDir + ": " + e.getMessage());
|
||||
}
|
||||
});
|
||||
|
||||
// Calculate total size and count empty directories
|
||||
for (String orphanedImage : orphanedImages) {
|
||||
try {
|
||||
Path imagePath = Paths.get(orphanedImage);
|
||||
if (Files.exists(imagePath)) {
|
||||
totalSizeBytes += Files.size(imagePath);
|
||||
}
|
||||
} catch (IOException e) {
|
||||
errors.add("Failed to get size for " + orphanedImage + ": " + e.getMessage());
|
||||
}
|
||||
}
|
||||
|
||||
// Count empty directories that would be removed
|
||||
try {
|
||||
foldersToDelete = (int) Files.walk(contentImagesDir)
|
||||
.filter(Files::isDirectory)
|
||||
.filter(path -> !path.equals(contentImagesDir))
|
||||
.filter(this::isDirectoryEmptyOrWillBeEmpty)
|
||||
.count();
|
||||
} catch (IOException e) {
|
||||
errors.add("Failed to count empty directories: " + e.getMessage());
|
||||
}
|
||||
|
||||
// Step 3: Delete orphaned files if not dry run
|
||||
if (!dryRun && !orphanedImages.isEmpty()) {
|
||||
logger.debug("Deleting {} orphaned images...", orphanedImages.size());
|
||||
|
||||
Set<Path> directoriesToCheck = new HashSet<>();
|
||||
|
||||
for (String orphanedImage : orphanedImages) {
|
||||
try {
|
||||
Path imagePath = Paths.get(orphanedImage);
|
||||
if (Files.exists(imagePath)) {
|
||||
directoriesToCheck.add(imagePath.getParent());
|
||||
Files.delete(imagePath);
|
||||
logger.debug("Deleted orphaned image: {}", imagePath);
|
||||
}
|
||||
} catch (IOException e) {
|
||||
errors.add("Failed to delete " + orphanedImage + ": " + e.getMessage());
|
||||
}
|
||||
}
|
||||
|
||||
// Clean up empty directories
|
||||
for (Path dir : directoriesToCheck) {
|
||||
try {
|
||||
if (Files.exists(dir) && isDirEmpty(dir)) {
|
||||
Files.delete(dir);
|
||||
logger.debug("Deleted empty story directory: {}", dir);
|
||||
}
|
||||
} catch (IOException e) {
|
||||
errors.add("Failed to delete empty directory " + dir + ": " + e.getMessage());
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
logger.info("Orphaned content image cleanup completed. Found {} orphaned files ({} bytes)",
|
||||
orphanedImages.size(), totalSizeBytes);
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Error during orphaned content image cleanup", e);
|
||||
errors.add("General cleanup error: " + e.getMessage());
|
||||
}
|
||||
|
||||
return new ContentImageCleanupResult(orphanedImages, totalSizeBytes, foldersToDelete, referencedImages.size(), errors, dryRun);
|
||||
}
|
||||
|
||||
/**
|
||||
* Collect all image filenames referenced in content (UUID-based filenames only)
|
||||
*/
|
||||
private Set<String> collectAllImageReferences() {
|
||||
Set<String> referencedFilenames = new HashSet<>();
|
||||
|
||||
try {
|
||||
// Get all stories
|
||||
List<com.storycove.entity.Story> allStories = storyService.findAllWithAssociations();
|
||||
|
||||
// Pattern to match local image URLs in content
|
||||
Pattern imagePattern = Pattern.compile("src=[\"']([^\"']*(?:content/[^\"']*\\.(jpg|jpeg|png)))[\"']", Pattern.CASE_INSENSITIVE);
|
||||
|
||||
for (com.storycove.entity.Story story : allStories) {
|
||||
// Add story cover image filename if present
|
||||
if (story.getCoverPath() != null && !story.getCoverPath().trim().isEmpty()) {
|
||||
String filename = extractFilename(story.getCoverPath());
|
||||
if (filename != null) {
|
||||
referencedFilenames.add(filename);
|
||||
logger.debug("Found cover image filename in story {}: {}", story.getId(), filename);
|
||||
}
|
||||
}
|
||||
|
||||
// Add author avatar image filename if present
|
||||
if (story.getAuthor() != null && story.getAuthor().getAvatarImagePath() != null && !story.getAuthor().getAvatarImagePath().trim().isEmpty()) {
|
||||
String filename = extractFilename(story.getAuthor().getAvatarImagePath());
|
||||
if (filename != null) {
|
||||
referencedFilenames.add(filename);
|
||||
logger.debug("Found avatar image filename for author {}: {}", story.getAuthor().getId(), filename);
|
||||
}
|
||||
}
|
||||
|
||||
// Add content images from HTML
|
||||
if (story.getContentHtml() != null) {
|
||||
Matcher matcher = imagePattern.matcher(story.getContentHtml());
|
||||
|
||||
while (matcher.find()) {
|
||||
String imageSrc = matcher.group(1);
|
||||
|
||||
// Extract just the filename from the URL
|
||||
String filename = extractFilename(imageSrc);
|
||||
if (filename != null && isUuidBasedFilename(filename)) {
|
||||
referencedFilenames.add(filename);
|
||||
logger.debug("Found content image filename in story {}: {}", story.getId(), filename);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Also get all authors separately to catch avatars for authors without stories
|
||||
List<com.storycove.entity.Author> allAuthors = authorService.findAll();
|
||||
for (com.storycove.entity.Author author : allAuthors) {
|
||||
if (author.getAvatarImagePath() != null && !author.getAvatarImagePath().trim().isEmpty()) {
|
||||
String filename = extractFilename(author.getAvatarImagePath());
|
||||
if (filename != null) {
|
||||
referencedFilenames.add(filename);
|
||||
logger.debug("Found standalone avatar image filename for author {}: {}", author.getId(), filename);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Also get all collections to catch cover images
|
||||
List<com.storycove.entity.Collection> allCollections = collectionService.findAllWithTags();
|
||||
for (com.storycove.entity.Collection collection : allCollections) {
|
||||
if (collection.getCoverImagePath() != null && !collection.getCoverImagePath().trim().isEmpty()) {
|
||||
String filename = extractFilename(collection.getCoverImagePath());
|
||||
if (filename != null) {
|
||||
referencedFilenames.add(filename);
|
||||
logger.debug("Found collection cover image filename for collection {}: {}", collection.getId(), filename);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Error collecting image references from stories", e);
|
||||
}
|
||||
|
||||
return referencedFilenames;
|
||||
}
|
||||
|
||||
/**
|
||||
* Convert an image src attribute to relative file path
|
||||
*/
|
||||
private String convertSrcToRelativePath(String src) {
|
||||
try {
|
||||
// Handle both /api/files/images/libraryId/content/... and relative content/... paths
|
||||
if (src.contains("/content/")) {
|
||||
int contentIndex = src.indexOf("/content/");
|
||||
return src.substring(contentIndex + 1); // Remove leading slash, keep "content/..."
|
||||
}
|
||||
} catch (Exception e) {
|
||||
logger.debug("Failed to convert src to relative path: {}", src);
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Convert absolute file path to relative path from upload directory
|
||||
*/
|
||||
private String convertAbsolutePathToRelative(String absolutePath) {
|
||||
try {
|
||||
if (absolutePath == null || absolutePath.trim().isEmpty()) {
|
||||
return null;
|
||||
}
|
||||
|
||||
Path absPath = Paths.get(absolutePath);
|
||||
Path uploadDirPath = Paths.get(getUploadDir());
|
||||
|
||||
// If the path is already relative to upload dir, return as-is
|
||||
if (!absPath.isAbsolute()) {
|
||||
return absolutePath.replace('\\', '/');
|
||||
}
|
||||
|
||||
// Try to make it relative to the upload directory
|
||||
if (absPath.startsWith(uploadDirPath)) {
|
||||
Path relativePath = uploadDirPath.relativize(absPath);
|
||||
return relativePath.toString().replace('\\', '/');
|
||||
}
|
||||
|
||||
// If it's not under upload directory, check if it's library-specific path
|
||||
String libraryPath = libraryService.getCurrentImagePath();
|
||||
Path baseUploadPath = Paths.get(baseUploadDir);
|
||||
|
||||
if (absPath.startsWith(baseUploadPath)) {
|
||||
Path relativePath = baseUploadPath.relativize(absPath);
|
||||
String relativeStr = relativePath.toString().replace('\\', '/');
|
||||
|
||||
// Remove library prefix if present to make it library-agnostic for comparison
|
||||
if (relativeStr.startsWith(libraryPath.substring(1))) { // Remove leading slash from library path
|
||||
return relativeStr.substring(libraryPath.length() - 1); // Keep the leading slash
|
||||
}
|
||||
return relativeStr;
|
||||
}
|
||||
|
||||
// Fallback: just use the filename portion if it's in the right structure
|
||||
String fileName = absPath.getFileName().toString();
|
||||
if (fileName.matches(".*\\.(jpg|jpeg|png)$")) {
|
||||
// Try to preserve directory structure if it looks like covers/ or avatars/
|
||||
Path parent = absPath.getParent();
|
||||
if (parent != null) {
|
||||
String parentName = parent.getFileName().toString();
|
||||
if (parentName.equals("covers") || parentName.equals("avatars")) {
|
||||
return parentName + "/" + fileName;
|
||||
}
|
||||
}
|
||||
return fileName;
|
||||
}
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.debug("Failed to convert absolute path to relative: {}", absolutePath, e);
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get relative image path from absolute file path
|
||||
*/
|
||||
private String getRelativeImagePath(Path imageFile) {
|
||||
try {
|
||||
Path uploadDir = Paths.get(getUploadDir());
|
||||
Path relativePath = uploadDir.relativize(imageFile);
|
||||
return relativePath.toString().replace('\\', '/'); // Normalize path separators
|
||||
} catch (Exception e) {
|
||||
logger.debug("Failed to get relative path for: {}", imageFile);
|
||||
return imageFile.toString();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if directory is empty or will be empty after cleanup
|
||||
*/
|
||||
private boolean isDirectoryEmptyOrWillBeEmpty(Path dir) {
|
||||
try {
|
||||
return Files.walk(dir)
|
||||
.filter(Files::isRegularFile)
|
||||
.count() == 0;
|
||||
} catch (IOException e) {
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if directory is empty
|
||||
*/
|
||||
private boolean isDirEmpty(Path dir) {
|
||||
try {
|
||||
return Files.list(dir).count() == 0;
|
||||
} catch (IOException e) {
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Clean up content images for a story
|
||||
*/
|
||||
@@ -458,4 +971,152 @@ public class ImageService {
|
||||
public List<String> getDownloadedImages() { return downloadedImages; }
|
||||
public boolean hasWarnings() { return !warnings.isEmpty(); }
|
||||
}
|
||||
|
||||
/**
|
||||
* Result class for orphaned image cleanup
|
||||
*/
|
||||
public static class ContentImageCleanupResult {
|
||||
private final List<String> orphanedImages;
|
||||
private final long totalSizeBytes;
|
||||
private final int foldersToDelete;
|
||||
private final int totalReferencedImages;
|
||||
private final List<String> errors;
|
||||
private final boolean dryRun;
|
||||
|
||||
public ContentImageCleanupResult(List<String> orphanedImages, long totalSizeBytes, int foldersToDelete,
|
||||
int totalReferencedImages, List<String> errors, boolean dryRun) {
|
||||
this.orphanedImages = orphanedImages;
|
||||
this.totalSizeBytes = totalSizeBytes;
|
||||
this.foldersToDelete = foldersToDelete;
|
||||
this.totalReferencedImages = totalReferencedImages;
|
||||
this.errors = errors;
|
||||
this.dryRun = dryRun;
|
||||
}
|
||||
|
||||
public List<String> getOrphanedImages() { return orphanedImages; }
|
||||
public long getTotalSizeBytes() { return totalSizeBytes; }
|
||||
public int getFoldersToDelete() { return foldersToDelete; }
|
||||
public int getTotalReferencedImages() { return totalReferencedImages; }
|
||||
public List<String> getErrors() { return errors; }
|
||||
public boolean isDryRun() { return dryRun; }
|
||||
public boolean hasErrors() { return !errors.isEmpty(); }
|
||||
|
||||
public String getFormattedSize() {
|
||||
if (totalSizeBytes < 1024) return totalSizeBytes + " B";
|
||||
if (totalSizeBytes < 1024 * 1024) return String.format("%.1f KB", totalSizeBytes / 1024.0);
|
||||
if (totalSizeBytes < 1024 * 1024 * 1024) return String.format("%.1f MB", totalSizeBytes / (1024.0 * 1024.0));
|
||||
return String.format("%.1f GB", totalSizeBytes / (1024.0 * 1024.0 * 1024.0));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if a path is a Synology system path that should be ignored
|
||||
*/
|
||||
private boolean isSynologySystemPath(Path path) {
|
||||
String pathStr = path.toString();
|
||||
String fileName = path.getFileName().toString();
|
||||
|
||||
// Skip Synology metadata directories and files
|
||||
return pathStr.contains("@eaDir") ||
|
||||
fileName.startsWith("@") ||
|
||||
fileName.contains("@SynoEAStream") ||
|
||||
fileName.startsWith(".") ||
|
||||
fileName.equals("Thumbs.db") ||
|
||||
fileName.equals(".DS_Store");
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if a file is a valid image file (not a system/metadata file)
|
||||
*/
|
||||
private boolean isValidImageFile(Path path) {
|
||||
if (isSynologySystemPath(path)) {
|
||||
return false;
|
||||
}
|
||||
|
||||
String fileName = path.getFileName().toString().toLowerCase();
|
||||
return fileName.endsWith(".jpg") ||
|
||||
fileName.endsWith(".jpeg") ||
|
||||
fileName.endsWith(".png") ||
|
||||
fileName.endsWith(".gif") ||
|
||||
fileName.endsWith(".webp");
|
||||
}
|
||||
|
||||
/**
|
||||
* Extract filename from a path or URL
|
||||
*/
|
||||
private String extractFilename(String pathOrUrl) {
|
||||
if (pathOrUrl == null || pathOrUrl.trim().isEmpty()) {
|
||||
return null;
|
||||
}
|
||||
|
||||
try {
|
||||
// Remove query parameters if present
|
||||
if (pathOrUrl.contains("?")) {
|
||||
pathOrUrl = pathOrUrl.substring(0, pathOrUrl.indexOf("?"));
|
||||
}
|
||||
|
||||
// Get the last part after slash
|
||||
String filename = pathOrUrl.substring(pathOrUrl.lastIndexOf("/") + 1);
|
||||
|
||||
// Remove any special Synology suffixes
|
||||
filename = filename.replace("@SynoEAStream", "");
|
||||
|
||||
return filename.trim().isEmpty() ? null : filename;
|
||||
} catch (Exception e) {
|
||||
logger.debug("Failed to extract filename from: {}", pathOrUrl);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if a filename follows UUID pattern (indicates it's our generated file)
|
||||
*/
|
||||
private boolean isUuidBasedFilename(String filename) {
|
||||
if (filename == null || filename.trim().isEmpty()) {
|
||||
return false;
|
||||
}
|
||||
|
||||
// Remove extension
|
||||
String nameWithoutExt = filename;
|
||||
int lastDot = filename.lastIndexOf(".");
|
||||
if (lastDot > 0) {
|
||||
nameWithoutExt = filename.substring(0, lastDot);
|
||||
}
|
||||
|
||||
// Check if it matches UUID pattern (8-4-4-4-12 hex characters)
|
||||
return nameWithoutExt.matches("[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}");
|
||||
}
|
||||
|
||||
/**
|
||||
* Event listener for story content updates - processes external images asynchronously
|
||||
*/
|
||||
@EventListener
|
||||
@Async
|
||||
public void handleStoryContentUpdated(StoryContentUpdatedEvent event) {
|
||||
logger.info("Processing images for {} story {} after content update",
|
||||
event.isNewStory() ? "new" : "updated", event.getStoryId());
|
||||
|
||||
try {
|
||||
ContentImageProcessingResult result = processContentImages(event.getContentHtml(), event.getStoryId());
|
||||
|
||||
// If content was changed, we need to update the story (but this could cause circular events)
|
||||
// Instead, let's just log the results for now and let the controller handle updates if needed
|
||||
if (result.hasWarnings()) {
|
||||
logger.warn("Image processing warnings for story {}: {}", event.getStoryId(), result.getWarnings());
|
||||
}
|
||||
if (!result.getDownloadedImages().isEmpty()) {
|
||||
logger.info("Downloaded {} external images for story {}: {}",
|
||||
result.getDownloadedImages().size(), event.getStoryId(), result.getDownloadedImages());
|
||||
}
|
||||
|
||||
// TODO: If content was changed, we might need a way to update the story without triggering another event
|
||||
if (!result.getProcessedContent().equals(event.getContentHtml())) {
|
||||
logger.info("Story {} content was processed and external images were replaced with local URLs", event.getStoryId());
|
||||
// For now, just log that processing occurred - the original content processing already handles updates
|
||||
}
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to process images for story {}: {}", event.getStoryId(), e.getMessage(), e);
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -13,8 +13,6 @@ import org.springframework.context.ApplicationContext;
|
||||
import org.springframework.context.ApplicationContextAware;
|
||||
import org.springframework.security.crypto.bcrypt.BCryptPasswordEncoder;
|
||||
import org.springframework.stereotype.Service;
|
||||
import org.typesense.api.Client;
|
||||
import org.typesense.resources.Node;
|
||||
|
||||
import jakarta.annotation.PostConstruct;
|
||||
import jakarta.annotation.PreDestroy;
|
||||
@@ -26,7 +24,6 @@ import java.nio.file.Files;
|
||||
import java.nio.file.Path;
|
||||
import java.nio.file.Paths;
|
||||
import java.sql.SQLException;
|
||||
import java.time.Duration;
|
||||
import java.util.*;
|
||||
import java.util.concurrent.ConcurrentHashMap;
|
||||
|
||||
@@ -43,14 +40,6 @@ public class LibraryService implements ApplicationContextAware {
|
||||
@Value("${spring.datasource.password}")
|
||||
private String dbPassword;
|
||||
|
||||
@Value("${typesense.host}")
|
||||
private String typesenseHost;
|
||||
|
||||
@Value("${typesense.port}")
|
||||
private String typesensePort;
|
||||
|
||||
@Value("${typesense.api-key}")
|
||||
private String typesenseApiKey;
|
||||
|
||||
private final ObjectMapper objectMapper = new ObjectMapper();
|
||||
private final BCryptPasswordEncoder passwordEncoder = new BCryptPasswordEncoder();
|
||||
@@ -61,7 +50,6 @@ public class LibraryService implements ApplicationContextAware {
|
||||
|
||||
// Current active resources
|
||||
private volatile String currentLibraryId;
|
||||
private volatile Client currentTypesenseClient;
|
||||
|
||||
// Security: Track if user has explicitly authenticated in this session
|
||||
private volatile boolean explicitlyAuthenticated = false;
|
||||
@@ -100,7 +88,6 @@ public class LibraryService implements ApplicationContextAware {
|
||||
@PreDestroy
|
||||
public void cleanup() {
|
||||
currentLibraryId = null;
|
||||
currentTypesenseClient = null;
|
||||
explicitlyAuthenticated = false;
|
||||
}
|
||||
|
||||
@@ -110,7 +97,6 @@ public class LibraryService implements ApplicationContextAware {
|
||||
public void clearAuthentication() {
|
||||
explicitlyAuthenticated = false;
|
||||
currentLibraryId = null;
|
||||
currentTypesenseClient = null;
|
||||
logger.info("Authentication cleared - user must re-authenticate to access libraries");
|
||||
}
|
||||
|
||||
@@ -129,7 +115,7 @@ public class LibraryService implements ApplicationContextAware {
|
||||
|
||||
/**
|
||||
* Switch to library after authentication with forced reindexing
|
||||
* This ensures Typesense is always up-to-date after login
|
||||
* This ensures Solr is always up-to-date after login
|
||||
*/
|
||||
public synchronized void switchToLibraryAfterAuthentication(String libraryId) throws Exception {
|
||||
logger.info("Switching to library after authentication: {} (forcing reindex)", libraryId);
|
||||
@@ -158,9 +144,9 @@ public class LibraryService implements ApplicationContextAware {
|
||||
String previousLibraryId = currentLibraryId;
|
||||
|
||||
if (libraryId.equals(currentLibraryId) && forceReindex) {
|
||||
logger.info("Forcing reindex for current library: {} ({})", library.getName(), libraryId);
|
||||
logger.debug("Forcing reindex for current library: {} ({})", library.getName(), libraryId);
|
||||
} else {
|
||||
logger.info("Switching to library: {} ({})", library.getName(), libraryId);
|
||||
logger.debug("Switching to library: {} ({})", library.getName(), libraryId);
|
||||
}
|
||||
|
||||
// Close current resources
|
||||
@@ -168,25 +154,15 @@ public class LibraryService implements ApplicationContextAware {
|
||||
|
||||
// Set new active library (datasource routing handled by SmartRoutingDataSource)
|
||||
currentLibraryId = libraryId;
|
||||
currentTypesenseClient = createTypesenseClient(library.getTypesenseCollection());
|
||||
|
||||
// Initialize Typesense collections for this library
|
||||
try {
|
||||
TypesenseService typesenseService = applicationContext.getBean(TypesenseService.class);
|
||||
// First ensure collections exist
|
||||
typesenseService.initializeCollectionsForCurrentLibrary();
|
||||
logger.info("Completed Typesense initialization for library: {}", libraryId);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to initialize Typesense for library {}: {}", libraryId, e.getMessage());
|
||||
// Don't fail the switch - collections can be created later
|
||||
}
|
||||
// Solr indexes are global - no per-library initialization needed
|
||||
logger.debug("Library switched to Solr mode for library: {}", libraryId);
|
||||
|
||||
logger.info("Successfully switched to library: {}", library.getName());
|
||||
|
||||
// Perform complete reindex AFTER library switch is fully complete
|
||||
// This ensures database routing is properly established
|
||||
if (forceReindex || !libraryId.equals(previousLibraryId)) {
|
||||
logger.info("Starting post-switch Typesense reindex for library: {}", libraryId);
|
||||
logger.debug("Starting post-switch Solr reindex for library: {}", libraryId);
|
||||
|
||||
// Run reindex asynchronously to avoid blocking authentication response
|
||||
// and allow time for database routing to fully stabilize
|
||||
@@ -195,15 +171,25 @@ public class LibraryService implements ApplicationContextAware {
|
||||
try {
|
||||
// Give routing time to stabilize
|
||||
Thread.sleep(500);
|
||||
logger.info("Starting async Typesense reindex for library: {}", finalLibraryId);
|
||||
logger.debug("Starting async Solr reindex for library: {}", finalLibraryId);
|
||||
|
||||
TypesenseService typesenseService = applicationContext.getBean(TypesenseService.class);
|
||||
typesenseService.performCompleteReindex();
|
||||
logger.info("Completed async Typesense reindexing for library: {}", finalLibraryId);
|
||||
SearchServiceAdapter searchService = applicationContext.getBean(SearchServiceAdapter.class);
|
||||
// Get all stories and authors for reindexing
|
||||
StoryService storyService = applicationContext.getBean(StoryService.class);
|
||||
AuthorService authorService = applicationContext.getBean(AuthorService.class);
|
||||
|
||||
var allStories = storyService.findAllWithAssociations();
|
||||
var allAuthors = authorService.findAllWithStories();
|
||||
|
||||
searchService.bulkIndexStories(allStories);
|
||||
searchService.bulkIndexAuthors(allAuthors);
|
||||
|
||||
logger.info("Completed async Solr reindexing for library: {} ({} stories, {} authors)",
|
||||
finalLibraryId, allStories.size(), allAuthors.size());
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to async reindex Typesense for library {}: {}", finalLibraryId, e.getMessage());
|
||||
logger.warn("Failed to async reindex Solr for library {}: {}", finalLibraryId, e.getMessage());
|
||||
}
|
||||
}, "TypesenseReindex-" + libraryId).start();
|
||||
}, "SolrReindex-" + libraryId).start();
|
||||
}
|
||||
}
|
||||
|
||||
@@ -219,12 +205,6 @@ public class LibraryService implements ApplicationContextAware {
|
||||
}
|
||||
}
|
||||
|
||||
public Client getCurrentTypesenseClient() {
|
||||
if (currentTypesenseClient == null) {
|
||||
throw new IllegalStateException("No active library - please authenticate first");
|
||||
}
|
||||
return currentTypesenseClient;
|
||||
}
|
||||
|
||||
public String getCurrentLibraryId() {
|
||||
return currentLibraryId;
|
||||
@@ -362,10 +342,10 @@ public class LibraryService implements ApplicationContextAware {
|
||||
library.setInitialized((Boolean) data.getOrDefault("initialized", false));
|
||||
|
||||
libraries.put(id, library);
|
||||
logger.info("Loaded library: {} ({})", library.getName(), id);
|
||||
logger.debug("Loaded library: {} ({})", library.getName(), id);
|
||||
}
|
||||
} else {
|
||||
logger.info("No libraries configuration file found, will create default");
|
||||
logger.debug("No libraries configuration file found, will create default");
|
||||
}
|
||||
} catch (IOException e) {
|
||||
logger.error("Failed to load libraries configuration", e);
|
||||
@@ -431,7 +411,7 @@ public class LibraryService implements ApplicationContextAware {
|
||||
String json = objectMapper.writerWithDefaultPrettyPrinter().writeValueAsString(config);
|
||||
Files.writeString(Paths.get(LIBRARIES_CONFIG_PATH), json);
|
||||
|
||||
logger.info("Saved libraries configuration");
|
||||
logger.debug("Saved libraries configuration");
|
||||
} catch (IOException e) {
|
||||
logger.error("Failed to save libraries configuration", e);
|
||||
}
|
||||
@@ -439,7 +419,7 @@ public class LibraryService implements ApplicationContextAware {
|
||||
|
||||
private DataSource createDataSource(String dbName) {
|
||||
String url = baseDbUrl.replaceAll("/[^/]*$", "/" + dbName);
|
||||
logger.info("Creating DataSource for: {}", url);
|
||||
logger.debug("Creating DataSource for: {}", url);
|
||||
|
||||
// First, ensure the database exists
|
||||
ensureDatabaseExists(dbName);
|
||||
@@ -479,7 +459,7 @@ public class LibraryService implements ApplicationContextAware {
|
||||
preparedStatement.setString(1, dbName);
|
||||
try (var resultSet = preparedStatement.executeQuery()) {
|
||||
if (resultSet.next()) {
|
||||
logger.info("Database {} already exists", dbName);
|
||||
logger.debug("Database {} already exists", dbName);
|
||||
return; // Database exists, nothing to do
|
||||
}
|
||||
}
|
||||
@@ -508,7 +488,7 @@ public class LibraryService implements ApplicationContextAware {
|
||||
}
|
||||
|
||||
private void initializeNewDatabaseSchema(String dbName) {
|
||||
logger.info("Initializing schema for new database: {}", dbName);
|
||||
logger.debug("Initializing schema for new database: {}", dbName);
|
||||
|
||||
// Create a temporary DataSource for the new database to initialize schema
|
||||
String newDbUrl = baseDbUrl.replaceAll("/[^/]*$", "/" + dbName);
|
||||
@@ -525,7 +505,7 @@ public class LibraryService implements ApplicationContextAware {
|
||||
// Use Hibernate to create the schema
|
||||
// This mimics what Spring Boot does during startup
|
||||
createSchemaUsingHibernate(tempDataSource);
|
||||
logger.info("Schema initialized for database: {}", dbName);
|
||||
logger.debug("Schema initialized for database: {}", dbName);
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to initialize schema for database {}: {}", dbName, e.getMessage());
|
||||
@@ -540,15 +520,15 @@ public class LibraryService implements ApplicationContextAware {
|
||||
}
|
||||
|
||||
try {
|
||||
logger.info("Initializing resources for new library: {}", library.getName());
|
||||
logger.debug("Initializing resources for new library: {}", library.getName());
|
||||
|
||||
// 1. Create image directory structure
|
||||
initializeImageDirectories(library);
|
||||
|
||||
// 2. Initialize Typesense collections (this will be done when switching to the library)
|
||||
// The TypesenseService.initializeCollections() will be called automatically
|
||||
// 2. Solr indexes are global and managed automatically
|
||||
// No per-library initialization needed for Solr
|
||||
|
||||
logger.info("Successfully initialized resources for library: {}", library.getName());
|
||||
logger.debug("Successfully initialized resources for library: {}", library.getName());
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to initialize resources for library {}: {}", libraryId, e.getMessage());
|
||||
@@ -564,16 +544,16 @@ public class LibraryService implements ApplicationContextAware {
|
||||
|
||||
if (!java.nio.file.Files.exists(libraryImagePath)) {
|
||||
java.nio.file.Files.createDirectories(libraryImagePath);
|
||||
logger.info("Created image directory: {}", imagePath);
|
||||
logger.debug("Created image directory: {}", imagePath);
|
||||
|
||||
// Create subdirectories for different image types
|
||||
java.nio.file.Files.createDirectories(libraryImagePath.resolve("stories"));
|
||||
java.nio.file.Files.createDirectories(libraryImagePath.resolve("authors"));
|
||||
java.nio.file.Files.createDirectories(libraryImagePath.resolve("collections"));
|
||||
|
||||
logger.info("Created image subdirectories for library: {}", library.getId());
|
||||
logger.debug("Created image subdirectories for library: {}", library.getId());
|
||||
} else {
|
||||
logger.info("Image directory already exists: {}", imagePath);
|
||||
logger.debug("Image directory already exists: {}", imagePath);
|
||||
}
|
||||
|
||||
} catch (Exception e) {
|
||||
@@ -769,7 +749,7 @@ public class LibraryService implements ApplicationContextAware {
|
||||
statement.executeUpdate(sql);
|
||||
}
|
||||
|
||||
logger.info("Successfully created all database tables and constraints");
|
||||
logger.debug("Successfully created all database tables and constraints");
|
||||
|
||||
} catch (SQLException e) {
|
||||
logger.error("Failed to create database schema", e);
|
||||
@@ -777,21 +757,10 @@ public class LibraryService implements ApplicationContextAware {
|
||||
}
|
||||
}
|
||||
|
||||
private Client createTypesenseClient(String collection) {
|
||||
logger.info("Creating Typesense client for collection: {}", collection);
|
||||
|
||||
List<Node> nodes = Arrays.asList(
|
||||
new Node("http", typesenseHost, typesensePort)
|
||||
);
|
||||
|
||||
org.typesense.api.Configuration configuration = new org.typesense.api.Configuration(nodes, Duration.ofSeconds(10), typesenseApiKey);
|
||||
return new Client(configuration);
|
||||
}
|
||||
|
||||
private void closeCurrentResources() {
|
||||
// No need to close datasource - SmartRoutingDataSource handles this
|
||||
// Typesense client doesn't need explicit cleanup
|
||||
currentTypesenseClient = null;
|
||||
// Solr service is managed by Spring - no explicit cleanup needed
|
||||
// Don't clear currentLibraryId here - only when explicitly switching
|
||||
}
|
||||
|
||||
@@ -848,7 +817,6 @@ public class LibraryService implements ApplicationContextAware {
|
||||
config.put("description", library.getDescription());
|
||||
config.put("passwordHash", library.getPasswordHash());
|
||||
config.put("dbName", library.getDbName());
|
||||
config.put("typesenseCollection", library.getTypesenseCollection());
|
||||
config.put("imagePath", library.getImagePath());
|
||||
config.put("initialized", library.isInitialized());
|
||||
|
||||
|
||||
@@ -0,0 +1,257 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.config.SolrProperties;
|
||||
import com.storycove.dto.LibraryOverviewStatsDto;
|
||||
import com.storycove.dto.LibraryOverviewStatsDto.StoryWordCountDto;
|
||||
import org.apache.solr.client.solrj.SolrClient;
|
||||
import org.apache.solr.client.solrj.SolrQuery;
|
||||
import org.apache.solr.client.solrj.SolrServerException;
|
||||
import org.apache.solr.client.solrj.response.FacetField;
|
||||
import org.apache.solr.client.solrj.response.QueryResponse;
|
||||
import org.apache.solr.common.SolrDocument;
|
||||
import org.apache.solr.common.params.StatsParams;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
|
||||
import org.springframework.stereotype.Service;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.util.Map;
|
||||
|
||||
@Service
|
||||
@ConditionalOnProperty(
|
||||
value = "storycove.search.engine",
|
||||
havingValue = "solr",
|
||||
matchIfMissing = false
|
||||
)
|
||||
public class LibraryStatisticsService {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(LibraryStatisticsService.class);
|
||||
private static final int WORDS_PER_MINUTE = 250;
|
||||
|
||||
@Autowired(required = false)
|
||||
private SolrClient solrClient;
|
||||
|
||||
@Autowired
|
||||
private SolrProperties properties;
|
||||
|
||||
@Autowired
|
||||
private LibraryService libraryService;
|
||||
|
||||
/**
|
||||
* Get overview statistics for a library
|
||||
*/
|
||||
public LibraryOverviewStatsDto getOverviewStatistics(String libraryId) throws IOException, SolrServerException {
|
||||
LibraryOverviewStatsDto stats = new LibraryOverviewStatsDto();
|
||||
|
||||
// Collection Overview
|
||||
stats.setTotalStories(getTotalStories(libraryId));
|
||||
stats.setTotalAuthors(getTotalAuthors(libraryId));
|
||||
stats.setTotalSeries(getTotalSeries(libraryId));
|
||||
stats.setTotalTags(getTotalTags(libraryId));
|
||||
stats.setTotalCollections(getTotalCollections(libraryId));
|
||||
stats.setUniqueSourceDomains(getUniqueSourceDomains(libraryId));
|
||||
|
||||
// Content Metrics - use Solr Stats Component
|
||||
WordCountStats wordStats = getWordCountStatistics(libraryId);
|
||||
stats.setTotalWordCount(wordStats.sum);
|
||||
stats.setAverageWordsPerStory(wordStats.mean);
|
||||
stats.setLongestStory(getLongestStory(libraryId));
|
||||
stats.setShortestStory(getShortestStory(libraryId));
|
||||
|
||||
// Reading Time
|
||||
stats.setTotalReadingTimeMinutes(wordStats.sum / WORDS_PER_MINUTE);
|
||||
stats.setAverageReadingTimeMinutes(wordStats.mean / WORDS_PER_MINUTE);
|
||||
|
||||
return stats;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get total number of stories in library
|
||||
*/
|
||||
private long getTotalStories(String libraryId) throws IOException, SolrServerException {
|
||||
SolrQuery query = new SolrQuery("*:*");
|
||||
query.addFilterQuery("libraryId:" + libraryId);
|
||||
query.setRows(0); // We only want the count
|
||||
|
||||
QueryResponse response = solrClient.query(properties.getCores().getStories(), query);
|
||||
return response.getResults().getNumFound();
|
||||
}
|
||||
|
||||
/**
|
||||
* Get total number of authors in library
|
||||
*/
|
||||
private long getTotalAuthors(String libraryId) throws IOException, SolrServerException {
|
||||
SolrQuery query = new SolrQuery("*:*");
|
||||
query.addFilterQuery("libraryId:" + libraryId);
|
||||
query.setRows(0);
|
||||
|
||||
QueryResponse response = solrClient.query(properties.getCores().getAuthors(), query);
|
||||
return response.getResults().getNumFound();
|
||||
}
|
||||
|
||||
/**
|
||||
* Get total number of series using faceting on seriesId
|
||||
*/
|
||||
private long getTotalSeries(String libraryId) throws IOException, SolrServerException {
|
||||
SolrQuery query = new SolrQuery("*:*");
|
||||
query.addFilterQuery("libraryId:" + libraryId);
|
||||
query.addFilterQuery("seriesId:[* TO *]"); // Only stories that have a series
|
||||
query.setRows(0);
|
||||
query.setFacet(true);
|
||||
query.addFacetField("seriesId");
|
||||
query.setFacetLimit(-1); // Get all unique series
|
||||
|
||||
QueryResponse response = solrClient.query(properties.getCores().getStories(), query);
|
||||
FacetField seriesFacet = response.getFacetField("seriesId");
|
||||
|
||||
return (seriesFacet != null && seriesFacet.getValues() != null)
|
||||
? seriesFacet.getValueCount()
|
||||
: 0;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get total number of unique tags using faceting
|
||||
*/
|
||||
private long getTotalTags(String libraryId) throws IOException, SolrServerException {
|
||||
SolrQuery query = new SolrQuery("*:*");
|
||||
query.addFilterQuery("libraryId:" + libraryId);
|
||||
query.setRows(0);
|
||||
query.setFacet(true);
|
||||
query.addFacetField("tagNames");
|
||||
query.setFacetLimit(-1); // Get all unique tags
|
||||
|
||||
QueryResponse response = solrClient.query(properties.getCores().getStories(), query);
|
||||
FacetField tagsFacet = response.getFacetField("tagNames");
|
||||
|
||||
return (tagsFacet != null && tagsFacet.getValues() != null)
|
||||
? tagsFacet.getValueCount()
|
||||
: 0;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get total number of collections
|
||||
*/
|
||||
private long getTotalCollections(String libraryId) throws IOException, SolrServerException {
|
||||
SolrQuery query = new SolrQuery("*:*");
|
||||
query.addFilterQuery("libraryId:" + libraryId);
|
||||
query.setRows(0);
|
||||
|
||||
QueryResponse response = solrClient.query(properties.getCores().getCollections(), query);
|
||||
return response.getResults().getNumFound();
|
||||
}
|
||||
|
||||
/**
|
||||
* Get number of unique source domains using faceting
|
||||
*/
|
||||
private long getUniqueSourceDomains(String libraryId) throws IOException, SolrServerException {
|
||||
SolrQuery query = new SolrQuery("*:*");
|
||||
query.addFilterQuery("libraryId:" + libraryId);
|
||||
query.addFilterQuery("sourceDomain:[* TO *]"); // Only stories with a source domain
|
||||
query.setRows(0);
|
||||
query.setFacet(true);
|
||||
query.addFacetField("sourceDomain");
|
||||
query.setFacetLimit(-1);
|
||||
|
||||
QueryResponse response = solrClient.query(properties.getCores().getStories(), query);
|
||||
FacetField domainFacet = response.getFacetField("sourceDomain");
|
||||
|
||||
return (domainFacet != null && domainFacet.getValues() != null)
|
||||
? domainFacet.getValueCount()
|
||||
: 0;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get word count statistics using Solr Stats Component
|
||||
*/
|
||||
private WordCountStats getWordCountStatistics(String libraryId) throws IOException, SolrServerException {
|
||||
SolrQuery query = new SolrQuery("*:*");
|
||||
query.addFilterQuery("libraryId:" + libraryId);
|
||||
query.setRows(0);
|
||||
query.setParam(StatsParams.STATS, true);
|
||||
query.setParam(StatsParams.STATS_FIELD, "wordCount");
|
||||
|
||||
QueryResponse response = solrClient.query(properties.getCores().getStories(), query);
|
||||
|
||||
WordCountStats stats = new WordCountStats();
|
||||
|
||||
// Extract stats from response
|
||||
var fieldStatsInfo = response.getFieldStatsInfo();
|
||||
if (fieldStatsInfo != null && fieldStatsInfo.get("wordCount") != null) {
|
||||
var fieldStat = fieldStatsInfo.get("wordCount");
|
||||
|
||||
Object sumObj = fieldStat.getSum();
|
||||
Object meanObj = fieldStat.getMean();
|
||||
|
||||
stats.sum = (sumObj != null) ? ((Number) sumObj).longValue() : 0L;
|
||||
stats.mean = (meanObj != null) ? ((Number) meanObj).doubleValue() : 0.0;
|
||||
}
|
||||
|
||||
return stats;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get the longest story in the library
|
||||
*/
|
||||
private StoryWordCountDto getLongestStory(String libraryId) throws IOException, SolrServerException {
|
||||
SolrQuery query = new SolrQuery("*:*");
|
||||
query.addFilterQuery("libraryId:" + libraryId);
|
||||
query.addFilterQuery("wordCount:[1 TO *]"); // Exclude stories with 0 words
|
||||
query.setSort("wordCount", SolrQuery.ORDER.desc);
|
||||
query.setRows(1);
|
||||
query.setFields("id", "title", "authorName", "wordCount");
|
||||
|
||||
QueryResponse response = solrClient.query(properties.getCores().getStories(), query);
|
||||
|
||||
if (response.getResults().isEmpty()) {
|
||||
return null;
|
||||
}
|
||||
|
||||
SolrDocument doc = response.getResults().get(0);
|
||||
return createStoryWordCountDto(doc);
|
||||
}
|
||||
|
||||
/**
|
||||
* Get the shortest story in the library (excluding 0 word count)
|
||||
*/
|
||||
private StoryWordCountDto getShortestStory(String libraryId) throws IOException, SolrServerException {
|
||||
SolrQuery query = new SolrQuery("*:*");
|
||||
query.addFilterQuery("libraryId:" + libraryId);
|
||||
query.addFilterQuery("wordCount:[1 TO *]"); // Exclude stories with 0 words
|
||||
query.setSort("wordCount", SolrQuery.ORDER.asc);
|
||||
query.setRows(1);
|
||||
query.setFields("id", "title", "authorName", "wordCount");
|
||||
|
||||
QueryResponse response = solrClient.query(properties.getCores().getStories(), query);
|
||||
|
||||
if (response.getResults().isEmpty()) {
|
||||
return null;
|
||||
}
|
||||
|
||||
SolrDocument doc = response.getResults().get(0);
|
||||
return createStoryWordCountDto(doc);
|
||||
}
|
||||
|
||||
/**
|
||||
* Helper method to create StoryWordCountDto from Solr document
|
||||
*/
|
||||
private StoryWordCountDto createStoryWordCountDto(SolrDocument doc) {
|
||||
String id = (String) doc.getFieldValue("id");
|
||||
String title = (String) doc.getFieldValue("title");
|
||||
String authorName = (String) doc.getFieldValue("authorName");
|
||||
Object wordCountObj = doc.getFieldValue("wordCount");
|
||||
int wordCount = (wordCountObj != null) ? ((Number) wordCountObj).intValue() : 0;
|
||||
long readingTime = wordCount / WORDS_PER_MINUTE;
|
||||
|
||||
return new StoryWordCountDto(id, title, authorName, wordCount, readingTime);
|
||||
}
|
||||
|
||||
/**
|
||||
* Helper class to hold word count statistics
|
||||
*/
|
||||
private static class WordCountStats {
|
||||
long sum = 0;
|
||||
double mean = 0.0;
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,91 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.entity.RefreshToken;
|
||||
import com.storycove.repository.RefreshTokenRepository;
|
||||
import com.storycove.util.JwtUtil;
|
||||
import jakarta.transaction.Transactional;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.scheduling.annotation.Scheduled;
|
||||
import org.springframework.stereotype.Service;
|
||||
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.Optional;
|
||||
|
||||
@Service
|
||||
public class RefreshTokenService {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(RefreshTokenService.class);
|
||||
|
||||
private final RefreshTokenRepository refreshTokenRepository;
|
||||
private final JwtUtil jwtUtil;
|
||||
|
||||
public RefreshTokenService(RefreshTokenRepository refreshTokenRepository, JwtUtil jwtUtil) {
|
||||
this.refreshTokenRepository = refreshTokenRepository;
|
||||
this.jwtUtil = jwtUtil;
|
||||
}
|
||||
|
||||
/**
|
||||
* Create a new refresh token
|
||||
*/
|
||||
public RefreshToken createRefreshToken(String libraryId, String userAgent, String ipAddress) {
|
||||
String token = jwtUtil.generateRefreshToken();
|
||||
LocalDateTime expiresAt = LocalDateTime.now().plusSeconds(jwtUtil.getRefreshExpirationMs() / 1000);
|
||||
|
||||
RefreshToken refreshToken = new RefreshToken(token, expiresAt, libraryId, userAgent, ipAddress);
|
||||
return refreshTokenRepository.save(refreshToken);
|
||||
}
|
||||
|
||||
/**
|
||||
* Find a refresh token by its token string
|
||||
*/
|
||||
public Optional<RefreshToken> findByToken(String token) {
|
||||
return refreshTokenRepository.findByToken(token);
|
||||
}
|
||||
|
||||
/**
|
||||
* Verify and validate a refresh token
|
||||
*/
|
||||
public Optional<RefreshToken> verifyRefreshToken(String token) {
|
||||
return refreshTokenRepository.findByToken(token)
|
||||
.filter(RefreshToken::isValid);
|
||||
}
|
||||
|
||||
/**
|
||||
* Revoke a specific refresh token
|
||||
*/
|
||||
@Transactional
|
||||
public void revokeToken(RefreshToken token) {
|
||||
token.setRevokedAt(LocalDateTime.now());
|
||||
refreshTokenRepository.save(token);
|
||||
}
|
||||
|
||||
/**
|
||||
* Revoke all refresh tokens for a specific library
|
||||
*/
|
||||
@Transactional
|
||||
public void revokeAllByLibraryId(String libraryId) {
|
||||
refreshTokenRepository.revokeAllByLibraryId(libraryId, LocalDateTime.now());
|
||||
logger.info("Revoked all refresh tokens for library: {}", libraryId);
|
||||
}
|
||||
|
||||
/**
|
||||
* Revoke all refresh tokens (e.g., for logout all)
|
||||
*/
|
||||
@Transactional
|
||||
public void revokeAll() {
|
||||
refreshTokenRepository.revokeAll(LocalDateTime.now());
|
||||
logger.info("Revoked all refresh tokens");
|
||||
}
|
||||
|
||||
/**
|
||||
* Clean up expired tokens periodically
|
||||
* Runs daily at 3 AM
|
||||
*/
|
||||
@Scheduled(cron = "0 0 3 * * ?")
|
||||
@Transactional
|
||||
public void cleanupExpiredTokens() {
|
||||
refreshTokenRepository.deleteExpiredTokens(LocalDateTime.now());
|
||||
logger.info("Cleaned up expired refresh tokens");
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,341 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.dto.AuthorSearchDto;
|
||||
import com.storycove.dto.CollectionDto;
|
||||
import com.storycove.dto.SearchResultDto;
|
||||
import com.storycove.dto.StorySearchDto;
|
||||
import com.storycove.entity.Author;
|
||||
import com.storycove.entity.Collection;
|
||||
import com.storycove.entity.Story;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.stereotype.Service;
|
||||
|
||||
import java.util.List;
|
||||
import java.util.UUID;
|
||||
|
||||
/**
|
||||
* Service adapter that provides a unified interface for search operations.
|
||||
*
|
||||
* This adapter directly delegates to SolrService.
|
||||
*/
|
||||
@Service
|
||||
public class SearchServiceAdapter {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(SearchServiceAdapter.class);
|
||||
|
||||
@Autowired
|
||||
private SolrService solrService;
|
||||
|
||||
// ===============================
|
||||
// SEARCH OPERATIONS
|
||||
// ===============================
|
||||
|
||||
/**
|
||||
* Search stories with unified interface
|
||||
*/
|
||||
public SearchResultDto<StorySearchDto> searchStories(String query, List<String> tags, String author,
|
||||
String series, Integer minWordCount, Integer maxWordCount,
|
||||
Float minRating, Boolean isRead, Boolean isFavorite,
|
||||
String sortBy, String sortOrder, int page, int size,
|
||||
List<String> facetBy,
|
||||
// Advanced filters
|
||||
String createdAfter, String createdBefore,
|
||||
String lastReadAfter, String lastReadBefore,
|
||||
Boolean unratedOnly, String readingStatus,
|
||||
Boolean hasReadingProgress, Boolean hasCoverImage,
|
||||
String sourceDomain, String seriesFilter,
|
||||
Integer minTagCount, Boolean popularOnly,
|
||||
Boolean hiddenGemsOnly) {
|
||||
logger.info("SearchServiceAdapter: delegating search to SolrService");
|
||||
try {
|
||||
SearchResultDto<StorySearchDto> result = solrService.searchStories(query, tags, author, series, minWordCount, maxWordCount,
|
||||
minRating, isRead, isFavorite, sortBy, sortOrder, page, size, facetBy,
|
||||
createdAfter, createdBefore, lastReadAfter, lastReadBefore, unratedOnly, readingStatus,
|
||||
hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter, minTagCount, popularOnly,
|
||||
hiddenGemsOnly);
|
||||
logger.info("SearchServiceAdapter: received result with {} stories and {} facets",
|
||||
result.getResults().size(), result.getFacets().size());
|
||||
return result;
|
||||
} catch (Exception e) {
|
||||
logger.error("SearchServiceAdapter: error during search", e);
|
||||
throw e;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Get random stories with unified interface
|
||||
*/
|
||||
public List<StorySearchDto> getRandomStories(int count, List<String> tags, String author,
|
||||
String series, Integer minWordCount, Integer maxWordCount,
|
||||
Float minRating, Boolean isRead, Boolean isFavorite,
|
||||
Long seed) {
|
||||
return solrService.getRandomStories(count, tags, author, series, minWordCount, maxWordCount,
|
||||
minRating, isRead, isFavorite, seed);
|
||||
}
|
||||
|
||||
/**
|
||||
* Recreate search indices
|
||||
*/
|
||||
public void recreateIndices() {
|
||||
try {
|
||||
solrService.recreateIndices();
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to recreate search indices", e);
|
||||
throw new RuntimeException("Failed to recreate search indices", e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Perform complete reindex of all data
|
||||
*/
|
||||
public void performCompleteReindex() {
|
||||
try {
|
||||
recreateIndices();
|
||||
logger.info("Search indices recreated successfully");
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to perform complete reindex", e);
|
||||
throw new RuntimeException("Failed to perform complete reindex", e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Get random story ID with unified interface
|
||||
*/
|
||||
public String getRandomStoryId(Long seed) {
|
||||
return solrService.getRandomStoryId(seed);
|
||||
}
|
||||
|
||||
/**
|
||||
* Search authors with unified interface
|
||||
*/
|
||||
public List<AuthorSearchDto> searchAuthors(String query, int limit) {
|
||||
return solrService.searchAuthors(query, limit);
|
||||
}
|
||||
|
||||
/**
|
||||
* Get tag suggestions with unified interface
|
||||
*/
|
||||
public List<String> getTagSuggestions(String query, int limit) {
|
||||
return solrService.getTagSuggestions(query, limit);
|
||||
}
|
||||
|
||||
/**
|
||||
* Search collections with unified interface
|
||||
*/
|
||||
public SearchResultDto<CollectionDto> searchCollections(String query, List<String> tags,
|
||||
boolean includeArchived, int page, int limit) {
|
||||
return solrService.searchCollections(query, tags, includeArchived, page, limit);
|
||||
}
|
||||
|
||||
// ===============================
|
||||
// INDEX OPERATIONS
|
||||
// ===============================
|
||||
|
||||
/**
|
||||
* Index a story in Solr
|
||||
*/
|
||||
public void indexStory(Story story) {
|
||||
try {
|
||||
solrService.indexStory(story);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to index story {}", story.getId(), e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Update a story in Solr
|
||||
*/
|
||||
public void updateStory(Story story) {
|
||||
try {
|
||||
solrService.updateStory(story);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to update story {}", story.getId(), e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Delete a story from Solr
|
||||
*/
|
||||
public void deleteStory(UUID storyId) {
|
||||
try {
|
||||
solrService.deleteStory(storyId);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to delete story {}", storyId, e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Index an author in Solr
|
||||
*/
|
||||
public void indexAuthor(Author author) {
|
||||
try {
|
||||
solrService.indexAuthor(author);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to index author {}", author.getId(), e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Update an author in Solr
|
||||
*/
|
||||
public void updateAuthor(Author author) {
|
||||
try {
|
||||
solrService.updateAuthor(author);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to update author {}", author.getId(), e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Delete an author from Solr
|
||||
*/
|
||||
public void deleteAuthor(UUID authorId) {
|
||||
try {
|
||||
solrService.deleteAuthor(authorId);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to delete author {}", authorId, e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Bulk index stories in Solr
|
||||
*/
|
||||
public void bulkIndexStories(List<Story> stories) {
|
||||
try {
|
||||
solrService.bulkIndexStories(stories);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to bulk index {} stories", stories.size(), e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Bulk index authors in Solr
|
||||
*/
|
||||
public void bulkIndexAuthors(List<Author> authors) {
|
||||
try {
|
||||
solrService.bulkIndexAuthors(authors);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to bulk index {} authors", authors.size(), e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Index a collection in Solr
|
||||
*/
|
||||
public void indexCollection(Collection collection) {
|
||||
try {
|
||||
solrService.indexCollection(collection);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to index collection {}", collection.getId(), e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Update a collection in Solr
|
||||
*/
|
||||
public void updateCollection(Collection collection) {
|
||||
try {
|
||||
solrService.updateCollection(collection);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to update collection {}", collection.getId(), e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Delete a collection from Solr
|
||||
*/
|
||||
public void deleteCollection(UUID collectionId) {
|
||||
try {
|
||||
solrService.deleteCollection(collectionId);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to delete collection {}", collectionId, e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Bulk index collections in Solr
|
||||
*/
|
||||
public void bulkIndexCollections(List<Collection> collections) {
|
||||
try {
|
||||
solrService.bulkIndexCollections(collections);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to bulk index {} collections", collections.size(), e);
|
||||
}
|
||||
}
|
||||
|
||||
// ===============================
|
||||
// UTILITY METHODS
|
||||
// ===============================
|
||||
|
||||
/**
|
||||
* Check if search service is available and healthy
|
||||
*/
|
||||
public boolean isSearchServiceAvailable() {
|
||||
return solrService.testConnection();
|
||||
}
|
||||
|
||||
/**
|
||||
* Get current search engine name
|
||||
*/
|
||||
public String getCurrentSearchEngine() {
|
||||
return "solr";
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if dual-write is enabled
|
||||
*/
|
||||
public boolean isDualWriteEnabled() {
|
||||
return false; // No longer supported
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if we can switch to Solr
|
||||
*/
|
||||
public boolean canSwitchToSolr() {
|
||||
return true; // Already using Solr
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if we can switch to Typesense
|
||||
*/
|
||||
public boolean canSwitchToTypesense() {
|
||||
return false; // Typesense no longer available
|
||||
}
|
||||
|
||||
/**
|
||||
* Get current search status for admin interface
|
||||
*/
|
||||
public SearchStatus getSearchStatus() {
|
||||
return new SearchStatus(
|
||||
"solr",
|
||||
false, // no dual-write
|
||||
false, // no typesense
|
||||
solrService.testConnection()
|
||||
);
|
||||
}
|
||||
|
||||
/**
|
||||
* DTO for search status
|
||||
*/
|
||||
public static class SearchStatus {
|
||||
private final String primaryEngine;
|
||||
private final boolean dualWrite;
|
||||
private final boolean typesenseAvailable;
|
||||
private final boolean solrAvailable;
|
||||
|
||||
public SearchStatus(String primaryEngine, boolean dualWrite,
|
||||
boolean typesenseAvailable, boolean solrAvailable) {
|
||||
this.primaryEngine = primaryEngine;
|
||||
this.dualWrite = dualWrite;
|
||||
this.typesenseAvailable = typesenseAvailable;
|
||||
this.solrAvailable = solrAvailable;
|
||||
}
|
||||
|
||||
public String getPrimaryEngine() { return primaryEngine; }
|
||||
public boolean isDualWrite() { return dualWrite; }
|
||||
public boolean isTypesenseAvailable() { return typesenseAvailable; }
|
||||
public boolean isSolrAvailable() { return solrAvailable; }
|
||||
}
|
||||
}
|
||||
1413
backend/src/main/java/com/storycove/service/SolrService.java
Normal file
1413
backend/src/main/java/com/storycove/service/SolrService.java
Normal file
File diff suppressed because it is too large
Load Diff
@@ -42,7 +42,7 @@ public class StoryService {
|
||||
private final TagService tagService;
|
||||
private final SeriesService seriesService;
|
||||
private final HtmlSanitizationService sanitizationService;
|
||||
private final TypesenseService typesenseService;
|
||||
private final SearchServiceAdapter searchServiceAdapter;
|
||||
|
||||
@Autowired
|
||||
public StoryService(StoryRepository storyRepository,
|
||||
@@ -52,7 +52,7 @@ public class StoryService {
|
||||
TagService tagService,
|
||||
SeriesService seriesService,
|
||||
HtmlSanitizationService sanitizationService,
|
||||
@Autowired(required = false) TypesenseService typesenseService) {
|
||||
SearchServiceAdapter searchServiceAdapter) {
|
||||
this.storyRepository = storyRepository;
|
||||
this.tagRepository = tagRepository;
|
||||
this.readingPositionRepository = readingPositionRepository;
|
||||
@@ -60,7 +60,7 @@ public class StoryService {
|
||||
this.tagService = tagService;
|
||||
this.seriesService = seriesService;
|
||||
this.sanitizationService = sanitizationService;
|
||||
this.typesenseService = typesenseService;
|
||||
this.searchServiceAdapter = searchServiceAdapter;
|
||||
}
|
||||
|
||||
@Transactional(readOnly = true)
|
||||
@@ -239,10 +239,8 @@ public class StoryService {
|
||||
story.addTag(tag);
|
||||
Story savedStory = storyRepository.save(story);
|
||||
|
||||
// Update Typesense index with new tag information
|
||||
if (typesenseService != null) {
|
||||
typesenseService.updateStory(savedStory);
|
||||
}
|
||||
// Update search index with new tag information
|
||||
searchServiceAdapter.updateStory(savedStory);
|
||||
|
||||
return savedStory;
|
||||
}
|
||||
@@ -256,10 +254,8 @@ public class StoryService {
|
||||
story.removeTag(tag);
|
||||
Story savedStory = storyRepository.save(story);
|
||||
|
||||
// Update Typesense index with updated tag information
|
||||
if (typesenseService != null) {
|
||||
typesenseService.updateStory(savedStory);
|
||||
}
|
||||
// Update search index with updated tag information
|
||||
searchServiceAdapter.updateStory(savedStory);
|
||||
|
||||
return savedStory;
|
||||
}
|
||||
@@ -274,10 +270,8 @@ public class StoryService {
|
||||
story.setRating(rating);
|
||||
Story savedStory = storyRepository.save(story);
|
||||
|
||||
// Update Typesense index with new rating
|
||||
if (typesenseService != null) {
|
||||
typesenseService.updateStory(savedStory);
|
||||
}
|
||||
// Update search index with new rating
|
||||
searchServiceAdapter.updateStory(savedStory);
|
||||
|
||||
return savedStory;
|
||||
}
|
||||
@@ -292,10 +286,8 @@ public class StoryService {
|
||||
story.updateReadingProgress(position);
|
||||
Story savedStory = storyRepository.save(story);
|
||||
|
||||
// Update Typesense index with new reading progress
|
||||
if (typesenseService != null) {
|
||||
typesenseService.updateStory(savedStory);
|
||||
}
|
||||
// Update search index with new reading progress
|
||||
searchServiceAdapter.updateStory(savedStory);
|
||||
|
||||
return savedStory;
|
||||
}
|
||||
@@ -313,10 +305,8 @@ public class StoryService {
|
||||
|
||||
Story savedStory = storyRepository.save(story);
|
||||
|
||||
// Update Typesense index with new reading status
|
||||
if (typesenseService != null) {
|
||||
typesenseService.updateStory(savedStory);
|
||||
}
|
||||
// Update search index with new reading status
|
||||
searchServiceAdapter.updateStory(savedStory);
|
||||
|
||||
return savedStory;
|
||||
}
|
||||
@@ -358,10 +348,8 @@ public class StoryService {
|
||||
updateStoryTags(savedStory, story.getTags());
|
||||
}
|
||||
|
||||
// Index in Typesense (if available)
|
||||
if (typesenseService != null) {
|
||||
typesenseService.indexStory(savedStory);
|
||||
}
|
||||
// Index in search engine
|
||||
searchServiceAdapter.indexStory(savedStory);
|
||||
|
||||
return savedStory;
|
||||
}
|
||||
@@ -388,10 +376,8 @@ public class StoryService {
|
||||
updateStoryTagsByNames(savedStory, tagNames);
|
||||
}
|
||||
|
||||
// Index in Typesense (if available)
|
||||
if (typesenseService != null) {
|
||||
typesenseService.indexStory(savedStory);
|
||||
}
|
||||
// Index in search engine
|
||||
searchServiceAdapter.indexStory(savedStory);
|
||||
|
||||
return savedStory;
|
||||
}
|
||||
@@ -409,10 +395,8 @@ public class StoryService {
|
||||
updateStoryFields(existingStory, storyUpdates);
|
||||
Story updatedStory = storyRepository.save(existingStory);
|
||||
|
||||
// Update in Typesense (if available)
|
||||
if (typesenseService != null) {
|
||||
typesenseService.updateStory(updatedStory);
|
||||
}
|
||||
// Update in search engine
|
||||
searchServiceAdapter.updateStory(updatedStory);
|
||||
|
||||
return updatedStory;
|
||||
}
|
||||
@@ -432,10 +416,20 @@ public class StoryService {
|
||||
|
||||
Story updatedStory = storyRepository.save(existingStory);
|
||||
|
||||
// Update in Typesense (if available)
|
||||
if (typesenseService != null) {
|
||||
typesenseService.updateStory(updatedStory);
|
||||
}
|
||||
// Update in search engine
|
||||
searchServiceAdapter.updateStory(updatedStory);
|
||||
|
||||
return updatedStory;
|
||||
}
|
||||
|
||||
public Story updateContentOnly(UUID id, String contentHtml) {
|
||||
Story existingStory = findById(id);
|
||||
existingStory.setContentHtml(contentHtml);
|
||||
|
||||
Story updatedStory = storyRepository.save(existingStory);
|
||||
|
||||
// Update in search engine since content changed
|
||||
searchServiceAdapter.updateStory(updatedStory);
|
||||
|
||||
return updatedStory;
|
||||
}
|
||||
@@ -455,10 +449,8 @@ public class StoryService {
|
||||
// Create a copy to avoid ConcurrentModificationException
|
||||
new ArrayList<>(story.getTags()).forEach(tag -> story.removeTag(tag));
|
||||
|
||||
// Delete from Typesense first (if available)
|
||||
if (typesenseService != null) {
|
||||
typesenseService.deleteStory(story.getId().toString());
|
||||
}
|
||||
// Delete from search engine first
|
||||
searchServiceAdapter.deleteStory(story.getId());
|
||||
|
||||
storyRepository.delete(story);
|
||||
}
|
||||
@@ -674,7 +666,7 @@ public class StoryService {
|
||||
|
||||
/**
|
||||
* Find a random story based on optional filters.
|
||||
* Uses Typesense for consistency with Library search functionality.
|
||||
* Uses search service for consistency with Library search functionality.
|
||||
* Supports text search and multiple tags using the same logic as the Library view.
|
||||
* @param searchQuery Optional search query
|
||||
* @param tags Optional list of tags to filter by
|
||||
@@ -693,7 +685,7 @@ public class StoryService {
|
||||
|
||||
/**
|
||||
* Find a random story based on optional filters with seed support.
|
||||
* Uses Typesense for consistency with Library search functionality.
|
||||
* Uses search service for consistency with Library search functionality.
|
||||
* Supports text search and multiple tags using the same logic as the Library view.
|
||||
* @param searchQuery Optional search query
|
||||
* @param tags Optional list of tags to filter by
|
||||
@@ -711,21 +703,16 @@ public class StoryService {
|
||||
String seriesFilter, Integer minTagCount,
|
||||
Boolean popularOnly, Boolean hiddenGemsOnly) {
|
||||
|
||||
// Use Typesense if available for consistency with Library search
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
Optional<UUID> randomStoryId = typesenseService.getRandomStoryId(searchQuery, tags, seed,
|
||||
minWordCount, maxWordCount, createdAfter, createdBefore, lastReadAfter, lastReadBefore,
|
||||
minRating, maxRating, unratedOnly, readingStatus, hasReadingProgress, hasCoverImage,
|
||||
sourceDomain, seriesFilter, minTagCount, popularOnly, hiddenGemsOnly);
|
||||
if (randomStoryId.isPresent()) {
|
||||
return storyRepository.findById(randomStoryId.get());
|
||||
}
|
||||
return Optional.empty();
|
||||
} catch (Exception e) {
|
||||
// Fallback to database queries if Typesense fails
|
||||
logger.warn("Typesense random story lookup failed, falling back to database queries", e);
|
||||
// Use search service for consistency with Library search
|
||||
try {
|
||||
String randomStoryId = searchServiceAdapter.getRandomStoryId(seed);
|
||||
if (randomStoryId != null) {
|
||||
return storyRepository.findById(UUID.fromString(randomStoryId));
|
||||
}
|
||||
return Optional.empty();
|
||||
} catch (Exception e) {
|
||||
// Fallback to database queries if search service fails
|
||||
logger.warn("Search service random story lookup failed, falling back to database queries", e);
|
||||
}
|
||||
|
||||
// Fallback to repository-based implementation (global routing handles library selection)
|
||||
|
||||
File diff suppressed because it is too large
Load Diff
File diff suppressed because it is too large
Load Diff
@@ -22,9 +22,12 @@ public class JwtUtil {
|
||||
// Security: Generate new secret on each startup to invalidate all existing tokens
|
||||
private String secret;
|
||||
|
||||
@Value("${storycove.jwt.expiration:86400000}") // 24 hours default
|
||||
@Value("${storycove.jwt.expiration:86400000}") // 24 hours default (access token)
|
||||
private Long expiration;
|
||||
|
||||
@Value("${storycove.jwt.refresh-expiration:1209600000}") // 14 days default (refresh token)
|
||||
private Long refreshExpiration;
|
||||
|
||||
@PostConstruct
|
||||
public void initialize() {
|
||||
// Generate a new random secret on startup to invalidate all existing JWT tokens
|
||||
@@ -38,6 +41,17 @@ public class JwtUtil {
|
||||
logger.info("Users will need to re-authenticate after application restart for security");
|
||||
}
|
||||
|
||||
public Long getRefreshExpirationMs() {
|
||||
return refreshExpiration;
|
||||
}
|
||||
|
||||
public String generateRefreshToken() {
|
||||
SecureRandom random = new SecureRandom();
|
||||
byte[] tokenBytes = new byte[32]; // 256 bits
|
||||
random.nextBytes(tokenBytes);
|
||||
return Base64.getUrlEncoder().withoutPadding().encodeToString(tokenBytes);
|
||||
}
|
||||
|
||||
private SecretKey getSigningKey() {
|
||||
return Keys.hmacShaKeyFor(secret.getBytes());
|
||||
}
|
||||
|
||||
@@ -4,6 +4,11 @@ spring:
|
||||
username: ${SPRING_DATASOURCE_USERNAME:storycove}
|
||||
password: ${SPRING_DATASOURCE_PASSWORD:password}
|
||||
driver-class-name: org.postgresql.Driver
|
||||
hikari:
|
||||
connection-timeout: 60000 # 60 seconds
|
||||
idle-timeout: 300000 # 5 minutes
|
||||
max-lifetime: 1800000 # 30 minutes
|
||||
maximum-pool-size: 20
|
||||
|
||||
jpa:
|
||||
hibernate:
|
||||
@@ -16,11 +21,19 @@ spring:
|
||||
|
||||
servlet:
|
||||
multipart:
|
||||
max-file-size: 256MB # Increased for backup restore
|
||||
max-request-size: 260MB # Slightly higher to account for form data
|
||||
max-file-size: 600MB # Increased for large backup restore (425MB+)
|
||||
max-request-size: 610MB # Slightly higher to account for form data
|
||||
|
||||
jackson:
|
||||
serialization:
|
||||
write-dates-as-timestamps: false
|
||||
deserialization:
|
||||
adjust-dates-to-context-time-zone: false
|
||||
|
||||
server:
|
||||
port: 8080
|
||||
tomcat:
|
||||
max-http-request-size: 650MB # Tomcat HTTP request size limit (separate from multipart)
|
||||
|
||||
storycove:
|
||||
app:
|
||||
@@ -29,17 +42,68 @@ storycove:
|
||||
allowed-origins: ${STORYCOVE_CORS_ALLOWED_ORIGINS:http://localhost:3000,http://localhost:6925}
|
||||
jwt:
|
||||
secret: ${JWT_SECRET} # REQUIRED: Must be at least 32 characters, no default for security
|
||||
expiration: 86400000 # 24 hours
|
||||
expiration: 86400000 # 24 hours (access token)
|
||||
refresh-expiration: 1209600000 # 14 days (refresh token)
|
||||
auth:
|
||||
password: ${APP_PASSWORD} # REQUIRED: No default password for security
|
||||
typesense:
|
||||
api-key: ${TYPESENSE_API_KEY:xyz}
|
||||
host: ${TYPESENSE_HOST:localhost}
|
||||
port: ${TYPESENSE_PORT:8108}
|
||||
enabled: ${TYPESENSE_ENABLED:true}
|
||||
reindex-interval: ${TYPESENSE_REINDEX_INTERVAL:3600000} # 1 hour in milliseconds
|
||||
search:
|
||||
engine: solr # Apache Solr search engine
|
||||
solr:
|
||||
# Connection settings
|
||||
url: ${SOLR_URL:http://solr:8983/solr}
|
||||
username: ${SOLR_USERNAME:}
|
||||
password: ${SOLR_PASSWORD:}
|
||||
|
||||
# Core configuration
|
||||
cores:
|
||||
stories: ${SOLR_STORIES_CORE:storycove_stories}
|
||||
authors: ${SOLR_AUTHORS_CORE:storycove_authors}
|
||||
|
||||
# Connection settings
|
||||
connection:
|
||||
timeout: ${SOLR_CONNECTION_TIMEOUT:30000} # 30 seconds
|
||||
socket-timeout: ${SOLR_SOCKET_TIMEOUT:60000} # 60 seconds
|
||||
max-connections-per-route: ${SOLR_MAX_CONN_PER_ROUTE:10}
|
||||
max-connections-total: ${SOLR_MAX_CONN_TOTAL:30}
|
||||
retry-on-failure: ${SOLR_RETRY_ON_FAILURE:true}
|
||||
max-retries: ${SOLR_MAX_RETRIES:3}
|
||||
|
||||
# Query settings
|
||||
query:
|
||||
default-rows: ${SOLR_DEFAULT_ROWS:10}
|
||||
max-rows: ${SOLR_MAX_ROWS:1000}
|
||||
default-operator: ${SOLR_DEFAULT_OPERATOR:AND}
|
||||
highlight: ${SOLR_ENABLE_HIGHLIGHT:true}
|
||||
facets: ${SOLR_ENABLE_FACETS:true}
|
||||
|
||||
# Commit settings
|
||||
commit:
|
||||
soft-commit: ${SOLR_SOFT_COMMIT:true}
|
||||
commit-within: ${SOLR_COMMIT_WITHIN:1000} # 1 second
|
||||
wait-searcher: ${SOLR_WAIT_SEARCHER:false}
|
||||
|
||||
# Health and monitoring
|
||||
health:
|
||||
check-interval: ${SOLR_HEALTH_CHECK_INTERVAL:30000} # 30 seconds
|
||||
slow-query-threshold: ${SOLR_SLOW_QUERY_THRESHOLD:5000} # 5 seconds
|
||||
enable-metrics: ${SOLR_ENABLE_METRICS:true}
|
||||
images:
|
||||
storage-path: ${IMAGE_STORAGE_PATH:/app/images}
|
||||
automatic-backup:
|
||||
dir: ${AUTOMATIC_BACKUP_DIR:/app/automatic-backups}
|
||||
|
||||
management:
|
||||
endpoints:
|
||||
web:
|
||||
exposure:
|
||||
include: health,info,prometheus
|
||||
endpoint:
|
||||
health:
|
||||
show-details: when-authorized
|
||||
show-components: always
|
||||
health:
|
||||
solr:
|
||||
enabled: ${SOLR_HEALTH_ENABLED:true}
|
||||
|
||||
logging:
|
||||
level:
|
||||
|
||||
@@ -1,12 +1,8 @@
|
||||
package com.storycove.config;
|
||||
|
||||
import com.storycove.service.TypesenseService;
|
||||
import org.springframework.boot.test.context.TestConfiguration;
|
||||
import org.springframework.boot.test.mock.mockito.MockBean;
|
||||
|
||||
@TestConfiguration
|
||||
public class TestConfig {
|
||||
|
||||
@MockBean
|
||||
public TypesenseService typesenseService;
|
||||
// Test configuration
|
||||
}
|
||||
@@ -44,8 +44,9 @@ class AuthorServiceTest {
|
||||
testAuthor.setId(testId);
|
||||
testAuthor.setNotes("Test notes");
|
||||
|
||||
// Initialize service with null TypesenseService (which is allowed for tests)
|
||||
authorService = new AuthorService(authorRepository, null);
|
||||
// Initialize service with mock SearchServiceAdapter
|
||||
SearchServiceAdapter mockSearchServiceAdapter = mock(SearchServiceAdapter.class);
|
||||
authorService = new AuthorService(authorRepository, mockSearchServiceAdapter);
|
||||
}
|
||||
|
||||
@Test
|
||||
|
||||
@@ -0,0 +1,465 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.dto.CollectionDto;
|
||||
import com.storycove.dto.SearchResultDto;
|
||||
import com.storycove.entity.Collection;
|
||||
import com.storycove.entity.CollectionStory;
|
||||
import com.storycove.entity.Story;
|
||||
import com.storycove.entity.Tag;
|
||||
import com.storycove.repository.CollectionRepository;
|
||||
import com.storycove.repository.CollectionStoryRepository;
|
||||
import com.storycove.repository.StoryRepository;
|
||||
import com.storycove.repository.TagRepository;
|
||||
import com.storycove.service.exception.ResourceNotFoundException;
|
||||
import org.junit.jupiter.api.BeforeEach;
|
||||
import org.junit.jupiter.api.Test;
|
||||
import org.junit.jupiter.api.DisplayName;
|
||||
import org.junit.jupiter.api.extension.ExtendWith;
|
||||
import org.mockito.InjectMocks;
|
||||
import org.mockito.Mock;
|
||||
import org.mockito.junit.jupiter.MockitoExtension;
|
||||
|
||||
import java.util.*;
|
||||
|
||||
import static org.junit.jupiter.api.Assertions.*;
|
||||
import static org.mockito.ArgumentMatchers.*;
|
||||
import static org.mockito.Mockito.*;
|
||||
|
||||
@ExtendWith(MockitoExtension.class)
|
||||
class CollectionServiceTest {
|
||||
|
||||
@Mock
|
||||
private CollectionRepository collectionRepository;
|
||||
|
||||
@Mock
|
||||
private CollectionStoryRepository collectionStoryRepository;
|
||||
|
||||
@Mock
|
||||
private StoryRepository storyRepository;
|
||||
|
||||
@Mock
|
||||
private TagRepository tagRepository;
|
||||
|
||||
@Mock
|
||||
private SearchServiceAdapter searchServiceAdapter;
|
||||
|
||||
@Mock
|
||||
private ReadingTimeService readingTimeService;
|
||||
|
||||
@InjectMocks
|
||||
private CollectionService collectionService;
|
||||
|
||||
private Collection testCollection;
|
||||
private Story testStory;
|
||||
private Tag testTag;
|
||||
private UUID collectionId;
|
||||
private UUID storyId;
|
||||
|
||||
@BeforeEach
|
||||
void setUp() {
|
||||
collectionId = UUID.randomUUID();
|
||||
storyId = UUID.randomUUID();
|
||||
|
||||
testCollection = new Collection();
|
||||
testCollection.setId(collectionId);
|
||||
testCollection.setName("Test Collection");
|
||||
testCollection.setDescription("Test Description");
|
||||
testCollection.setIsArchived(false);
|
||||
|
||||
testStory = new Story();
|
||||
testStory.setId(storyId);
|
||||
testStory.setTitle("Test Story");
|
||||
testStory.setWordCount(1000);
|
||||
|
||||
testTag = new Tag();
|
||||
testTag.setId(UUID.randomUUID());
|
||||
testTag.setName("test-tag");
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Search Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should search collections using SearchServiceAdapter")
|
||||
void testSearchCollections() {
|
||||
// Arrange
|
||||
CollectionDto dto = new CollectionDto();
|
||||
dto.setId(collectionId);
|
||||
dto.setName("Test Collection");
|
||||
|
||||
SearchResultDto<CollectionDto> searchResult = new SearchResultDto<>(
|
||||
List.of(dto), 1, 0, 10, "test", 100L
|
||||
);
|
||||
|
||||
when(searchServiceAdapter.searchCollections(anyString(), anyList(), anyBoolean(), anyInt(), anyInt()))
|
||||
.thenReturn(searchResult);
|
||||
when(collectionRepository.findById(collectionId))
|
||||
.thenReturn(Optional.of(testCollection));
|
||||
|
||||
// Act
|
||||
SearchResultDto<Collection> result = collectionService.searchCollections("test", null, false, 0, 10);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertEquals(1, result.getTotalHits());
|
||||
assertEquals(1, result.getResults().size());
|
||||
assertEquals(collectionId, result.getResults().get(0).getId());
|
||||
verify(searchServiceAdapter).searchCollections("test", null, false, 0, 10);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle search with tag filters")
|
||||
void testSearchCollectionsWithTags() {
|
||||
// Arrange
|
||||
List<String> tags = List.of("fantasy", "adventure");
|
||||
CollectionDto dto = new CollectionDto();
|
||||
dto.setId(collectionId);
|
||||
|
||||
SearchResultDto<CollectionDto> searchResult = new SearchResultDto<>(
|
||||
List.of(dto), 1, 0, 10, "test", 50L
|
||||
);
|
||||
|
||||
when(searchServiceAdapter.searchCollections(anyString(), eq(tags), anyBoolean(), anyInt(), anyInt()))
|
||||
.thenReturn(searchResult);
|
||||
when(collectionRepository.findById(collectionId))
|
||||
.thenReturn(Optional.of(testCollection));
|
||||
|
||||
// Act
|
||||
SearchResultDto<Collection> result = collectionService.searchCollections("test", tags, false, 0, 10);
|
||||
|
||||
// Assert
|
||||
assertEquals(1, result.getResults().size());
|
||||
verify(searchServiceAdapter).searchCollections("test", tags, false, 0, 10);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should return empty results when search fails")
|
||||
void testSearchCollectionsFailure() {
|
||||
// Arrange
|
||||
when(searchServiceAdapter.searchCollections(anyString(), anyList(), anyBoolean(), anyInt(), anyInt()))
|
||||
.thenThrow(new RuntimeException("Search failed"));
|
||||
|
||||
// Act
|
||||
SearchResultDto<Collection> result = collectionService.searchCollections("test", null, false, 0, 10);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertEquals(0, result.getTotalHits());
|
||||
assertTrue(result.getResults().isEmpty());
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// CRUD Operations Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should find collection by ID")
|
||||
void testFindById() {
|
||||
// Arrange
|
||||
when(collectionRepository.findByIdWithStoriesAndTags(collectionId))
|
||||
.thenReturn(Optional.of(testCollection));
|
||||
|
||||
// Act
|
||||
Collection result = collectionService.findById(collectionId);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertEquals(collectionId, result.getId());
|
||||
assertEquals("Test Collection", result.getName());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should throw exception when collection not found")
|
||||
void testFindByIdNotFound() {
|
||||
// Arrange
|
||||
when(collectionRepository.findByIdWithStoriesAndTags(any()))
|
||||
.thenReturn(Optional.empty());
|
||||
|
||||
// Act & Assert
|
||||
assertThrows(ResourceNotFoundException.class, () -> {
|
||||
collectionService.findById(UUID.randomUUID());
|
||||
});
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should create collection with tags")
|
||||
void testCreateCollection() {
|
||||
// Arrange
|
||||
List<String> tagNames = List.of("fantasy", "adventure");
|
||||
when(tagRepository.findByName("fantasy")).thenReturn(Optional.of(testTag));
|
||||
when(tagRepository.findByName("adventure")).thenReturn(Optional.empty());
|
||||
when(tagRepository.save(any(Tag.class))).thenReturn(testTag);
|
||||
when(collectionRepository.save(any(Collection.class))).thenReturn(testCollection);
|
||||
|
||||
// Act
|
||||
Collection result = collectionService.createCollection("New Collection", "Description", tagNames, null);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
verify(collectionRepository).save(any(Collection.class));
|
||||
verify(tagRepository, times(2)).findByName(anyString());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should create collection with initial stories")
|
||||
void testCreateCollectionWithStories() {
|
||||
// Arrange
|
||||
List<UUID> storyIds = List.of(storyId);
|
||||
when(collectionRepository.save(any(Collection.class))).thenReturn(testCollection);
|
||||
when(storyRepository.findAllById(storyIds)).thenReturn(List.of(testStory));
|
||||
when(collectionStoryRepository.existsByCollectionIdAndStoryId(any(), any())).thenReturn(false);
|
||||
when(collectionStoryRepository.getNextPosition(any())).thenReturn(1000);
|
||||
when(collectionStoryRepository.save(any())).thenReturn(new CollectionStory());
|
||||
when(collectionRepository.findByIdWithStoriesAndTags(any()))
|
||||
.thenReturn(Optional.of(testCollection));
|
||||
|
||||
// Act
|
||||
Collection result = collectionService.createCollection("New Collection", "Description", null, storyIds);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
verify(storyRepository).findAllById(storyIds);
|
||||
verify(collectionStoryRepository).save(any(CollectionStory.class));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should update collection metadata")
|
||||
void testUpdateCollection() {
|
||||
// Arrange
|
||||
when(collectionRepository.findById(collectionId))
|
||||
.thenReturn(Optional.of(testCollection));
|
||||
when(collectionRepository.save(any(Collection.class)))
|
||||
.thenReturn(testCollection);
|
||||
|
||||
// Act
|
||||
Collection result = collectionService.updateCollection(
|
||||
collectionId, "Updated Name", "Updated Description", null, 5
|
||||
);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
verify(collectionRepository).save(any(Collection.class));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should delete collection")
|
||||
void testDeleteCollection() {
|
||||
// Arrange
|
||||
when(collectionRepository.findById(collectionId))
|
||||
.thenReturn(Optional.of(testCollection));
|
||||
doNothing().when(collectionRepository).delete(any(Collection.class));
|
||||
|
||||
// Act
|
||||
collectionService.deleteCollection(collectionId);
|
||||
|
||||
// Assert
|
||||
verify(collectionRepository).delete(testCollection);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should archive collection")
|
||||
void testArchiveCollection() {
|
||||
// Arrange
|
||||
when(collectionRepository.findById(collectionId))
|
||||
.thenReturn(Optional.of(testCollection));
|
||||
when(collectionRepository.save(any(Collection.class)))
|
||||
.thenReturn(testCollection);
|
||||
|
||||
// Act
|
||||
Collection result = collectionService.archiveCollection(collectionId, true);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
verify(collectionRepository).save(any(Collection.class));
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Story Management Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should add stories to collection")
|
||||
void testAddStoriesToCollection() {
|
||||
// Arrange
|
||||
List<UUID> storyIds = List.of(storyId);
|
||||
when(collectionRepository.findById(collectionId))
|
||||
.thenReturn(Optional.of(testCollection));
|
||||
when(storyRepository.findAllById(storyIds))
|
||||
.thenReturn(List.of(testStory));
|
||||
when(collectionStoryRepository.existsByCollectionIdAndStoryId(collectionId, storyId))
|
||||
.thenReturn(false);
|
||||
when(collectionStoryRepository.getNextPosition(collectionId))
|
||||
.thenReturn(1000);
|
||||
when(collectionStoryRepository.save(any()))
|
||||
.thenReturn(new CollectionStory());
|
||||
when(collectionStoryRepository.countByCollectionId(collectionId))
|
||||
.thenReturn(1L);
|
||||
|
||||
// Act
|
||||
Map<String, Object> result = collectionService.addStoriesToCollection(collectionId, storyIds, null);
|
||||
|
||||
// Assert
|
||||
assertEquals(1, result.get("added"));
|
||||
assertEquals(0, result.get("skipped"));
|
||||
assertEquals(1L, result.get("totalStories"));
|
||||
verify(collectionStoryRepository).save(any(CollectionStory.class));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should skip duplicate stories when adding")
|
||||
void testAddDuplicateStories() {
|
||||
// Arrange
|
||||
List<UUID> storyIds = List.of(storyId);
|
||||
when(collectionRepository.findById(collectionId))
|
||||
.thenReturn(Optional.of(testCollection));
|
||||
when(storyRepository.findAllById(storyIds))
|
||||
.thenReturn(List.of(testStory));
|
||||
when(collectionStoryRepository.existsByCollectionIdAndStoryId(collectionId, storyId))
|
||||
.thenReturn(true);
|
||||
when(collectionStoryRepository.countByCollectionId(collectionId))
|
||||
.thenReturn(1L);
|
||||
|
||||
// Act
|
||||
Map<String, Object> result = collectionService.addStoriesToCollection(collectionId, storyIds, null);
|
||||
|
||||
// Assert
|
||||
assertEquals(0, result.get("added"));
|
||||
assertEquals(1, result.get("skipped"));
|
||||
verify(collectionStoryRepository, never()).save(any());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should throw exception when adding non-existent stories")
|
||||
void testAddNonExistentStories() {
|
||||
// Arrange
|
||||
List<UUID> storyIds = List.of(storyId, UUID.randomUUID());
|
||||
when(collectionRepository.findById(collectionId))
|
||||
.thenReturn(Optional.of(testCollection));
|
||||
when(storyRepository.findAllById(storyIds))
|
||||
.thenReturn(List.of(testStory)); // Only one story found
|
||||
|
||||
// Act & Assert
|
||||
assertThrows(ResourceNotFoundException.class, () -> {
|
||||
collectionService.addStoriesToCollection(collectionId, storyIds, null);
|
||||
});
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should remove story from collection")
|
||||
void testRemoveStoryFromCollection() {
|
||||
// Arrange
|
||||
CollectionStory collectionStory = new CollectionStory();
|
||||
when(collectionStoryRepository.existsByCollectionIdAndStoryId(collectionId, storyId))
|
||||
.thenReturn(true);
|
||||
when(collectionStoryRepository.findByCollectionIdAndStoryId(collectionId, storyId))
|
||||
.thenReturn(collectionStory);
|
||||
doNothing().when(collectionStoryRepository).delete(any());
|
||||
|
||||
// Act
|
||||
collectionService.removeStoryFromCollection(collectionId, storyId);
|
||||
|
||||
// Assert
|
||||
verify(collectionStoryRepository).delete(collectionStory);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should throw exception when removing non-existent story")
|
||||
void testRemoveNonExistentStory() {
|
||||
// Arrange
|
||||
when(collectionStoryRepository.existsByCollectionIdAndStoryId(any(), any()))
|
||||
.thenReturn(false);
|
||||
|
||||
// Act & Assert
|
||||
assertThrows(ResourceNotFoundException.class, () -> {
|
||||
collectionService.removeStoryFromCollection(collectionId, storyId);
|
||||
});
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should reorder stories in collection")
|
||||
void testReorderStories() {
|
||||
// Arrange
|
||||
List<Map<String, Object>> storyOrders = List.of(
|
||||
Map.of("storyId", storyId.toString(), "position", 1)
|
||||
);
|
||||
when(collectionRepository.findById(collectionId))
|
||||
.thenReturn(Optional.of(testCollection));
|
||||
doNothing().when(collectionStoryRepository).updatePosition(any(), any(), anyInt());
|
||||
|
||||
// Act
|
||||
collectionService.reorderStories(collectionId, storyOrders);
|
||||
|
||||
// Assert
|
||||
verify(collectionStoryRepository, times(2)).updatePosition(any(), any(), anyInt());
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Statistics Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should get collection statistics")
|
||||
void testGetCollectionStatistics() {
|
||||
// Arrange
|
||||
testStory.setWordCount(1000);
|
||||
testStory.setRating(5);
|
||||
|
||||
CollectionStory cs = new CollectionStory();
|
||||
cs.setStory(testStory);
|
||||
testCollection.setCollectionStories(List.of(cs));
|
||||
|
||||
when(collectionRepository.findByIdWithStoriesAndTags(collectionId))
|
||||
.thenReturn(Optional.of(testCollection));
|
||||
when(readingTimeService.calculateReadingTime(1000))
|
||||
.thenReturn(5);
|
||||
|
||||
// Act
|
||||
Map<String, Object> stats = collectionService.getCollectionStatistics(collectionId);
|
||||
|
||||
// Assert
|
||||
assertNotNull(stats);
|
||||
assertEquals(1, stats.get("totalStories"));
|
||||
assertEquals(1000, stats.get("totalWordCount"));
|
||||
assertEquals(5, stats.get("estimatedReadingTime"));
|
||||
assertTrue(stats.containsKey("averageStoryRating"));
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Helper Method Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should find all collections with tags for indexing")
|
||||
void testFindAllWithTags() {
|
||||
// Arrange
|
||||
when(collectionRepository.findAllWithTags())
|
||||
.thenReturn(List.of(testCollection));
|
||||
|
||||
// Act
|
||||
List<Collection> result = collectionService.findAllWithTags();
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertEquals(1, result.size());
|
||||
verify(collectionRepository).findAllWithTags();
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should get collections for a specific story")
|
||||
void testGetCollectionsForStory() {
|
||||
// Arrange
|
||||
CollectionStory cs = new CollectionStory();
|
||||
cs.setCollection(testCollection);
|
||||
when(collectionStoryRepository.findByStoryId(storyId))
|
||||
.thenReturn(List.of(cs));
|
||||
|
||||
// Act
|
||||
List<Collection> result = collectionService.getCollectionsForStory(storyId);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertEquals(1, result.size());
|
||||
assertEquals(collectionId, result.get(0).getId());
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,721 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.dto.EPUBExportRequest;
|
||||
import com.storycove.entity.Author;
|
||||
import com.storycove.entity.Collection;
|
||||
import com.storycove.entity.CollectionStory;
|
||||
import com.storycove.entity.ReadingPosition;
|
||||
import com.storycove.entity.Series;
|
||||
import com.storycove.entity.Story;
|
||||
import com.storycove.entity.Tag;
|
||||
import com.storycove.repository.ReadingPositionRepository;
|
||||
import com.storycove.service.exception.ResourceNotFoundException;
|
||||
import org.junit.jupiter.api.BeforeEach;
|
||||
import org.junit.jupiter.api.Test;
|
||||
import org.junit.jupiter.api.DisplayName;
|
||||
import org.junit.jupiter.api.extension.ExtendWith;
|
||||
import org.mockito.InjectMocks;
|
||||
import org.mockito.Mock;
|
||||
import org.mockito.junit.jupiter.MockitoExtension;
|
||||
import org.springframework.core.io.Resource;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.Collections;
|
||||
import java.util.HashSet;
|
||||
import java.util.List;
|
||||
import java.util.Optional;
|
||||
import java.util.UUID;
|
||||
|
||||
import static org.junit.jupiter.api.Assertions.*;
|
||||
import static org.mockito.ArgumentMatchers.*;
|
||||
import static org.mockito.Mockito.*;
|
||||
|
||||
/**
|
||||
* Tests for EPUBExportService.
|
||||
* Note: These tests focus on service logic. Full EPUB validation would be done in integration tests.
|
||||
*/
|
||||
@ExtendWith(MockitoExtension.class)
|
||||
class EPUBExportServiceTest {
|
||||
|
||||
@Mock
|
||||
private StoryService storyService;
|
||||
|
||||
@Mock
|
||||
private ReadingPositionRepository readingPositionRepository;
|
||||
|
||||
@Mock
|
||||
private CollectionService collectionService;
|
||||
|
||||
@InjectMocks
|
||||
private EPUBExportService epubExportService;
|
||||
|
||||
private Story testStory;
|
||||
private Author testAuthor;
|
||||
private Series testSeries;
|
||||
private Collection testCollection;
|
||||
private EPUBExportRequest testRequest;
|
||||
private UUID storyId;
|
||||
private UUID collectionId;
|
||||
|
||||
@BeforeEach
|
||||
void setUp() {
|
||||
storyId = UUID.randomUUID();
|
||||
collectionId = UUID.randomUUID();
|
||||
|
||||
testAuthor = new Author();
|
||||
testAuthor.setId(UUID.randomUUID());
|
||||
testAuthor.setName("Test Author");
|
||||
|
||||
testSeries = new Series();
|
||||
testSeries.setId(UUID.randomUUID());
|
||||
testSeries.setName("Test Series");
|
||||
|
||||
testStory = new Story();
|
||||
testStory.setId(storyId);
|
||||
testStory.setTitle("Test Story");
|
||||
testStory.setDescription("Test Description");
|
||||
testStory.setContentHtml("<p>Test content here</p>");
|
||||
testStory.setWordCount(1000);
|
||||
testStory.setRating(5);
|
||||
testStory.setAuthor(testAuthor);
|
||||
testStory.setCreatedAt(LocalDateTime.now());
|
||||
testStory.setTags(new HashSet<>());
|
||||
|
||||
testCollection = new Collection();
|
||||
testCollection.setId(collectionId);
|
||||
testCollection.setName("Test Collection");
|
||||
testCollection.setDescription("Test Collection Description");
|
||||
testCollection.setCreatedAt(LocalDateTime.now());
|
||||
testCollection.setCollectionStories(new ArrayList<>());
|
||||
|
||||
testRequest = new EPUBExportRequest();
|
||||
testRequest.setStoryId(storyId);
|
||||
testRequest.setIncludeCoverImage(false);
|
||||
testRequest.setIncludeMetadata(false);
|
||||
testRequest.setIncludeReadingPosition(false);
|
||||
testRequest.setSplitByChapters(false);
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Basic Export Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should export story as EPUB successfully")
|
||||
void testExportStoryAsEPUB() throws IOException {
|
||||
// Arrange
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(result.contentLength() > 0);
|
||||
verify(storyService).findById(storyId);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should throw exception when story not found")
|
||||
void testExportNonExistentStory() {
|
||||
// Arrange
|
||||
when(storyService.findById(any())).thenThrow(new ResourceNotFoundException("Story not found"));
|
||||
|
||||
// Act & Assert
|
||||
assertThrows(ResourceNotFoundException.class, () -> {
|
||||
epubExportService.exportStoryAsEPUB(testRequest);
|
||||
});
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should export story with HTML content")
|
||||
void testExportStoryWithHtmlContent() throws IOException {
|
||||
// Arrange
|
||||
testStory.setContentHtml("<p>HTML content</p>");
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(result.contentLength() > 0);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should export story with plain text content when HTML is null")
|
||||
void testExportStoryWithPlainContent() throws IOException {
|
||||
// Arrange
|
||||
// Note: contentPlain is set automatically when contentHtml is set
|
||||
// We test with HTML then clear it to simulate plain text content
|
||||
testStory.setContentHtml("<p>Plain text content here</p>");
|
||||
// contentPlain will be auto-populated, then we clear HTML
|
||||
testStory.setContentHtml(null);
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(result.contentLength() > 0);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle story with no content")
|
||||
void testExportStoryWithNoContent() throws IOException {
|
||||
// Arrange
|
||||
// Create a fresh story with no content (don't set contentHtml at all)
|
||||
Story emptyContentStory = new Story();
|
||||
emptyContentStory.setId(storyId);
|
||||
emptyContentStory.setTitle("Story With No Content");
|
||||
emptyContentStory.setAuthor(testAuthor);
|
||||
emptyContentStory.setCreatedAt(LocalDateTime.now());
|
||||
emptyContentStory.setTags(new HashSet<>());
|
||||
// Don't set contentHtml - it will be null by default
|
||||
|
||||
when(storyService.findById(storyId)).thenReturn(emptyContentStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(result.contentLength() > 0);
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Metadata Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should use custom title when provided")
|
||||
void testCustomTitle() throws IOException {
|
||||
// Arrange
|
||||
testRequest.setCustomTitle("Custom Title");
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertEquals("Custom Title", testRequest.getCustomTitle());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should use custom author when provided")
|
||||
void testCustomAuthor() throws IOException {
|
||||
// Arrange
|
||||
testRequest.setCustomAuthor("Custom Author");
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertEquals("Custom Author", testRequest.getCustomAuthor());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should use story author when custom author not provided")
|
||||
void testDefaultAuthor() throws IOException {
|
||||
// Arrange
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertEquals("Test Author", testStory.getAuthor().getName());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle story with no author")
|
||||
void testStoryWithNoAuthor() throws IOException {
|
||||
// Arrange
|
||||
testStory.setAuthor(null);
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertNull(testStory.getAuthor());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should include metadata when requested")
|
||||
void testIncludeMetadata() throws IOException {
|
||||
// Arrange
|
||||
testRequest.setIncludeMetadata(true);
|
||||
testStory.setSeries(testSeries);
|
||||
testStory.setVolume(1);
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(testRequest.getIncludeMetadata());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should set custom language")
|
||||
void testCustomLanguage() throws IOException {
|
||||
// Arrange
|
||||
testRequest.setLanguage("de");
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertEquals("de", testRequest.getLanguage());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should use default language when not specified")
|
||||
void testDefaultLanguage() throws IOException {
|
||||
// Arrange
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertNull(testRequest.getLanguage());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle custom metadata")
|
||||
void testCustomMetadata() throws IOException {
|
||||
// Arrange
|
||||
List<String> customMetadata = Arrays.asList(
|
||||
"publisher: Test Publisher",
|
||||
"isbn: 123-456-789"
|
||||
);
|
||||
testRequest.setCustomMetadata(customMetadata);
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertEquals(2, testRequest.getCustomMetadata().size());
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Chapter Splitting Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should export as single chapter when splitByChapters is false")
|
||||
void testSingleChapter() throws IOException {
|
||||
// Arrange
|
||||
testRequest.setSplitByChapters(false);
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertFalse(testRequest.getSplitByChapters());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should split into chapters when requested")
|
||||
void testSplitByChapters() throws IOException {
|
||||
// Arrange
|
||||
testRequest.setSplitByChapters(true);
|
||||
testStory.setContentHtml("<h1>Chapter 1</h1><p>Content 1</p><h1>Chapter 2</h1><p>Content 2</p>");
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(testRequest.getSplitByChapters());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should respect max words per chapter setting")
|
||||
void testMaxWordsPerChapter() throws IOException {
|
||||
// Arrange
|
||||
testRequest.setSplitByChapters(true);
|
||||
testRequest.setMaxWordsPerChapter(500);
|
||||
String longContent = String.join(" ", Collections.nCopies(1000, "word"));
|
||||
testStory.setContentHtml("<p>" + longContent + "</p>");
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertEquals(500, testRequest.getMaxWordsPerChapter());
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Reading Position Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should include reading position when requested")
|
||||
void testIncludeReadingPosition() throws IOException {
|
||||
// Arrange
|
||||
testRequest.setIncludeReadingPosition(true);
|
||||
|
||||
ReadingPosition position = new ReadingPosition(testStory);
|
||||
position.setChapterIndex(5);
|
||||
position.setWordPosition(100);
|
||||
position.setPercentageComplete(50.0);
|
||||
position.setEpubCfi("epubcfi(/6/4[chap01ref]!/4/2/2[page005])");
|
||||
position.setUpdatedAt(LocalDateTime.now());
|
||||
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
when(readingPositionRepository.findByStoryId(storyId)).thenReturn(Optional.of(position));
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(testRequest.getIncludeReadingPosition());
|
||||
verify(readingPositionRepository).findByStoryId(storyId);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle missing reading position gracefully")
|
||||
void testMissingReadingPosition() throws IOException {
|
||||
// Arrange
|
||||
testRequest.setIncludeReadingPosition(true);
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
when(readingPositionRepository.findByStoryId(storyId)).thenReturn(Optional.empty());
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
verify(readingPositionRepository).findByStoryId(storyId);
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Filename Generation Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should generate filename with author and title")
|
||||
void testGenerateFilenameWithAuthor() {
|
||||
// Act
|
||||
String filename = epubExportService.getEPUBFilename(testStory);
|
||||
|
||||
// Assert
|
||||
assertNotNull(filename);
|
||||
assertTrue(filename.contains("Test_Author"));
|
||||
assertTrue(filename.contains("Test_Story"));
|
||||
assertTrue(filename.endsWith(".epub"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should generate filename without author")
|
||||
void testGenerateFilenameWithoutAuthor() {
|
||||
// Arrange
|
||||
testStory.setAuthor(null);
|
||||
|
||||
// Act
|
||||
String filename = epubExportService.getEPUBFilename(testStory);
|
||||
|
||||
// Assert
|
||||
assertNotNull(filename);
|
||||
assertTrue(filename.contains("Test_Story"));
|
||||
assertTrue(filename.endsWith(".epub"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should include series info in filename")
|
||||
void testGenerateFilenameWithSeries() {
|
||||
// Arrange
|
||||
testStory.setSeries(testSeries);
|
||||
testStory.setVolume(3);
|
||||
|
||||
// Act
|
||||
String filename = epubExportService.getEPUBFilename(testStory);
|
||||
|
||||
// Assert
|
||||
assertNotNull(filename);
|
||||
assertTrue(filename.contains("Test_Series"));
|
||||
assertTrue(filename.contains("3"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should sanitize special characters in filename")
|
||||
void testSanitizeFilename() {
|
||||
// Arrange
|
||||
testStory.setTitle("Test: Story? With/Special\\Characters!");
|
||||
|
||||
// Act
|
||||
String filename = epubExportService.getEPUBFilename(testStory);
|
||||
|
||||
// Assert
|
||||
assertNotNull(filename);
|
||||
assertFalse(filename.contains(":"));
|
||||
assertFalse(filename.contains("?"));
|
||||
assertFalse(filename.contains("/"));
|
||||
assertFalse(filename.contains("\\"));
|
||||
assertTrue(filename.endsWith(".epub"));
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Collection Export Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should export collection as EPUB")
|
||||
void testExportCollectionAsEPUB() throws IOException {
|
||||
// Arrange
|
||||
CollectionStory cs = new CollectionStory();
|
||||
cs.setStory(testStory);
|
||||
cs.setPosition(1000);
|
||||
testCollection.setCollectionStories(Arrays.asList(cs));
|
||||
|
||||
when(collectionService.findById(collectionId)).thenReturn(testCollection);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportCollectionAsEPUB(collectionId, testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(result.contentLength() > 0);
|
||||
verify(collectionService).findById(collectionId);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should throw exception when exporting empty collection")
|
||||
void testExportEmptyCollection() {
|
||||
// Arrange
|
||||
testCollection.setCollectionStories(new ArrayList<>());
|
||||
when(collectionService.findById(collectionId)).thenReturn(testCollection);
|
||||
|
||||
// Act & Assert
|
||||
assertThrows(ResourceNotFoundException.class, () -> {
|
||||
epubExportService.exportCollectionAsEPUB(collectionId, testRequest);
|
||||
});
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should export collection with multiple stories in order")
|
||||
void testExportCollectionWithMultipleStories() throws IOException {
|
||||
// Arrange
|
||||
Story story2 = new Story();
|
||||
story2.setId(UUID.randomUUID());
|
||||
story2.setTitle("Second Story");
|
||||
story2.setContentHtml("<p>Second content</p>");
|
||||
story2.setAuthor(testAuthor);
|
||||
story2.setCreatedAt(LocalDateTime.now());
|
||||
story2.setTags(new HashSet<>());
|
||||
|
||||
CollectionStory cs1 = new CollectionStory();
|
||||
cs1.setStory(testStory);
|
||||
cs1.setPosition(1000);
|
||||
|
||||
CollectionStory cs2 = new CollectionStory();
|
||||
cs2.setStory(story2);
|
||||
cs2.setPosition(2000);
|
||||
|
||||
testCollection.setCollectionStories(Arrays.asList(cs1, cs2));
|
||||
when(collectionService.findById(collectionId)).thenReturn(testCollection);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportCollectionAsEPUB(collectionId, testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(result.contentLength() > 0);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should generate collection EPUB filename")
|
||||
void testGenerateCollectionFilename() {
|
||||
// Act
|
||||
String filename = epubExportService.getCollectionEPUBFilename(testCollection);
|
||||
|
||||
// Assert
|
||||
assertNotNull(filename);
|
||||
assertTrue(filename.contains("Test_Collection"));
|
||||
assertTrue(filename.contains("collection"));
|
||||
assertTrue(filename.endsWith(".epub"));
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Utility Method Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should check if story can be exported")
|
||||
void testCanExportStory() {
|
||||
// Arrange
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
boolean canExport = epubExportService.canExportStory(storyId);
|
||||
|
||||
// Assert
|
||||
assertTrue(canExport);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should return false for story with no content")
|
||||
void testCannotExportStoryWithNoContent() {
|
||||
// Arrange
|
||||
// Create a story with no content set at all
|
||||
Story emptyStory = new Story();
|
||||
emptyStory.setId(storyId);
|
||||
emptyStory.setTitle("Empty Story");
|
||||
when(storyService.findById(storyId)).thenReturn(emptyStory);
|
||||
|
||||
// Act
|
||||
boolean canExport = epubExportService.canExportStory(storyId);
|
||||
|
||||
// Assert
|
||||
assertFalse(canExport);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should return false for non-existent story")
|
||||
void testCannotExportNonExistentStory() {
|
||||
// Arrange
|
||||
when(storyService.findById(any())).thenThrow(new ResourceNotFoundException("Story not found"));
|
||||
|
||||
// Act
|
||||
boolean canExport = epubExportService.canExportStory(UUID.randomUUID());
|
||||
|
||||
// Assert
|
||||
assertFalse(canExport);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should return true for story with plain text content only")
|
||||
void testCanExportStoryWithPlainContent() {
|
||||
// Arrange
|
||||
// Set HTML first which will populate contentPlain, then clear HTML
|
||||
testStory.setContentHtml("<p>Plain text content</p>");
|
||||
testStory.setContentHtml(null);
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
boolean canExport = epubExportService.canExportStory(storyId);
|
||||
|
||||
// Assert
|
||||
// Note: This might return false because contentPlain is protected and we can't verify it
|
||||
// The service checks both contentHtml and contentPlain, but since we can't set contentPlain directly
|
||||
// in tests, this test documents the limitation
|
||||
assertFalse(canExport);
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Edge Cases
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle story with tags")
|
||||
void testStoryWithTags() throws IOException {
|
||||
// Arrange
|
||||
Tag tag1 = new Tag();
|
||||
tag1.setName("fantasy");
|
||||
Tag tag2 = new Tag();
|
||||
tag2.setName("adventure");
|
||||
|
||||
testStory.getTags().add(tag1);
|
||||
testStory.getTags().add(tag2);
|
||||
testRequest.setIncludeMetadata(true);
|
||||
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertEquals(2, testStory.getTags().size());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle long story title")
|
||||
void testLongTitle() throws IOException {
|
||||
// Arrange
|
||||
testStory.setTitle("A".repeat(200));
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(result.contentLength() > 0);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle HTML with special characters")
|
||||
void testHtmlWithSpecialCharacters() throws IOException {
|
||||
// Arrange
|
||||
testStory.setContentHtml("<p>Content with < > & special chars</p>");
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(result.contentLength() > 0);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle story with null description")
|
||||
void testNullDescription() throws IOException {
|
||||
// Arrange
|
||||
testStory.setDescription(null);
|
||||
when(storyService.findById(storyId)).thenReturn(testStory);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportStoryAsEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(result.contentLength() > 0);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle collection with null description")
|
||||
void testCollectionWithNullDescription() throws IOException {
|
||||
// Arrange
|
||||
testCollection.setDescription(null);
|
||||
|
||||
CollectionStory cs = new CollectionStory();
|
||||
cs.setStory(testStory);
|
||||
cs.setPosition(1000);
|
||||
testCollection.setCollectionStories(Arrays.asList(cs));
|
||||
|
||||
when(collectionService.findById(collectionId)).thenReturn(testCollection);
|
||||
|
||||
// Act
|
||||
Resource result = epubExportService.exportCollectionAsEPUB(collectionId, testRequest);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(result.contentLength() > 0);
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,490 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.dto.EPUBImportRequest;
|
||||
import com.storycove.dto.EPUBImportResponse;
|
||||
import com.storycove.entity.*;
|
||||
import com.storycove.repository.ReadingPositionRepository;
|
||||
import com.storycove.service.exception.InvalidFileException;
|
||||
import com.storycove.service.exception.ResourceNotFoundException;
|
||||
import org.junit.jupiter.api.BeforeEach;
|
||||
import org.junit.jupiter.api.Test;
|
||||
import org.junit.jupiter.api.DisplayName;
|
||||
import org.junit.jupiter.api.extension.ExtendWith;
|
||||
import org.mockito.InjectMocks;
|
||||
import org.mockito.Mock;
|
||||
import org.mockito.junit.jupiter.MockitoExtension;
|
||||
import org.springframework.mock.web.MockMultipartFile;
|
||||
import org.springframework.web.multipart.MultipartFile;
|
||||
|
||||
import java.io.ByteArrayInputStream;
|
||||
import java.io.InputStream;
|
||||
import java.util.*;
|
||||
|
||||
import static org.junit.jupiter.api.Assertions.*;
|
||||
import static org.mockito.ArgumentMatchers.*;
|
||||
import static org.mockito.Mockito.*;
|
||||
|
||||
/**
|
||||
* Tests for EPUBImportService.
|
||||
* Note: These tests mock the EPUB parsing since nl.siegmann.epublib is complex to test.
|
||||
* Integration tests should be added separately to test actual EPUB file parsing.
|
||||
*/
|
||||
@ExtendWith(MockitoExtension.class)
|
||||
class EPUBImportServiceTest {
|
||||
|
||||
@Mock
|
||||
private StoryService storyService;
|
||||
|
||||
@Mock
|
||||
private AuthorService authorService;
|
||||
|
||||
@Mock
|
||||
private SeriesService seriesService;
|
||||
|
||||
@Mock
|
||||
private TagService tagService;
|
||||
|
||||
@Mock
|
||||
private ReadingPositionRepository readingPositionRepository;
|
||||
|
||||
@Mock
|
||||
private HtmlSanitizationService sanitizationService;
|
||||
|
||||
@Mock
|
||||
private ImageService imageService;
|
||||
|
||||
@InjectMocks
|
||||
private EPUBImportService epubImportService;
|
||||
|
||||
private EPUBImportRequest testRequest;
|
||||
private Story testStory;
|
||||
private Author testAuthor;
|
||||
private Series testSeries;
|
||||
private UUID storyId;
|
||||
|
||||
@BeforeEach
|
||||
void setUp() {
|
||||
storyId = UUID.randomUUID();
|
||||
|
||||
testStory = new Story();
|
||||
testStory.setId(storyId);
|
||||
testStory.setTitle("Test Story");
|
||||
testStory.setWordCount(1000);
|
||||
|
||||
testAuthor = new Author();
|
||||
testAuthor.setId(UUID.randomUUID());
|
||||
testAuthor.setName("Test Author");
|
||||
|
||||
testSeries = new Series();
|
||||
testSeries.setId(UUID.randomUUID());
|
||||
testSeries.setName("Test Series");
|
||||
|
||||
testRequest = new EPUBImportRequest();
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// File Validation Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should reject null EPUB file")
|
||||
void testNullEPUBFile() {
|
||||
// Arrange
|
||||
testRequest.setEpubFile(null);
|
||||
|
||||
// Act
|
||||
EPUBImportResponse response = epubImportService.importEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertFalse(response.isSuccess());
|
||||
assertEquals("EPUB file is required", response.getMessage());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should reject empty EPUB file")
|
||||
void testEmptyEPUBFile() {
|
||||
// Arrange
|
||||
MockMultipartFile emptyFile = new MockMultipartFile(
|
||||
"file", "test.epub", "application/epub+zip", new byte[0]
|
||||
);
|
||||
testRequest.setEpubFile(emptyFile);
|
||||
|
||||
// Act
|
||||
EPUBImportResponse response = epubImportService.importEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertFalse(response.isSuccess());
|
||||
assertEquals("EPUB file is required", response.getMessage());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should reject non-EPUB file by extension")
|
||||
void testInvalidFileExtension() {
|
||||
// Arrange
|
||||
MockMultipartFile pdfFile = new MockMultipartFile(
|
||||
"file", "test.pdf", "application/pdf", "fake content".getBytes()
|
||||
);
|
||||
testRequest.setEpubFile(pdfFile);
|
||||
|
||||
// Act
|
||||
EPUBImportResponse response = epubImportService.importEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertFalse(response.isSuccess());
|
||||
assertEquals("Invalid EPUB file format", response.getMessage());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should validate EPUB file and return errors")
|
||||
void testValidateEPUBFile() {
|
||||
// Arrange
|
||||
MockMultipartFile invalidFile = new MockMultipartFile(
|
||||
"file", "test.pdf", "application/pdf", "fake content".getBytes()
|
||||
);
|
||||
|
||||
// Act
|
||||
List<String> errors = epubImportService.validateEPUBFile(invalidFile);
|
||||
|
||||
// Assert
|
||||
assertNotNull(errors);
|
||||
assertFalse(errors.isEmpty());
|
||||
assertTrue(errors.stream().anyMatch(e -> e.contains("Invalid EPUB file format")));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should validate file size limit")
|
||||
void testFileSizeLimit() {
|
||||
// Arrange
|
||||
byte[] largeData = new byte[101 * 1024 * 1024]; // 101MB
|
||||
MockMultipartFile largeFile = new MockMultipartFile(
|
||||
"file", "large.epub", "application/epub+zip", largeData
|
||||
);
|
||||
|
||||
// Act
|
||||
List<String> errors = epubImportService.validateEPUBFile(largeFile);
|
||||
|
||||
// Assert
|
||||
assertTrue(errors.stream().anyMatch(e -> e.contains("100MB limit")));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should accept valid EPUB with correct extension")
|
||||
void testAcceptValidEPUBExtension() {
|
||||
// Arrange
|
||||
MockMultipartFile validFile = new MockMultipartFile(
|
||||
"file", "test.epub", "application/epub+zip", createMinimalEPUB()
|
||||
);
|
||||
testRequest.setEpubFile(validFile);
|
||||
|
||||
// Note: This will fail at parsing since we don't have a real EPUB
|
||||
// But it should pass the extension validation
|
||||
EPUBImportResponse response = epubImportService.importEPUB(testRequest);
|
||||
|
||||
// Assert - should fail at parsing, not at validation
|
||||
assertFalse(response.isSuccess());
|
||||
assertNotEquals("Invalid EPUB file format", response.getMessage());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should accept EPUB with application/zip content type")
|
||||
void testAcceptZipContentType() {
|
||||
// Arrange
|
||||
MockMultipartFile zipFile = new MockMultipartFile(
|
||||
"file", "test.epub", "application/zip", createMinimalEPUB()
|
||||
);
|
||||
testRequest.setEpubFile(zipFile);
|
||||
|
||||
// Act
|
||||
EPUBImportResponse response = epubImportService.importEPUB(testRequest);
|
||||
|
||||
// Assert - should not fail at content type validation
|
||||
assertFalse(response.isSuccess());
|
||||
assertNotEquals("Invalid EPUB file format", response.getMessage());
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Request Parameter Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle createMissingAuthor flag")
|
||||
void testCreateMissingAuthor() {
|
||||
// This is an integration-level test and would require actual EPUB parsing
|
||||
// We verify the flag is present in the request object
|
||||
testRequest.setCreateMissingAuthor(true);
|
||||
assertTrue(testRequest.getCreateMissingAuthor());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle createMissingSeries flag")
|
||||
void testCreateMissingSeries() {
|
||||
testRequest.setCreateMissingSeries(true);
|
||||
testRequest.setSeriesName("New Series");
|
||||
testRequest.setSeriesVolume(1);
|
||||
|
||||
assertTrue(testRequest.getCreateMissingSeries());
|
||||
assertEquals("New Series", testRequest.getSeriesName());
|
||||
assertEquals(1, testRequest.getSeriesVolume());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle extractCover flag")
|
||||
void testExtractCoverFlag() {
|
||||
testRequest.setExtractCover(true);
|
||||
assertTrue(testRequest.getExtractCover());
|
||||
|
||||
testRequest.setExtractCover(false);
|
||||
assertFalse(testRequest.getExtractCover());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle preserveReadingPosition flag")
|
||||
void testPreserveReadingPositionFlag() {
|
||||
testRequest.setPreserveReadingPosition(true);
|
||||
assertTrue(testRequest.getPreserveReadingPosition());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle custom tags")
|
||||
void testCustomTags() {
|
||||
List<String> tags = Arrays.asList("fantasy", "adventure", "magic");
|
||||
testRequest.setTags(tags);
|
||||
|
||||
assertEquals(3, testRequest.getTags().size());
|
||||
assertTrue(testRequest.getTags().contains("fantasy"));
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Author Handling Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should use provided authorId when available")
|
||||
void testUseProvidedAuthorId() {
|
||||
// This would require mocking the EPUB parsing
|
||||
// We verify the request accepts authorId
|
||||
UUID authorId = UUID.randomUUID();
|
||||
testRequest.setAuthorId(authorId);
|
||||
assertEquals(authorId, testRequest.getAuthorId());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should use provided authorName")
|
||||
void testUseProvidedAuthorName() {
|
||||
testRequest.setAuthorName("Custom Author Name");
|
||||
assertEquals("Custom Author Name", testRequest.getAuthorName());
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Series Handling Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should use provided seriesId and volume")
|
||||
void testUseProvidedSeriesId() {
|
||||
UUID seriesId = UUID.randomUUID();
|
||||
testRequest.setSeriesId(seriesId);
|
||||
testRequest.setSeriesVolume(5);
|
||||
|
||||
assertEquals(seriesId, testRequest.getSeriesId());
|
||||
assertEquals(5, testRequest.getSeriesVolume());
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Error Handling Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle corrupt EPUB file gracefully")
|
||||
void testCorruptEPUBFile() {
|
||||
// Arrange
|
||||
MockMultipartFile corruptFile = new MockMultipartFile(
|
||||
"file", "corrupt.epub", "application/epub+zip", "not a real epub".getBytes()
|
||||
);
|
||||
testRequest.setEpubFile(corruptFile);
|
||||
|
||||
// Act
|
||||
EPUBImportResponse response = epubImportService.importEPUB(testRequest);
|
||||
|
||||
// Assert
|
||||
assertFalse(response.isSuccess());
|
||||
assertNotNull(response.getMessage());
|
||||
assertTrue(response.getMessage().contains("Failed to import EPUB"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle missing metadata gracefully")
|
||||
void testMissingMetadata() {
|
||||
// Arrange
|
||||
MockMultipartFile epubFile = new MockMultipartFile(
|
||||
"file", "test.epub", "application/epub+zip", createMinimalEPUB()
|
||||
);
|
||||
|
||||
// Act
|
||||
List<String> errors = epubImportService.validateEPUBFile(epubFile);
|
||||
|
||||
// Assert - validation should catch missing metadata
|
||||
assertNotNull(errors);
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Response Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should create success response with correct fields")
|
||||
void testSuccessResponse() {
|
||||
// Arrange
|
||||
EPUBImportResponse response = EPUBImportResponse.success(storyId, "Test Story");
|
||||
response.setWordCount(1500);
|
||||
response.setTotalChapters(10);
|
||||
|
||||
// Assert
|
||||
assertTrue(response.isSuccess());
|
||||
assertEquals(storyId, response.getStoryId());
|
||||
assertEquals("Test Story", response.getStoryTitle());
|
||||
assertEquals(1500, response.getWordCount());
|
||||
assertEquals(10, response.getTotalChapters());
|
||||
assertNull(response.getMessage());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should create error response with message")
|
||||
void testErrorResponse() {
|
||||
// Arrange
|
||||
EPUBImportResponse response = EPUBImportResponse.error("Test error message");
|
||||
|
||||
// Assert
|
||||
assertFalse(response.isSuccess());
|
||||
assertEquals("Test error message", response.getMessage());
|
||||
assertNull(response.getStoryId());
|
||||
assertNull(response.getStoryTitle());
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Integration Scenario Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle complete import workflow (mock)")
|
||||
void testCompleteImportWorkflow() {
|
||||
// This test verifies that all the request parameters are properly structured
|
||||
// Actual EPUB parsing would be tested in integration tests
|
||||
|
||||
// Arrange - Create a complete request
|
||||
testRequest.setEpubFile(new MockMultipartFile(
|
||||
"file", "story.epub", "application/epub+zip", createMinimalEPUB()
|
||||
));
|
||||
testRequest.setAuthorName("Jane Doe");
|
||||
testRequest.setCreateMissingAuthor(true);
|
||||
testRequest.setSeriesName("Epic Series");
|
||||
testRequest.setSeriesVolume(3);
|
||||
testRequest.setCreateMissingSeries(true);
|
||||
testRequest.setTags(Arrays.asList("fantasy", "adventure"));
|
||||
testRequest.setExtractCover(true);
|
||||
testRequest.setPreserveReadingPosition(true);
|
||||
|
||||
// Assert - All parameters set correctly
|
||||
assertNotNull(testRequest.getEpubFile());
|
||||
assertEquals("Jane Doe", testRequest.getAuthorName());
|
||||
assertTrue(testRequest.getCreateMissingAuthor());
|
||||
assertEquals("Epic Series", testRequest.getSeriesName());
|
||||
assertEquals(3, testRequest.getSeriesVolume());
|
||||
assertTrue(testRequest.getCreateMissingSeries());
|
||||
assertEquals(2, testRequest.getTags().size());
|
||||
assertTrue(testRequest.getExtractCover());
|
||||
assertTrue(testRequest.getPreserveReadingPosition());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle minimal import request")
|
||||
void testMinimalImportRequest() {
|
||||
// Arrange - Only required field
|
||||
testRequest.setEpubFile(new MockMultipartFile(
|
||||
"file", "simple.epub", "application/epub+zip", createMinimalEPUB()
|
||||
));
|
||||
|
||||
// Assert - Optional fields are null/false
|
||||
assertNotNull(testRequest.getEpubFile());
|
||||
assertNull(testRequest.getAuthorId());
|
||||
assertNull(testRequest.getAuthorName());
|
||||
assertNull(testRequest.getSeriesId());
|
||||
assertNull(testRequest.getTags());
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Edge Cases
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle EPUB with special characters in filename")
|
||||
void testSpecialCharactersInFilename() {
|
||||
// Arrange
|
||||
MockMultipartFile fileWithSpecialChars = new MockMultipartFile(
|
||||
"file", "test story (2024) #1.epub", "application/epub+zip", createMinimalEPUB()
|
||||
);
|
||||
testRequest.setEpubFile(fileWithSpecialChars);
|
||||
|
||||
// Act
|
||||
EPUBImportResponse response = epubImportService.importEPUB(testRequest);
|
||||
|
||||
// Assert - should not fail due to filename
|
||||
assertNotNull(response);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle EPUB with null content type")
|
||||
void testNullContentType() {
|
||||
// Arrange
|
||||
MockMultipartFile fileWithNullContentType = new MockMultipartFile(
|
||||
"file", "test.epub", null, createMinimalEPUB()
|
||||
);
|
||||
testRequest.setEpubFile(fileWithNullContentType);
|
||||
|
||||
// Act - Should still validate based on extension
|
||||
EPUBImportResponse response = epubImportService.importEPUB(testRequest);
|
||||
|
||||
// Assert - should not fail at validation, only at parsing
|
||||
assertNotNull(response);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should trim whitespace from author name")
|
||||
void testTrimAuthorName() {
|
||||
testRequest.setAuthorName(" John Doe ");
|
||||
// The service should trim this internally
|
||||
assertEquals(" John Doe ", testRequest.getAuthorName());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle empty tags list")
|
||||
void testEmptyTagsList() {
|
||||
testRequest.setTags(new ArrayList<>());
|
||||
assertNotNull(testRequest.getTags());
|
||||
assertTrue(testRequest.getTags().isEmpty());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle duplicate tags in request")
|
||||
void testDuplicateTags() {
|
||||
List<String> tagsWithDuplicates = Arrays.asList("fantasy", "adventure", "fantasy");
|
||||
testRequest.setTags(tagsWithDuplicates);
|
||||
|
||||
assertEquals(3, testRequest.getTags().size());
|
||||
// The service should handle deduplication internally
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Helper Methods
|
||||
// ========================================
|
||||
|
||||
/**
|
||||
* Creates minimal EPUB-like content for testing.
|
||||
* Note: This is not a real EPUB, just test data.
|
||||
*/
|
||||
private byte[] createMinimalEPUB() {
|
||||
// This creates minimal test data that looks like an EPUB structure
|
||||
// Real EPUB parsing would require a proper EPUB file structure
|
||||
return "PK\u0003\u0004fake epub content".getBytes();
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,335 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.fasterxml.jackson.databind.ObjectMapper;
|
||||
import com.storycove.dto.HtmlSanitizationConfigDto;
|
||||
import org.junit.jupiter.api.BeforeEach;
|
||||
import org.junit.jupiter.api.Test;
|
||||
import org.junit.jupiter.api.DisplayName;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.boot.test.context.SpringBootTest;
|
||||
|
||||
import static org.junit.jupiter.api.Assertions.*;
|
||||
|
||||
/**
|
||||
* Security-critical tests for HtmlSanitizationService.
|
||||
* These tests ensure that malicious HTML is properly sanitized.
|
||||
*/
|
||||
@SpringBootTest
|
||||
class HtmlSanitizationServiceTest {
|
||||
|
||||
@Autowired
|
||||
private HtmlSanitizationService sanitizationService;
|
||||
|
||||
@BeforeEach
|
||||
void setUp() {
|
||||
// Service is initialized via @PostConstruct
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// XSS Attack Prevention Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should remove script tags (XSS prevention)")
|
||||
void testRemoveScriptTags() {
|
||||
String malicious = "<p>Hello</p><script>alert('XSS')</script>";
|
||||
String sanitized = sanitizationService.sanitize(malicious);
|
||||
|
||||
assertFalse(sanitized.contains("<script>"));
|
||||
assertFalse(sanitized.contains("alert"));
|
||||
assertTrue(sanitized.contains("Hello"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should remove inline JavaScript event handlers")
|
||||
void testRemoveEventHandlers() {
|
||||
String malicious = "<p onclick='alert(\"XSS\")'>Click me</p>";
|
||||
String sanitized = sanitizationService.sanitize(malicious);
|
||||
|
||||
assertFalse(sanitized.contains("onclick"));
|
||||
assertFalse(sanitized.contains("alert"));
|
||||
assertTrue(sanitized.contains("Click me"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should remove javascript: URLs")
|
||||
void testRemoveJavaScriptUrls() {
|
||||
String malicious = "<a href='javascript:alert(\"XSS\")'>Click</a>";
|
||||
String sanitized = sanitizationService.sanitize(malicious);
|
||||
|
||||
assertFalse(sanitized.contains("javascript:"));
|
||||
assertFalse(sanitized.contains("alert"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should remove data: URLs with JavaScript")
|
||||
void testRemoveDataUrlsWithJs() {
|
||||
String malicious = "<a href='data:text/html,<script>alert(\"XSS\")</script>'>Click</a>";
|
||||
String sanitized = sanitizationService.sanitize(malicious);
|
||||
|
||||
assertFalse(sanitized.toLowerCase().contains("script"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should remove iframe tags")
|
||||
void testRemoveIframeTags() {
|
||||
String malicious = "<p>Content</p><iframe src='http://evil.com'></iframe>";
|
||||
String sanitized = sanitizationService.sanitize(malicious);
|
||||
|
||||
assertFalse(sanitized.contains("<iframe"));
|
||||
assertTrue(sanitized.contains("Content"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should remove object and embed tags")
|
||||
void testRemoveObjectAndEmbedTags() {
|
||||
String malicious = "<object data='http://evil.com'></object><embed src='http://evil.com'>";
|
||||
String sanitized = sanitizationService.sanitize(malicious);
|
||||
|
||||
assertFalse(sanitized.contains("<object"));
|
||||
assertFalse(sanitized.contains("<embed"));
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Allowed Content Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should preserve safe HTML tags")
|
||||
void testPreserveSafeTags() {
|
||||
String safe = "<p>Paragraph</p><h1>Heading</h1><ul><li>Item</li></ul>";
|
||||
String sanitized = sanitizationService.sanitize(safe);
|
||||
|
||||
assertTrue(sanitized.contains("<p>"));
|
||||
assertTrue(sanitized.contains("<h1>"));
|
||||
assertTrue(sanitized.contains("<ul>"));
|
||||
assertTrue(sanitized.contains("<li>"));
|
||||
assertTrue(sanitized.contains("Paragraph"));
|
||||
assertTrue(sanitized.contains("Heading"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should preserve text formatting tags")
|
||||
void testPreserveFormattingTags() {
|
||||
String formatted = "<p><strong>Bold</strong> <em>Italic</em> <u>Underline</u></p>";
|
||||
String sanitized = sanitizationService.sanitize(formatted);
|
||||
|
||||
assertTrue(sanitized.contains("<strong>"));
|
||||
assertTrue(sanitized.contains("<em>"));
|
||||
assertTrue(sanitized.contains("<u>"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should preserve safe links")
|
||||
void testPreserveSafeLinks() {
|
||||
String link = "<a href='https://example.com'>Link</a>";
|
||||
String sanitized = sanitizationService.sanitize(link);
|
||||
|
||||
assertTrue(sanitized.contains("<a"));
|
||||
assertTrue(sanitized.contains("href"));
|
||||
assertTrue(sanitized.contains("example.com"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should preserve images with safe attributes")
|
||||
void testPreserveSafeImages() {
|
||||
String img = "<img src='https://example.com/image.jpg' alt='Description'>";
|
||||
String sanitized = sanitizationService.sanitize(img);
|
||||
|
||||
assertTrue(sanitized.contains("<img"));
|
||||
assertTrue(sanitized.contains("src"));
|
||||
assertTrue(sanitized.contains("alt"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should preserve relative image URLs")
|
||||
void testPreserveRelativeImageUrls() {
|
||||
String img = "<img src='/images/photo.jpg' alt='Photo'>";
|
||||
String sanitized = sanitizationService.sanitize(img);
|
||||
|
||||
assertTrue(sanitized.contains("<img"));
|
||||
assertTrue(sanitized.contains("/images/photo.jpg"));
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Figure Tag Preprocessing Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should extract image from figure tag")
|
||||
void testExtractImageFromFigure() {
|
||||
String figure = "<figure><img src='/image.jpg' alt='Test'><figcaption>Caption</figcaption></figure>";
|
||||
String sanitized = sanitizationService.sanitize(figure);
|
||||
|
||||
assertFalse(sanitized.contains("<figure"));
|
||||
assertFalse(sanitized.contains("<figcaption"));
|
||||
assertTrue(sanitized.contains("<img"));
|
||||
assertTrue(sanitized.contains("/image.jpg"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should use figcaption as alt text if alt is missing")
|
||||
void testFigcaptionAsAltText() {
|
||||
String figure = "<figure><img src='/image.jpg'><figcaption>My Caption</figcaption></figure>";
|
||||
String sanitized = sanitizationService.sanitize(figure);
|
||||
|
||||
assertTrue(sanitized.contains("<img"));
|
||||
assertTrue(sanitized.contains("alt="));
|
||||
assertTrue(sanitized.contains("My Caption"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should remove figure without images")
|
||||
void testRemoveFigureWithoutImages() {
|
||||
String figure = "<p>Before</p><figure><figcaption>Caption only</figcaption></figure><p>After</p>";
|
||||
String sanitized = sanitizationService.sanitize(figure);
|
||||
|
||||
assertFalse(sanitized.contains("<figure"));
|
||||
assertFalse(sanitized.contains("Caption only"));
|
||||
assertTrue(sanitized.contains("Before"));
|
||||
assertTrue(sanitized.contains("After"));
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Edge Cases and Utility Methods
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle null input")
|
||||
void testNullInput() {
|
||||
String sanitized = sanitizationService.sanitize(null);
|
||||
assertEquals("", sanitized);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle empty input")
|
||||
void testEmptyInput() {
|
||||
String sanitized = sanitizationService.sanitize("");
|
||||
assertEquals("", sanitized);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle whitespace-only input")
|
||||
void testWhitespaceInput() {
|
||||
String sanitized = sanitizationService.sanitize(" ");
|
||||
assertEquals("", sanitized);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should extract plain text from HTML")
|
||||
void testExtractPlainText() {
|
||||
String html = "<p>Hello <strong>World</strong></p>";
|
||||
String plainText = sanitizationService.extractPlainText(html);
|
||||
|
||||
assertEquals("Hello World", plainText);
|
||||
assertFalse(plainText.contains("<"));
|
||||
assertFalse(plainText.contains(">"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should detect clean HTML")
|
||||
void testIsCleanWithCleanHtml() {
|
||||
String clean = "<p>Safe content</p>";
|
||||
assertTrue(sanitizationService.isClean(clean));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should detect malicious HTML")
|
||||
void testIsCleanWithMaliciousHtml() {
|
||||
String malicious = "<p>Content</p><script>alert('XSS')</script>";
|
||||
assertFalse(sanitizationService.isClean(malicious));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should sanitize and extract text")
|
||||
void testSanitizeAndExtractText() {
|
||||
String html = "<p>Hello</p><script>alert('XSS')</script>";
|
||||
String result = sanitizationService.sanitizeAndExtractText(html);
|
||||
|
||||
assertEquals("Hello", result);
|
||||
assertFalse(result.contains("script"));
|
||||
assertFalse(result.contains("XSS"));
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Configuration Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should load and provide configuration")
|
||||
void testGetConfiguration() {
|
||||
HtmlSanitizationConfigDto config = sanitizationService.getConfiguration();
|
||||
|
||||
assertNotNull(config);
|
||||
assertNotNull(config.getAllowedTags());
|
||||
assertFalse(config.getAllowedTags().isEmpty());
|
||||
assertTrue(config.getAllowedTags().contains("p"));
|
||||
assertTrue(config.getAllowedTags().contains("a"));
|
||||
assertTrue(config.getAllowedTags().contains("img"));
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Complex Attack Vectors
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should prevent nested XSS attacks")
|
||||
void testNestedXssAttacks() {
|
||||
String nested = "<p><script><script>alert('XSS')</script></script></p>";
|
||||
String sanitized = sanitizationService.sanitize(nested);
|
||||
|
||||
assertFalse(sanitized.contains("<script"));
|
||||
assertFalse(sanitized.contains("alert"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should prevent encoded XSS attacks")
|
||||
void testEncodedXssAttacks() {
|
||||
String encoded = "<img src=x onerror='alert(1)'>";
|
||||
String sanitized = sanitizationService.sanitize(encoded);
|
||||
|
||||
assertFalse(sanitized.contains("onerror"));
|
||||
assertFalse(sanitized.contains("alert"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should prevent CSS injection attacks")
|
||||
void testCssInjectionPrevention() {
|
||||
String cssInjection = "<p style='background:url(javascript:alert(1))'>Text</p>";
|
||||
String sanitized = sanitizationService.sanitize(cssInjection);
|
||||
|
||||
assertFalse(sanitized.toLowerCase().contains("javascript:"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should preserve multiple safe elements")
|
||||
void testComplexSafeHtml() {
|
||||
String complex = "<div><h1>Title</h1><p>Paragraph with <strong>bold</strong> and " +
|
||||
"<em>italic</em></p><ul><li>Item 1</li><li>Item 2</li></ul>" +
|
||||
"<img src='/image.jpg' alt='Image'></div>";
|
||||
String sanitized = sanitizationService.sanitize(complex);
|
||||
|
||||
assertTrue(sanitized.contains("<div"));
|
||||
assertTrue(sanitized.contains("<h1>"));
|
||||
assertTrue(sanitized.contains("<p>"));
|
||||
assertTrue(sanitized.contains("<strong>"));
|
||||
assertTrue(sanitized.contains("<em>"));
|
||||
assertTrue(sanitized.contains("<ul>"));
|
||||
assertTrue(sanitized.contains("<li>"));
|
||||
assertTrue(sanitized.contains("<img"));
|
||||
assertTrue(sanitized.contains("Title"));
|
||||
assertTrue(sanitized.contains("Item 1"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle malformed HTML gracefully")
|
||||
void testMalformedHtml() {
|
||||
String malformed = "<p>Unclosed paragraph<div>Nested incorrectly</p></div>";
|
||||
String sanitized = sanitizationService.sanitize(malformed);
|
||||
|
||||
// Should not throw exception and should return something
|
||||
assertNotNull(sanitized);
|
||||
assertTrue(sanitized.contains("Unclosed paragraph"));
|
||||
assertTrue(sanitized.contains("Nested incorrectly"));
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,621 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.entity.Author;
|
||||
import com.storycove.entity.Collection;
|
||||
import com.storycove.entity.Story;
|
||||
import org.junit.jupiter.api.BeforeEach;
|
||||
import org.junit.jupiter.api.Test;
|
||||
import org.junit.jupiter.api.DisplayName;
|
||||
import org.junit.jupiter.api.extension.ExtendWith;
|
||||
import org.junit.jupiter.api.io.TempDir;
|
||||
import org.mockito.InjectMocks;
|
||||
import org.mockito.Mock;
|
||||
import org.mockito.junit.jupiter.MockitoExtension;
|
||||
import org.springframework.mock.web.MockMultipartFile;
|
||||
import org.springframework.test.util.ReflectionTestUtils;
|
||||
import org.springframework.web.multipart.MultipartFile;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.nio.file.Files;
|
||||
import java.nio.file.Path;
|
||||
import java.nio.file.Paths;
|
||||
import java.util.ArrayList;
|
||||
import java.util.List;
|
||||
import java.util.Optional;
|
||||
import java.util.UUID;
|
||||
|
||||
import static org.junit.jupiter.api.Assertions.*;
|
||||
import static org.mockito.ArgumentMatchers.*;
|
||||
import static org.mockito.Mockito.*;
|
||||
|
||||
/**
|
||||
* Tests for ImageService.
|
||||
* Note: Some tests use mocking due to filesystem and network dependencies.
|
||||
* Full integration tests would be in a separate test class.
|
||||
*/
|
||||
@ExtendWith(MockitoExtension.class)
|
||||
class ImageServiceTest {
|
||||
|
||||
@Mock
|
||||
private LibraryService libraryService;
|
||||
|
||||
@Mock
|
||||
private StoryService storyService;
|
||||
|
||||
@Mock
|
||||
private AuthorService authorService;
|
||||
|
||||
@Mock
|
||||
private CollectionService collectionService;
|
||||
|
||||
@InjectMocks
|
||||
private ImageService imageService;
|
||||
|
||||
@TempDir
|
||||
Path tempDir;
|
||||
|
||||
private MultipartFile validImageFile;
|
||||
private UUID testStoryId;
|
||||
|
||||
@BeforeEach
|
||||
void setUp() throws IOException {
|
||||
testStoryId = UUID.randomUUID();
|
||||
|
||||
// Create a simple valid PNG file (1x1 pixel)
|
||||
byte[] pngData = createMinimalPngData();
|
||||
validImageFile = new MockMultipartFile(
|
||||
"image", "test.png", "image/png", pngData
|
||||
);
|
||||
|
||||
// Configure ImageService with test values
|
||||
when(libraryService.getCurrentImagePath()).thenReturn("/default");
|
||||
when(libraryService.getCurrentLibraryId()).thenReturn("default");
|
||||
|
||||
// Set image service properties using reflection
|
||||
ReflectionTestUtils.setField(imageService, "baseUploadDir", tempDir.toString());
|
||||
ReflectionTestUtils.setField(imageService, "coverMaxWidth", 800);
|
||||
ReflectionTestUtils.setField(imageService, "coverMaxHeight", 1200);
|
||||
ReflectionTestUtils.setField(imageService, "avatarMaxSize", 400);
|
||||
ReflectionTestUtils.setField(imageService, "maxFileSize", 5242880L);
|
||||
ReflectionTestUtils.setField(imageService, "publicUrl", "http://localhost:6925");
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// File Validation Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should reject null file")
|
||||
void testRejectNullFile() {
|
||||
// Act & Assert
|
||||
assertThrows(IllegalArgumentException.class, () -> {
|
||||
imageService.uploadImage(null, ImageService.ImageType.COVER);
|
||||
});
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should reject empty file")
|
||||
void testRejectEmptyFile() {
|
||||
// Arrange
|
||||
MockMultipartFile emptyFile = new MockMultipartFile(
|
||||
"image", "test.png", "image/png", new byte[0]
|
||||
);
|
||||
|
||||
// Act & Assert
|
||||
assertThrows(IllegalArgumentException.class, () -> {
|
||||
imageService.uploadImage(emptyFile, ImageService.ImageType.COVER);
|
||||
});
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should reject file with invalid content type")
|
||||
void testRejectInvalidContentType() {
|
||||
// Arrange
|
||||
MockMultipartFile invalidFile = new MockMultipartFile(
|
||||
"image", "test.pdf", "application/pdf", "fake pdf content".getBytes()
|
||||
);
|
||||
|
||||
// Act & Assert
|
||||
assertThrows(IllegalArgumentException.class, () -> {
|
||||
imageService.uploadImage(invalidFile, ImageService.ImageType.COVER);
|
||||
});
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should reject file with invalid extension")
|
||||
void testRejectInvalidExtension() {
|
||||
// Arrange
|
||||
MockMultipartFile invalidFile = new MockMultipartFile(
|
||||
"image", "test.gif", "image/png", createMinimalPngData()
|
||||
);
|
||||
|
||||
// Act & Assert
|
||||
assertThrows(IllegalArgumentException.class, () -> {
|
||||
imageService.uploadImage(invalidFile, ImageService.ImageType.COVER);
|
||||
});
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should reject file exceeding size limit")
|
||||
void testRejectOversizedFile() {
|
||||
// Arrange
|
||||
// Create file larger than 5MB limit
|
||||
byte[] largeData = new byte[6 * 1024 * 1024]; // 6MB
|
||||
MockMultipartFile largeFile = new MockMultipartFile(
|
||||
"image", "large.png", "image/png", largeData
|
||||
);
|
||||
|
||||
// Act & Assert
|
||||
assertThrows(IllegalArgumentException.class, () -> {
|
||||
imageService.uploadImage(largeFile, ImageService.ImageType.COVER);
|
||||
});
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should accept JPG files")
|
||||
void testAcceptJpgFile() {
|
||||
// Arrange
|
||||
MockMultipartFile jpgFile = new MockMultipartFile(
|
||||
"image", "test.jpg", "image/jpeg", createMinimalPngData() // Using PNG data for test simplicity
|
||||
);
|
||||
|
||||
// Note: This test will fail at image processing stage since we're not providing real JPG data
|
||||
// but it validates that JPG is accepted as a file type
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should accept PNG files")
|
||||
void testAcceptPngFile() {
|
||||
// PNG is tested in setUp, this validates the behavior
|
||||
assertNotNull(validImageFile);
|
||||
assertEquals("image/png", validImageFile.getContentType());
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Image Type Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should have correct directory for COVER type")
|
||||
void testCoverImageDirectory() {
|
||||
assertEquals("covers", ImageService.ImageType.COVER.getDirectory());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should have correct directory for AVATAR type")
|
||||
void testAvatarImageDirectory() {
|
||||
assertEquals("avatars", ImageService.ImageType.AVATAR.getDirectory());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should have correct directory for CONTENT type")
|
||||
void testContentImageDirectory() {
|
||||
assertEquals("content", ImageService.ImageType.CONTENT.getDirectory());
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Image Existence Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should return false for null image path")
|
||||
void testImageExistsWithNullPath() {
|
||||
assertFalse(imageService.imageExists(null));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should return false for empty image path")
|
||||
void testImageExistsWithEmptyPath() {
|
||||
assertFalse(imageService.imageExists(""));
|
||||
assertFalse(imageService.imageExists(" "));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should return false for non-existent image")
|
||||
void testImageExistsWithNonExistentPath() {
|
||||
assertFalse(imageService.imageExists("covers/non-existent.jpg"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should return false for null library ID in imageExistsInLibrary")
|
||||
void testImageExistsInLibraryWithNullLibraryId() {
|
||||
assertFalse(imageService.imageExistsInLibrary("covers/test.jpg", null));
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Image Deletion Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should return false when deleting null path")
|
||||
void testDeleteNullPath() {
|
||||
assertFalse(imageService.deleteImage(null));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should return false when deleting empty path")
|
||||
void testDeleteEmptyPath() {
|
||||
assertFalse(imageService.deleteImage(""));
|
||||
assertFalse(imageService.deleteImage(" "));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should return false when deleting non-existent image")
|
||||
void testDeleteNonExistentImage() {
|
||||
assertFalse(imageService.deleteImage("covers/non-existent.jpg"));
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Content Image Processing Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should process content with no images")
|
||||
void testProcessContentWithNoImages() {
|
||||
// Arrange
|
||||
String htmlContent = "<p>This is plain text with no images</p>";
|
||||
|
||||
// Act
|
||||
ImageService.ContentImageProcessingResult result =
|
||||
imageService.processContentImages(htmlContent, testStoryId);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertEquals(htmlContent, result.getProcessedContent());
|
||||
assertTrue(result.getDownloadedImages().isEmpty());
|
||||
assertFalse(result.hasWarnings());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle null content gracefully")
|
||||
void testProcessNullContent() {
|
||||
// Act
|
||||
ImageService.ContentImageProcessingResult result =
|
||||
imageService.processContentImages(null, testStoryId);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertNull(result.getProcessedContent());
|
||||
assertTrue(result.getDownloadedImages().isEmpty());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle empty content gracefully")
|
||||
void testProcessEmptyContent() {
|
||||
// Act
|
||||
ImageService.ContentImageProcessingResult result =
|
||||
imageService.processContentImages("", testStoryId);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertEquals("", result.getProcessedContent());
|
||||
assertTrue(result.getDownloadedImages().isEmpty());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should skip data URLs")
|
||||
void testSkipDataUrls() {
|
||||
// Arrange
|
||||
String htmlWithDataUrl = "<p><img src=\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAADUlEQVR42mNk+M9QDwADhgGAWjR9awAAAABJRU5ErkJggg==\"></p>";
|
||||
|
||||
// Act
|
||||
ImageService.ContentImageProcessingResult result =
|
||||
imageService.processContentImages(htmlWithDataUrl, testStoryId);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(result.getDownloadedImages().isEmpty());
|
||||
assertFalse(result.hasWarnings());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should skip local/relative URLs")
|
||||
void testSkipLocalUrls() {
|
||||
// Arrange
|
||||
String htmlWithLocalUrl = "<p><img src=\"/images/local-image.jpg\"></p>";
|
||||
|
||||
// Act
|
||||
ImageService.ContentImageProcessingResult result =
|
||||
imageService.processContentImages(htmlWithLocalUrl, testStoryId);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(result.getDownloadedImages().isEmpty());
|
||||
assertFalse(result.hasWarnings());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should skip images from same application")
|
||||
void testSkipApplicationUrls() {
|
||||
// Arrange
|
||||
String htmlWithAppUrl = "<p><img src=\"/api/files/images/default/covers/test.jpg\"></p>";
|
||||
|
||||
// Act
|
||||
ImageService.ContentImageProcessingResult result =
|
||||
imageService.processContentImages(htmlWithAppUrl, testStoryId);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(result.getDownloadedImages().isEmpty());
|
||||
assertFalse(result.hasWarnings());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle external URL gracefully when download fails")
|
||||
void testHandleDownloadFailure() {
|
||||
// Arrange
|
||||
String htmlWithExternalUrl = "<p><img src=\"http://example.com/non-existent-image.jpg\"></p>";
|
||||
|
||||
// Act
|
||||
ImageService.ContentImageProcessingResult result =
|
||||
imageService.processContentImages(htmlWithExternalUrl, testStoryId);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(result.hasWarnings());
|
||||
assertEquals(1, result.getWarnings().size());
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Content Image Cleanup Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should perform dry run cleanup without deleting")
|
||||
void testDryRunCleanup() {
|
||||
// Arrange
|
||||
when(storyService.findAllWithAssociations()).thenReturn(new ArrayList<>());
|
||||
when(authorService.findAll()).thenReturn(new ArrayList<>());
|
||||
when(collectionService.findAllWithTags()).thenReturn(new ArrayList<>());
|
||||
|
||||
// Act
|
||||
ImageService.ContentImageCleanupResult result =
|
||||
imageService.cleanupOrphanedContentImages(true);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(result.isDryRun());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle cleanup with no content directory")
|
||||
void testCleanupWithNoContentDirectory() {
|
||||
// Arrange
|
||||
when(storyService.findAllWithAssociations()).thenReturn(new ArrayList<>());
|
||||
when(authorService.findAll()).thenReturn(new ArrayList<>());
|
||||
when(collectionService.findAllWithTags()).thenReturn(new ArrayList<>());
|
||||
|
||||
// Act
|
||||
ImageService.ContentImageCleanupResult result =
|
||||
imageService.cleanupOrphanedContentImages(false);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertEquals(0, result.getTotalReferencedImages());
|
||||
assertTrue(result.getOrphanedImages().isEmpty());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should collect image references from stories")
|
||||
void testCollectImageReferences() {
|
||||
// Arrange
|
||||
Story story = new Story();
|
||||
story.setId(testStoryId);
|
||||
story.setContentHtml("<p><img src=\"/api/files/images/default/content/" + testStoryId + "/test-image.jpg\"></p>");
|
||||
|
||||
when(storyService.findAllWithAssociations()).thenReturn(List.of(story));
|
||||
when(authorService.findAll()).thenReturn(new ArrayList<>());
|
||||
when(collectionService.findAllWithTags()).thenReturn(new ArrayList<>());
|
||||
|
||||
// Act
|
||||
ImageService.ContentImageCleanupResult result =
|
||||
imageService.cleanupOrphanedContentImages(true);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertTrue(result.getTotalReferencedImages() > 0);
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Cleanup Result Formatting Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should format bytes correctly")
|
||||
void testFormatBytes() {
|
||||
ImageService.ContentImageCleanupResult result =
|
||||
new ImageService.ContentImageCleanupResult(
|
||||
new ArrayList<>(), 512, 0, 0, new ArrayList<>(), true
|
||||
);
|
||||
|
||||
assertEquals("512 B", result.getFormattedSize());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should format kilobytes correctly")
|
||||
void testFormatKilobytes() {
|
||||
ImageService.ContentImageCleanupResult result =
|
||||
new ImageService.ContentImageCleanupResult(
|
||||
new ArrayList<>(), 1536, 0, 0, new ArrayList<>(), true
|
||||
);
|
||||
|
||||
assertTrue(result.getFormattedSize().contains("KB"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should format megabytes correctly")
|
||||
void testFormatMegabytes() {
|
||||
ImageService.ContentImageCleanupResult result =
|
||||
new ImageService.ContentImageCleanupResult(
|
||||
new ArrayList<>(), 1024 * 1024 * 5, 0, 0, new ArrayList<>(), true
|
||||
);
|
||||
|
||||
assertTrue(result.getFormattedSize().contains("MB"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should format gigabytes correctly")
|
||||
void testFormatGigabytes() {
|
||||
ImageService.ContentImageCleanupResult result =
|
||||
new ImageService.ContentImageCleanupResult(
|
||||
new ArrayList<>(), 1024L * 1024L * 1024L * 2L, 0, 0, new ArrayList<>(), true
|
||||
);
|
||||
|
||||
assertTrue(result.getFormattedSize().contains("GB"));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should track cleanup errors")
|
||||
void testCleanupErrors() {
|
||||
List<String> errors = new ArrayList<>();
|
||||
errors.add("Test error 1");
|
||||
errors.add("Test error 2");
|
||||
|
||||
ImageService.ContentImageCleanupResult result =
|
||||
new ImageService.ContentImageCleanupResult(
|
||||
new ArrayList<>(), 0, 0, 0, errors, false
|
||||
);
|
||||
|
||||
assertTrue(result.hasErrors());
|
||||
assertEquals(2, result.getErrors().size());
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Content Image Processing Result Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should create processing result with warnings")
|
||||
void testProcessingResultWithWarnings() {
|
||||
List<String> warnings = List.of("Warning 1", "Warning 2");
|
||||
ImageService.ContentImageProcessingResult result =
|
||||
new ImageService.ContentImageProcessingResult(
|
||||
"<p>Content</p>", warnings, new ArrayList<>()
|
||||
);
|
||||
|
||||
assertTrue(result.hasWarnings());
|
||||
assertEquals(2, result.getWarnings().size());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should create processing result without warnings")
|
||||
void testProcessingResultWithoutWarnings() {
|
||||
ImageService.ContentImageProcessingResult result =
|
||||
new ImageService.ContentImageProcessingResult(
|
||||
"<p>Content</p>", new ArrayList<>(), new ArrayList<>()
|
||||
);
|
||||
|
||||
assertFalse(result.hasWarnings());
|
||||
assertEquals("<p>Content</p>", result.getProcessedContent());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should track downloaded images")
|
||||
void testTrackDownloadedImages() {
|
||||
List<String> downloadedImages = List.of(
|
||||
"content/story1/image1.jpg",
|
||||
"content/story1/image2.jpg"
|
||||
);
|
||||
|
||||
ImageService.ContentImageProcessingResult result =
|
||||
new ImageService.ContentImageProcessingResult(
|
||||
"<p>Content</p>", new ArrayList<>(), downloadedImages
|
||||
);
|
||||
|
||||
assertEquals(2, result.getDownloadedImages().size());
|
||||
assertTrue(result.getDownloadedImages().contains("content/story1/image1.jpg"));
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Story Content Deletion Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should delete content images for story")
|
||||
void testDeleteContentImages() {
|
||||
// Act - Should not throw exception even if directory doesn't exist
|
||||
assertDoesNotThrow(() -> {
|
||||
imageService.deleteContentImages(testStoryId);
|
||||
});
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Edge Cases
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle HTML with multiple images")
|
||||
void testMultipleImages() {
|
||||
// Arrange
|
||||
String html = "<p><img src=\"/local1.jpg\"><img src=\"/local2.jpg\"></p>";
|
||||
|
||||
// Act
|
||||
ImageService.ContentImageProcessingResult result =
|
||||
imageService.processContentImages(html, testStoryId);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
// Local images should be skipped
|
||||
assertTrue(result.getDownloadedImages().isEmpty());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle malformed HTML gracefully")
|
||||
void testMalformedHtml() {
|
||||
// Arrange
|
||||
String malformedHtml = "<p>Unclosed <img src=\"/test.jpg\" <p>";
|
||||
|
||||
// Act
|
||||
ImageService.ContentImageProcessingResult result =
|
||||
imageService.processContentImages(malformedHtml, testStoryId);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle very long content")
|
||||
void testVeryLongContent() {
|
||||
// Arrange
|
||||
StringBuilder longContent = new StringBuilder();
|
||||
for (int i = 0; i < 10000; i++) {
|
||||
longContent.append("<p>Paragraph ").append(i).append("</p>");
|
||||
}
|
||||
|
||||
// Act
|
||||
ImageService.ContentImageProcessingResult result =
|
||||
imageService.processContentImages(longContent.toString(), testStoryId);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Helper Methods
|
||||
// ========================================
|
||||
|
||||
/**
|
||||
* Create minimal valid PNG data for testing.
|
||||
* This is a 1x1 pixel transparent PNG image.
|
||||
*/
|
||||
private byte[] createMinimalPngData() {
|
||||
return new byte[]{
|
||||
(byte) 0x89, 'P', 'N', 'G', '\r', '\n', 0x1A, '\n', // PNG signature
|
||||
0x00, 0x00, 0x00, 0x0D, // IHDR chunk length
|
||||
'I', 'H', 'D', 'R', // IHDR chunk type
|
||||
0x00, 0x00, 0x00, 0x01, // Width: 1
|
||||
0x00, 0x00, 0x00, 0x01, // Height: 1
|
||||
0x08, // Bit depth: 8
|
||||
0x06, // Color type: RGBA
|
||||
0x00, 0x00, 0x00, // Compression, filter, interlace
|
||||
0x1F, 0x15, (byte) 0xC4, (byte) 0x89, // CRC
|
||||
0x00, 0x00, 0x00, 0x0A, // IDAT chunk length
|
||||
'I', 'D', 'A', 'T', // IDAT chunk type
|
||||
0x78, (byte) 0x9C, 0x62, 0x00, 0x01, 0x00, 0x00, 0x05, 0x00, 0x01, // Image data
|
||||
0x0D, 0x0A, 0x2D, (byte) 0xB4, // CRC
|
||||
0x00, 0x00, 0x00, 0x00, // IEND chunk length
|
||||
'I', 'E', 'N', 'D', // IEND chunk type
|
||||
(byte) 0xAE, 0x42, 0x60, (byte) 0x82 // CRC
|
||||
};
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,176 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.entity.RefreshToken;
|
||||
import com.storycove.repository.RefreshTokenRepository;
|
||||
import com.storycove.util.JwtUtil;
|
||||
import org.junit.jupiter.api.Test;
|
||||
import org.junit.jupiter.api.extension.ExtendWith;
|
||||
import org.mockito.InjectMocks;
|
||||
import org.mockito.Mock;
|
||||
import org.mockito.junit.jupiter.MockitoExtension;
|
||||
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.Optional;
|
||||
|
||||
import static org.junit.jupiter.api.Assertions.*;
|
||||
import static org.mockito.ArgumentMatchers.*;
|
||||
import static org.mockito.Mockito.*;
|
||||
|
||||
@ExtendWith(MockitoExtension.class)
|
||||
class RefreshTokenServiceTest {
|
||||
|
||||
@Mock
|
||||
private RefreshTokenRepository refreshTokenRepository;
|
||||
|
||||
@Mock
|
||||
private JwtUtil jwtUtil;
|
||||
|
||||
@InjectMocks
|
||||
private RefreshTokenService refreshTokenService;
|
||||
|
||||
@Test
|
||||
void testCreateRefreshToken() {
|
||||
// Arrange
|
||||
String libraryId = "library-123";
|
||||
String userAgent = "Mozilla/5.0";
|
||||
String ipAddress = "192.168.1.1";
|
||||
|
||||
when(jwtUtil.getRefreshExpirationMs()).thenReturn(1209600000L); // 14 days
|
||||
when(jwtUtil.generateRefreshToken()).thenReturn("test-refresh-token-12345");
|
||||
|
||||
RefreshToken savedToken = new RefreshToken("test-refresh-token-12345",
|
||||
LocalDateTime.now().plusDays(14), libraryId, userAgent, ipAddress);
|
||||
|
||||
when(refreshTokenRepository.save(any(RefreshToken.class))).thenReturn(savedToken);
|
||||
|
||||
// Act
|
||||
RefreshToken result = refreshTokenService.createRefreshToken(libraryId, userAgent, ipAddress);
|
||||
|
||||
// Assert
|
||||
assertNotNull(result);
|
||||
assertEquals("test-refresh-token-12345", result.getToken());
|
||||
assertEquals(libraryId, result.getLibraryId());
|
||||
assertEquals(userAgent, result.getUserAgent());
|
||||
assertEquals(ipAddress, result.getIpAddress());
|
||||
|
||||
verify(jwtUtil).generateRefreshToken();
|
||||
verify(refreshTokenRepository).save(any(RefreshToken.class));
|
||||
}
|
||||
|
||||
@Test
|
||||
void testFindByToken() {
|
||||
// Arrange
|
||||
String tokenString = "test-token";
|
||||
RefreshToken token = new RefreshToken(tokenString,
|
||||
LocalDateTime.now().plusDays(14), "lib-1", "UA", "127.0.0.1");
|
||||
|
||||
when(refreshTokenRepository.findByToken(tokenString)).thenReturn(Optional.of(token));
|
||||
|
||||
// Act
|
||||
Optional<RefreshToken> result = refreshTokenService.findByToken(tokenString);
|
||||
|
||||
// Assert
|
||||
assertTrue(result.isPresent());
|
||||
assertEquals(tokenString, result.get().getToken());
|
||||
|
||||
verify(refreshTokenRepository).findByToken(tokenString);
|
||||
}
|
||||
|
||||
@Test
|
||||
void testVerifyRefreshToken_Valid() {
|
||||
// Arrange
|
||||
String tokenString = "valid-token";
|
||||
RefreshToken token = new RefreshToken(tokenString,
|
||||
LocalDateTime.now().plusDays(14), "lib-1", "UA", "127.0.0.1");
|
||||
|
||||
when(refreshTokenRepository.findByToken(tokenString)).thenReturn(Optional.of(token));
|
||||
|
||||
// Act
|
||||
Optional<RefreshToken> result = refreshTokenService.verifyRefreshToken(tokenString);
|
||||
|
||||
// Assert
|
||||
assertTrue(result.isPresent());
|
||||
assertTrue(result.get().isValid());
|
||||
}
|
||||
|
||||
@Test
|
||||
void testVerifyRefreshToken_Expired() {
|
||||
// Arrange
|
||||
String tokenString = "expired-token";
|
||||
RefreshToken token = new RefreshToken(tokenString,
|
||||
LocalDateTime.now().minusDays(1), "lib-1", "UA", "127.0.0.1"); // Expired
|
||||
|
||||
when(refreshTokenRepository.findByToken(tokenString)).thenReturn(Optional.of(token));
|
||||
|
||||
// Act
|
||||
Optional<RefreshToken> result = refreshTokenService.verifyRefreshToken(tokenString);
|
||||
|
||||
// Assert
|
||||
assertFalse(result.isPresent()); // Expired tokens should be filtered out
|
||||
}
|
||||
|
||||
@Test
|
||||
void testVerifyRefreshToken_Revoked() {
|
||||
// Arrange
|
||||
String tokenString = "revoked-token";
|
||||
RefreshToken token = new RefreshToken(tokenString,
|
||||
LocalDateTime.now().plusDays(14), "lib-1", "UA", "127.0.0.1");
|
||||
token.setRevokedAt(LocalDateTime.now()); // Revoked
|
||||
|
||||
when(refreshTokenRepository.findByToken(tokenString)).thenReturn(Optional.of(token));
|
||||
|
||||
// Act
|
||||
Optional<RefreshToken> result = refreshTokenService.verifyRefreshToken(tokenString);
|
||||
|
||||
// Assert
|
||||
assertFalse(result.isPresent()); // Revoked tokens should be filtered out
|
||||
}
|
||||
|
||||
@Test
|
||||
void testRevokeToken() {
|
||||
// Arrange
|
||||
RefreshToken token = new RefreshToken("token",
|
||||
LocalDateTime.now().plusDays(14), "lib-1", "UA", "127.0.0.1");
|
||||
|
||||
when(refreshTokenRepository.save(any(RefreshToken.class))).thenReturn(token);
|
||||
|
||||
// Act
|
||||
refreshTokenService.revokeToken(token);
|
||||
|
||||
// Assert
|
||||
assertNotNull(token.getRevokedAt());
|
||||
assertTrue(token.isRevoked());
|
||||
|
||||
verify(refreshTokenRepository).save(token);
|
||||
}
|
||||
|
||||
@Test
|
||||
void testRevokeAllByLibraryId() {
|
||||
// Arrange
|
||||
String libraryId = "library-123";
|
||||
|
||||
// Act
|
||||
refreshTokenService.revokeAllByLibraryId(libraryId);
|
||||
|
||||
// Assert
|
||||
verify(refreshTokenRepository).revokeAllByLibraryId(eq(libraryId), any(LocalDateTime.class));
|
||||
}
|
||||
|
||||
@Test
|
||||
void testRevokeAll() {
|
||||
// Act
|
||||
refreshTokenService.revokeAll();
|
||||
|
||||
// Assert
|
||||
verify(refreshTokenRepository).revokeAll(any(LocalDateTime.class));
|
||||
}
|
||||
|
||||
@Test
|
||||
void testCleanupExpiredTokens() {
|
||||
// Act
|
||||
refreshTokenService.cleanupExpiredTokens();
|
||||
|
||||
// Assert
|
||||
verify(refreshTokenRepository).deleteExpiredTokens(any(LocalDateTime.class));
|
||||
}
|
||||
}
|
||||
@@ -33,6 +33,9 @@ class StoryServiceTest {
|
||||
@Mock
|
||||
private ReadingPositionRepository readingPositionRepository;
|
||||
|
||||
@Mock
|
||||
private SearchServiceAdapter searchServiceAdapter;
|
||||
|
||||
private StoryService storyService;
|
||||
private Story testStory;
|
||||
private UUID testId;
|
||||
@@ -44,16 +47,16 @@ class StoryServiceTest {
|
||||
testStory.setId(testId);
|
||||
testStory.setContentHtml("<p>Test content for reading progress tracking</p>");
|
||||
|
||||
// Create StoryService with only required repositories, all services can be null for these tests
|
||||
// Create StoryService with mocked dependencies
|
||||
storyService = new StoryService(
|
||||
storyRepository,
|
||||
tagRepository,
|
||||
readingPositionRepository, // added for foreign key constraint handling
|
||||
readingPositionRepository,
|
||||
null, // authorService - not needed for reading progress tests
|
||||
null, // tagService - not needed for reading progress tests
|
||||
null, // seriesService - not needed for reading progress tests
|
||||
null, // sanitizationService - not needed for reading progress tests
|
||||
null // typesenseService - will test both with and without
|
||||
searchServiceAdapter
|
||||
);
|
||||
}
|
||||
|
||||
|
||||
490
backend/src/test/java/com/storycove/service/TagServiceTest.java
Normal file
490
backend/src/test/java/com/storycove/service/TagServiceTest.java
Normal file
@@ -0,0 +1,490 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.entity.Story;
|
||||
import com.storycove.entity.Tag;
|
||||
import com.storycove.entity.TagAlias;
|
||||
import com.storycove.repository.TagAliasRepository;
|
||||
import com.storycove.repository.TagRepository;
|
||||
import com.storycove.service.exception.DuplicateResourceException;
|
||||
import com.storycove.service.exception.ResourceNotFoundException;
|
||||
import org.junit.jupiter.api.BeforeEach;
|
||||
import org.junit.jupiter.api.Test;
|
||||
import org.junit.jupiter.api.DisplayName;
|
||||
import org.junit.jupiter.api.extension.ExtendWith;
|
||||
import org.mockito.InjectMocks;
|
||||
import org.mockito.Mock;
|
||||
import org.mockito.junit.jupiter.MockitoExtension;
|
||||
|
||||
import java.util.*;
|
||||
|
||||
import static org.junit.jupiter.api.Assertions.*;
|
||||
import static org.mockito.ArgumentMatchers.*;
|
||||
import static org.mockito.Mockito.*;
|
||||
|
||||
@ExtendWith(MockitoExtension.class)
|
||||
class TagServiceTest {
|
||||
|
||||
@Mock
|
||||
private TagRepository tagRepository;
|
||||
|
||||
@Mock
|
||||
private TagAliasRepository tagAliasRepository;
|
||||
|
||||
@InjectMocks
|
||||
private TagService tagService;
|
||||
|
||||
private Tag testTag;
|
||||
private UUID tagId;
|
||||
|
||||
@BeforeEach
|
||||
void setUp() {
|
||||
tagId = UUID.randomUUID();
|
||||
testTag = new Tag();
|
||||
testTag.setId(tagId);
|
||||
testTag.setName("fantasy");
|
||||
testTag.setStories(new HashSet<>());
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Basic CRUD Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should find tag by ID")
|
||||
void testFindById() {
|
||||
when(tagRepository.findById(tagId)).thenReturn(Optional.of(testTag));
|
||||
|
||||
Tag result = tagService.findById(tagId);
|
||||
|
||||
assertNotNull(result);
|
||||
assertEquals(tagId, result.getId());
|
||||
assertEquals("fantasy", result.getName());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should throw exception when tag not found by ID")
|
||||
void testFindByIdNotFound() {
|
||||
when(tagRepository.findById(any())).thenReturn(Optional.empty());
|
||||
|
||||
assertThrows(ResourceNotFoundException.class, () -> {
|
||||
tagService.findById(UUID.randomUUID());
|
||||
});
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should find tag by name")
|
||||
void testFindByName() {
|
||||
when(tagRepository.findByName("fantasy")).thenReturn(Optional.of(testTag));
|
||||
|
||||
Tag result = tagService.findByName("fantasy");
|
||||
|
||||
assertNotNull(result);
|
||||
assertEquals("fantasy", result.getName());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should create new tag")
|
||||
void testCreateTag() {
|
||||
when(tagRepository.existsByName("fantasy")).thenReturn(false);
|
||||
when(tagRepository.save(any(Tag.class))).thenReturn(testTag);
|
||||
|
||||
Tag result = tagService.create(testTag);
|
||||
|
||||
assertNotNull(result);
|
||||
verify(tagRepository).save(testTag);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should throw exception when creating duplicate tag")
|
||||
void testCreateDuplicateTag() {
|
||||
when(tagRepository.existsByName("fantasy")).thenReturn(true);
|
||||
|
||||
assertThrows(DuplicateResourceException.class, () -> {
|
||||
tagService.create(testTag);
|
||||
});
|
||||
|
||||
verify(tagRepository, never()).save(any());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should update existing tag")
|
||||
void testUpdateTag() {
|
||||
Tag updates = new Tag();
|
||||
updates.setName("sci-fi");
|
||||
|
||||
when(tagRepository.findById(tagId)).thenReturn(Optional.of(testTag));
|
||||
when(tagRepository.existsByName("sci-fi")).thenReturn(false);
|
||||
when(tagRepository.save(any(Tag.class))).thenReturn(testTag);
|
||||
|
||||
Tag result = tagService.update(tagId, updates);
|
||||
|
||||
assertNotNull(result);
|
||||
verify(tagRepository).save(testTag);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should throw exception when updating to duplicate name")
|
||||
void testUpdateToDuplicateName() {
|
||||
Tag updates = new Tag();
|
||||
updates.setName("sci-fi");
|
||||
|
||||
when(tagRepository.findById(tagId)).thenReturn(Optional.of(testTag));
|
||||
when(tagRepository.existsByName("sci-fi")).thenReturn(true);
|
||||
|
||||
assertThrows(DuplicateResourceException.class, () -> {
|
||||
tagService.update(tagId, updates);
|
||||
});
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should delete unused tag")
|
||||
void testDeleteUnusedTag() {
|
||||
when(tagRepository.findById(tagId)).thenReturn(Optional.of(testTag));
|
||||
doNothing().when(tagRepository).delete(testTag);
|
||||
|
||||
tagService.delete(tagId);
|
||||
|
||||
verify(tagRepository).delete(testTag);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should throw exception when deleting tag in use")
|
||||
void testDeleteTagInUse() {
|
||||
Story story = new Story();
|
||||
testTag.getStories().add(story);
|
||||
|
||||
when(tagRepository.findById(tagId)).thenReturn(Optional.of(testTag));
|
||||
|
||||
assertThrows(IllegalStateException.class, () -> {
|
||||
tagService.delete(tagId);
|
||||
});
|
||||
|
||||
verify(tagRepository, never()).delete(any());
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Tag Alias Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should add alias to tag")
|
||||
void testAddAlias() {
|
||||
TagAlias alias = new TagAlias();
|
||||
alias.setAliasName("sci-fantasy");
|
||||
alias.setCanonicalTag(testTag);
|
||||
|
||||
when(tagRepository.findById(tagId)).thenReturn(Optional.of(testTag));
|
||||
when(tagAliasRepository.existsByAliasNameIgnoreCase("sci-fantasy")).thenReturn(false);
|
||||
when(tagRepository.existsByNameIgnoreCase("sci-fantasy")).thenReturn(false);
|
||||
when(tagAliasRepository.save(any(TagAlias.class))).thenReturn(alias);
|
||||
|
||||
TagAlias result = tagService.addAlias(tagId, "sci-fantasy");
|
||||
|
||||
assertNotNull(result);
|
||||
assertEquals("sci-fantasy", result.getAliasName());
|
||||
verify(tagAliasRepository).save(any(TagAlias.class));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should throw exception when alias already exists")
|
||||
void testAddDuplicateAlias() {
|
||||
when(tagRepository.findById(tagId)).thenReturn(Optional.of(testTag));
|
||||
when(tagAliasRepository.existsByAliasNameIgnoreCase("sci-fantasy")).thenReturn(true);
|
||||
|
||||
assertThrows(DuplicateResourceException.class, () -> {
|
||||
tagService.addAlias(tagId, "sci-fantasy");
|
||||
});
|
||||
|
||||
verify(tagAliasRepository, never()).save(any());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should throw exception when alias conflicts with tag name")
|
||||
void testAddAliasConflictsWithTagName() {
|
||||
when(tagRepository.findById(tagId)).thenReturn(Optional.of(testTag));
|
||||
when(tagAliasRepository.existsByAliasNameIgnoreCase("sci-fi")).thenReturn(false);
|
||||
when(tagRepository.existsByNameIgnoreCase("sci-fi")).thenReturn(true);
|
||||
|
||||
assertThrows(DuplicateResourceException.class, () -> {
|
||||
tagService.addAlias(tagId, "sci-fi");
|
||||
});
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should remove alias from tag")
|
||||
void testRemoveAlias() {
|
||||
UUID aliasId = UUID.randomUUID();
|
||||
TagAlias alias = new TagAlias();
|
||||
alias.setId(aliasId);
|
||||
alias.setCanonicalTag(testTag);
|
||||
|
||||
when(tagRepository.findById(tagId)).thenReturn(Optional.of(testTag));
|
||||
when(tagAliasRepository.findById(aliasId)).thenReturn(Optional.of(alias));
|
||||
doNothing().when(tagAliasRepository).delete(alias);
|
||||
|
||||
tagService.removeAlias(tagId, aliasId);
|
||||
|
||||
verify(tagAliasRepository).delete(alias);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should throw exception when removing alias from wrong tag")
|
||||
void testRemoveAliasFromWrongTag() {
|
||||
UUID aliasId = UUID.randomUUID();
|
||||
Tag differentTag = new Tag();
|
||||
differentTag.setId(UUID.randomUUID());
|
||||
|
||||
TagAlias alias = new TagAlias();
|
||||
alias.setId(aliasId);
|
||||
alias.setCanonicalTag(differentTag);
|
||||
|
||||
when(tagRepository.findById(tagId)).thenReturn(Optional.of(testTag));
|
||||
when(tagAliasRepository.findById(aliasId)).thenReturn(Optional.of(alias));
|
||||
|
||||
assertThrows(IllegalArgumentException.class, () -> {
|
||||
tagService.removeAlias(tagId, aliasId);
|
||||
});
|
||||
|
||||
verify(tagAliasRepository, never()).delete(any());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should resolve tag by name")
|
||||
void testResolveTagByName() {
|
||||
when(tagRepository.findByNameIgnoreCase("fantasy")).thenReturn(Optional.of(testTag));
|
||||
|
||||
Tag result = tagService.resolveTagByName("fantasy");
|
||||
|
||||
assertNotNull(result);
|
||||
assertEquals("fantasy", result.getName());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should resolve tag by alias")
|
||||
void testResolveTagByAlias() {
|
||||
TagAlias alias = new TagAlias();
|
||||
alias.setAliasName("sci-fantasy");
|
||||
alias.setCanonicalTag(testTag);
|
||||
|
||||
when(tagRepository.findByNameIgnoreCase("sci-fantasy")).thenReturn(Optional.empty());
|
||||
when(tagAliasRepository.findByAliasNameIgnoreCase("sci-fantasy")).thenReturn(Optional.of(alias));
|
||||
|
||||
Tag result = tagService.resolveTagByName("sci-fantasy");
|
||||
|
||||
assertNotNull(result);
|
||||
assertEquals("fantasy", result.getName());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should return null when tag/alias not found")
|
||||
void testResolveTagNotFound() {
|
||||
when(tagRepository.findByNameIgnoreCase(anyString())).thenReturn(Optional.empty());
|
||||
when(tagAliasRepository.findByAliasNameIgnoreCase(anyString())).thenReturn(Optional.empty());
|
||||
|
||||
Tag result = tagService.resolveTagByName("nonexistent");
|
||||
|
||||
assertNull(result);
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Tag Merge Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should merge tags successfully")
|
||||
void testMergeTags() {
|
||||
UUID sourceId = UUID.randomUUID();
|
||||
Tag sourceTag = new Tag();
|
||||
sourceTag.setId(sourceId);
|
||||
sourceTag.setName("sci-fi");
|
||||
|
||||
Story story = new Story();
|
||||
story.setTags(new HashSet<>(Arrays.asList(sourceTag)));
|
||||
sourceTag.setStories(new HashSet<>(Arrays.asList(story)));
|
||||
|
||||
when(tagRepository.findById(tagId)).thenReturn(Optional.of(testTag));
|
||||
when(tagRepository.findById(sourceId)).thenReturn(Optional.of(sourceTag));
|
||||
when(tagAliasRepository.save(any(TagAlias.class))).thenReturn(new TagAlias());
|
||||
when(tagRepository.save(any(Tag.class))).thenReturn(testTag);
|
||||
doNothing().when(tagRepository).delete(sourceTag);
|
||||
|
||||
Tag result = tagService.mergeTags(List.of(sourceId), tagId);
|
||||
|
||||
assertNotNull(result);
|
||||
verify(tagAliasRepository).save(any(TagAlias.class));
|
||||
verify(tagRepository).delete(sourceTag);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should not merge tag with itself")
|
||||
void testMergeTagWithItself() {
|
||||
when(tagRepository.findById(tagId)).thenReturn(Optional.of(testTag));
|
||||
|
||||
assertThrows(IllegalArgumentException.class, () -> {
|
||||
tagService.mergeTags(List.of(tagId), tagId);
|
||||
});
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should throw exception when no valid source tags to merge")
|
||||
void testMergeNoValidSourceTags() {
|
||||
when(tagRepository.findById(tagId)).thenReturn(Optional.of(testTag));
|
||||
|
||||
assertThrows(IllegalArgumentException.class, () -> {
|
||||
tagService.mergeTags(Collections.emptyList(), tagId);
|
||||
});
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Search and Query Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should find all tags")
|
||||
void testFindAll() {
|
||||
when(tagRepository.findAll()).thenReturn(List.of(testTag));
|
||||
|
||||
List<Tag> result = tagService.findAll();
|
||||
|
||||
assertNotNull(result);
|
||||
assertEquals(1, result.size());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should search tags by name")
|
||||
void testSearchByName() {
|
||||
when(tagRepository.findByNameContainingIgnoreCase("fan"))
|
||||
.thenReturn(List.of(testTag));
|
||||
|
||||
List<Tag> result = tagService.searchByName("fan");
|
||||
|
||||
assertNotNull(result);
|
||||
assertEquals(1, result.size());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should find used tags")
|
||||
void testFindUsedTags() {
|
||||
when(tagRepository.findUsedTags()).thenReturn(List.of(testTag));
|
||||
|
||||
List<Tag> result = tagService.findUsedTags();
|
||||
|
||||
assertNotNull(result);
|
||||
assertEquals(1, result.size());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should find most used tags")
|
||||
void testFindMostUsedTags() {
|
||||
when(tagRepository.findMostUsedTags()).thenReturn(List.of(testTag));
|
||||
|
||||
List<Tag> result = tagService.findMostUsedTags();
|
||||
|
||||
assertNotNull(result);
|
||||
assertEquals(1, result.size());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should find unused tags")
|
||||
void testFindUnusedTags() {
|
||||
when(tagRepository.findUnusedTags()).thenReturn(List.of(testTag));
|
||||
|
||||
List<Tag> result = tagService.findUnusedTags();
|
||||
|
||||
assertNotNull(result);
|
||||
assertEquals(1, result.size());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should delete all unused tags")
|
||||
void testDeleteUnusedTags() {
|
||||
when(tagRepository.findUnusedTags()).thenReturn(List.of(testTag));
|
||||
doNothing().when(tagRepository).deleteAll(anyList());
|
||||
|
||||
List<Tag> result = tagService.deleteUnusedTags();
|
||||
|
||||
assertNotNull(result);
|
||||
assertEquals(1, result.size());
|
||||
verify(tagRepository).deleteAll(anyList());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should find or create tag")
|
||||
void testFindOrCreate() {
|
||||
when(tagRepository.findByName("fantasy")).thenReturn(Optional.of(testTag));
|
||||
|
||||
Tag result = tagService.findOrCreate("fantasy");
|
||||
|
||||
assertNotNull(result);
|
||||
assertEquals("fantasy", result.getName());
|
||||
verify(tagRepository, never()).save(any());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should create tag when not found")
|
||||
void testFindOrCreateNew() {
|
||||
when(tagRepository.findByName("new-tag")).thenReturn(Optional.empty());
|
||||
when(tagRepository.existsByName("new-tag")).thenReturn(false);
|
||||
when(tagRepository.save(any(Tag.class))).thenReturn(testTag);
|
||||
|
||||
Tag result = tagService.findOrCreate("new-tag");
|
||||
|
||||
assertNotNull(result);
|
||||
verify(tagRepository).save(any(Tag.class));
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Tag Suggestion Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should suggest tags based on content")
|
||||
void testSuggestTags() {
|
||||
when(tagRepository.findAll()).thenReturn(List.of(testTag));
|
||||
|
||||
var suggestions = tagService.suggestTags(
|
||||
"Fantasy Adventure",
|
||||
"A fantasy story about magic",
|
||||
"Epic fantasy tale",
|
||||
5
|
||||
);
|
||||
|
||||
assertNotNull(suggestions);
|
||||
assertFalse(suggestions.isEmpty());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should return empty suggestions for empty content")
|
||||
void testSuggestTagsEmptyContent() {
|
||||
when(tagRepository.findAll()).thenReturn(List.of(testTag));
|
||||
|
||||
var suggestions = tagService.suggestTags("", "", "", 5);
|
||||
|
||||
assertNotNull(suggestions);
|
||||
assertTrue(suggestions.isEmpty());
|
||||
}
|
||||
|
||||
// ========================================
|
||||
// Statistics Tests
|
||||
// ========================================
|
||||
|
||||
@Test
|
||||
@DisplayName("Should count all tags")
|
||||
void testCountAll() {
|
||||
when(tagRepository.count()).thenReturn(10L);
|
||||
|
||||
long count = tagService.countAll();
|
||||
|
||||
assertEquals(10L, count);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should count used tags")
|
||||
void testCountUsedTags() {
|
||||
when(tagRepository.countUsedTags()).thenReturn(5L);
|
||||
|
||||
long count = tagService.countUsedTags();
|
||||
|
||||
assertEquals(5L, count);
|
||||
}
|
||||
}
|
||||
@@ -18,11 +18,15 @@ storycove:
|
||||
expiration: 86400000
|
||||
auth:
|
||||
password: test-password
|
||||
typesense:
|
||||
enabled: false
|
||||
api-key: test-key
|
||||
search:
|
||||
engine: solr
|
||||
solr:
|
||||
host: localhost
|
||||
port: 8108
|
||||
port: 8983
|
||||
scheme: http
|
||||
cores:
|
||||
stories: storycove_stories
|
||||
authors: storycove_authors
|
||||
images:
|
||||
storage-path: /tmp/test-images
|
||||
|
||||
|
||||
4308
backend/test_results.log
Normal file
4308
backend/test_results.log
Normal file
File diff suppressed because it is too large
Load Diff
@@ -2,3 +2,4 @@
|
||||
# https://curl.se/docs/http-cookies.html
|
||||
# This file was generated by libcurl! Edit at your own risk.
|
||||
|
||||
#HttpOnly_localhost FALSE / FALSE 1758433252 token eyJhbGciOiJIUzUxMiJ9.eyJzdWIiOiJ1c2VyIiwiaWF0IjoxNzU4MzQ2ODUyLCJleHAiOjE3NTg0MzMyNTIsImxpYnJhcnlJZCI6InNlY3JldCJ9.zEAQT5_11-pxPxmIhufSQqE26hvHldde4kFNE2HWWgBa5lT_Wt7jwpoPUMkQGQfShQwDZ9N-hFX3R2ew8jD7WQ
|
||||
|
||||
98
deploy.sh
98
deploy.sh
@@ -1,35 +1,91 @@
|
||||
#!/bin/bash
|
||||
|
||||
# StoryCove Deployment Script
|
||||
# Usage: ./deploy.sh [environment]
|
||||
# Environments: development, staging, production
|
||||
# This script handles deployment with automatic Solr volume cleanup
|
||||
|
||||
set -e
|
||||
|
||||
ENVIRONMENT=${1:-development}
|
||||
ENV_FILE=".env.${ENVIRONMENT}"
|
||||
echo "🚀 Starting StoryCove deployment..."
|
||||
|
||||
echo "Deploying StoryCove for ${ENVIRONMENT} environment..."
|
||||
# Colors for output
|
||||
GREEN='\033[0;32m'
|
||||
YELLOW='\033[1;33m'
|
||||
RED='\033[0;31m'
|
||||
NC='\033[0m' # No Color
|
||||
|
||||
# Check if environment file exists
|
||||
if [ ! -f "$ENV_FILE" ]; then
|
||||
echo "Error: Environment file $ENV_FILE not found."
|
||||
echo "Available environments: development, staging, production"
|
||||
# Check if docker-compose is available
|
||||
if ! command -v docker-compose &> /dev/null; then
|
||||
echo -e "${RED}❌ docker-compose not found. Please install docker-compose first.${NC}"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Copy environment file to .env
|
||||
cp "$ENV_FILE" .env
|
||||
echo "Using environment configuration from $ENV_FILE"
|
||||
|
||||
# Build and start services
|
||||
echo "Building and starting Docker services..."
|
||||
# Stop existing containers
|
||||
echo -e "${YELLOW}📦 Stopping existing containers...${NC}"
|
||||
docker-compose down
|
||||
docker-compose build --no-cache
|
||||
docker-compose up -d
|
||||
|
||||
echo "Deployment complete!"
|
||||
echo "StoryCove is running at: $(grep STORYCOVE_PUBLIC_URL $ENV_FILE | cut -d'=' -f2)"
|
||||
# Remove Solr volume to force recreation with fresh cores
|
||||
echo -e "${YELLOW}🗑️ Removing Solr data volume...${NC}"
|
||||
docker volume rm storycove_solr_data 2>/dev/null || echo "Solr volume doesn't exist yet (first run)"
|
||||
|
||||
# Build and start containers
|
||||
echo -e "${YELLOW}🏗️ Building and starting containers...${NC}"
|
||||
docker-compose up -d --build
|
||||
|
||||
# Wait for services to be healthy
|
||||
echo -e "${YELLOW}⏳ Waiting for services to be healthy...${NC}"
|
||||
sleep 5
|
||||
|
||||
# Check if backend is ready
|
||||
echo -e "${YELLOW}🔍 Checking backend health...${NC}"
|
||||
MAX_RETRIES=30
|
||||
RETRY_COUNT=0
|
||||
while [ $RETRY_COUNT -lt $MAX_RETRIES ]; do
|
||||
if docker-compose exec -T backend curl -f http://localhost:8080/api/health &>/dev/null; then
|
||||
echo -e "${GREEN}✅ Backend is healthy${NC}"
|
||||
break
|
||||
fi
|
||||
RETRY_COUNT=$((RETRY_COUNT+1))
|
||||
echo "Waiting for backend... ($RETRY_COUNT/$MAX_RETRIES)"
|
||||
sleep 2
|
||||
done
|
||||
|
||||
if [ $RETRY_COUNT -eq $MAX_RETRIES ]; then
|
||||
echo -e "${RED}❌ Backend failed to start${NC}"
|
||||
docker-compose logs backend
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Apply database migrations
|
||||
echo -e "${YELLOW}🗄️ Applying database migrations...${NC}"
|
||||
docker-compose run --rm migrations
|
||||
echo -e "${GREEN}✅ Database migrations applied${NC}"
|
||||
|
||||
# Check if Solr is ready
|
||||
echo -e "${YELLOW}🔍 Checking Solr health...${NC}"
|
||||
RETRY_COUNT=0
|
||||
while [ $RETRY_COUNT -lt $MAX_RETRIES ]; do
|
||||
if docker-compose exec -T backend curl -f http://solr:8983/solr/admin/ping &>/dev/null; then
|
||||
echo -e "${GREEN}✅ Solr is healthy${NC}"
|
||||
break
|
||||
fi
|
||||
RETRY_COUNT=$((RETRY_COUNT+1))
|
||||
echo "Waiting for Solr... ($RETRY_COUNT/$MAX_RETRIES)"
|
||||
sleep 2
|
||||
done
|
||||
|
||||
if [ $RETRY_COUNT -eq $MAX_RETRIES ]; then
|
||||
echo -e "${RED}❌ Solr failed to start${NC}"
|
||||
docker-compose logs solr
|
||||
exit 1
|
||||
fi
|
||||
|
||||
echo -e "${GREEN}✅ Deployment complete!${NC}"
|
||||
echo ""
|
||||
echo "To view logs: docker-compose logs -f"
|
||||
echo "To stop: docker-compose down"
|
||||
echo "📊 Service status:"
|
||||
docker-compose ps
|
||||
echo ""
|
||||
echo "🌐 Application is available at http://localhost:6925"
|
||||
echo "🔧 Solr Admin UI is available at http://localhost:8983"
|
||||
echo ""
|
||||
echo "📝 Note: The application will automatically perform bulk reindexing on startup."
|
||||
echo " Check backend logs with: docker-compose logs -f backend"
|
||||
|
||||
@@ -34,26 +34,30 @@ services:
|
||||
- SPRING_DATASOURCE_USERNAME=storycove
|
||||
- SPRING_DATASOURCE_PASSWORD=${DB_PASSWORD}
|
||||
- JWT_SECRET=${JWT_SECRET}
|
||||
- TYPESENSE_API_KEY=${TYPESENSE_API_KEY}
|
||||
- TYPESENSE_HOST=typesense
|
||||
- TYPESENSE_PORT=8108
|
||||
- SOLR_HOST=solr
|
||||
- SOLR_PORT=8983
|
||||
- SOLR_SCHEME=http
|
||||
- SEARCH_ENGINE=${SEARCH_ENGINE:-solr}
|
||||
- IMAGE_STORAGE_PATH=/app/images
|
||||
- APP_PASSWORD=${APP_PASSWORD}
|
||||
- STORYCOVE_CORS_ALLOWED_ORIGINS=${STORYCOVE_CORS_ALLOWED_ORIGINS:-http://localhost:3000,http://localhost:6925}
|
||||
volumes:
|
||||
- images_data:/app/images
|
||||
- library_config:/app/config
|
||||
- automatic_backups:/app/automatic-backups
|
||||
depends_on:
|
||||
- postgres
|
||||
- typesense
|
||||
postgres:
|
||||
condition: service_healthy
|
||||
solr:
|
||||
condition: service_started
|
||||
networks:
|
||||
- storycove-network
|
||||
|
||||
postgres:
|
||||
image: postgres:15-alpine
|
||||
# No port mapping - only accessible within the Docker network
|
||||
ports:
|
||||
- "5432:5432"
|
||||
#ports:
|
||||
# - "5432:5432"
|
||||
environment:
|
||||
- POSTGRES_DB=storycove
|
||||
- POSTGRES_USER=storycove
|
||||
@@ -62,23 +66,48 @@ services:
|
||||
- postgres_data:/var/lib/postgresql/data
|
||||
networks:
|
||||
- storycove-network
|
||||
healthcheck:
|
||||
test: ["CMD-SHELL", "pg_isready -U storycove -d storycove"]
|
||||
interval: 5s
|
||||
timeout: 5s
|
||||
retries: 5
|
||||
|
||||
typesense:
|
||||
image: typesense/typesense:29.0
|
||||
# No port mapping - only accessible within the Docker network
|
||||
|
||||
solr:
|
||||
build:
|
||||
context: .
|
||||
dockerfile: solr.Dockerfile
|
||||
ports:
|
||||
- "8983:8983" # Expose Solr Admin UI for development
|
||||
environment:
|
||||
- TYPESENSE_API_KEY=${TYPESENSE_API_KEY}
|
||||
- TYPESENSE_DATA_DIR=/data
|
||||
- SOLR_HEAP=512m
|
||||
- SOLR_JAVA_MEM=-Xms256m -Xmx512m
|
||||
volumes:
|
||||
- typesense_data:/data
|
||||
- solr_data:/var/solr
|
||||
deploy:
|
||||
resources:
|
||||
limits:
|
||||
memory: 1G
|
||||
reservations:
|
||||
memory: 512M
|
||||
stop_grace_period: 30s
|
||||
healthcheck:
|
||||
test: ["CMD-SHELL", "curl -f http://localhost:8983/solr/admin/ping || exit 1"]
|
||||
interval: 30s
|
||||
timeout: 10s
|
||||
retries: 5
|
||||
start_period: 60s
|
||||
networks:
|
||||
- storycove-network
|
||||
restart: unless-stopped
|
||||
|
||||
|
||||
volumes:
|
||||
postgres_data:
|
||||
typesense_data:
|
||||
solr_data:
|
||||
images_data:
|
||||
library_config:
|
||||
automatic_backups:
|
||||
|
||||
configs:
|
||||
nginx_config:
|
||||
@@ -95,7 +124,7 @@ configs:
|
||||
}
|
||||
server {
|
||||
listen 80;
|
||||
client_max_body_size 256M;
|
||||
client_max_body_size 600M;
|
||||
location / {
|
||||
proxy_pass http://frontend;
|
||||
proxy_http_version 1.1;
|
||||
@@ -113,22 +142,18 @@ configs:
|
||||
proxy_set_header X-Real-IP $$remote_addr;
|
||||
proxy_set_header X-Forwarded-For $$proxy_add_x_forwarded_for;
|
||||
proxy_set_header X-Forwarded-Proto $$scheme;
|
||||
proxy_connect_timeout 60s;
|
||||
proxy_send_timeout 60s;
|
||||
proxy_read_timeout 60s;
|
||||
proxy_connect_timeout 900s;
|
||||
proxy_send_timeout 900s;
|
||||
proxy_read_timeout 900s;
|
||||
# Large upload settings
|
||||
client_max_body_size 600M;
|
||||
proxy_request_buffering off;
|
||||
proxy_max_temp_file_size 0;
|
||||
}
|
||||
location /images/ {
|
||||
alias /app/images/;
|
||||
expires 1y;
|
||||
add_header Cache-Control public;
|
||||
}
|
||||
location /typesense/ {
|
||||
proxy_pass http://typesense:8108/;
|
||||
proxy_set_header Host $$host;
|
||||
proxy_set_header X-Real-IP $$remote_addr;
|
||||
proxy_set_header X-Forwarded-For $$proxy_add_x_forwarded_for;
|
||||
proxy_set_header X-Forwarded-Proto $$scheme;
|
||||
proxy_set_header X-Typesense-API-Key $$http_x_typesense_api_key;
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -9,7 +9,7 @@ RUN apk add --no-cache dumb-init
|
||||
COPY package*.json ./
|
||||
|
||||
# Install dependencies with optimized settings
|
||||
RUN npm ci --prefer-offline --no-audit --frozen-lockfile
|
||||
RUN npm install --prefer-offline --no-audit --legacy-peer-deps
|
||||
|
||||
# Build stage
|
||||
FROM node:18-alpine AS builder
|
||||
@@ -20,12 +20,23 @@ COPY --from=deps /app/node_modules ./node_modules
|
||||
COPY . .
|
||||
|
||||
# Set Node.js memory limit for build
|
||||
ENV NODE_OPTIONS="--max-old-space-size=1024"
|
||||
ENV NODE_OPTIONS="--max-old-space-size=2048"
|
||||
ENV NEXT_TELEMETRY_DISABLED=1
|
||||
|
||||
# Build the application
|
||||
# List files to ensure everything is copied correctly
|
||||
RUN ls -la
|
||||
|
||||
# Force clean build - remove any cached build artifacts
|
||||
RUN rm -rf .next || true
|
||||
|
||||
# Build the application with verbose logging
|
||||
RUN npm run build
|
||||
|
||||
# Verify the build output exists
|
||||
RUN ls -la .next/ || (echo ".next directory not found!" && exit 1)
|
||||
RUN ls -la .next/standalone/ || (echo ".next/standalone directory not found!" && cat build.log && exit 1)
|
||||
RUN ls -la .next/static/ || (echo ".next/static directory not found!" && exit 1)
|
||||
|
||||
# Production stage
|
||||
FROM node:18-alpine AS runner
|
||||
WORKDIR /app
|
||||
|
||||
2
frontend/next-env.d.ts
vendored
2
frontend/next-env.d.ts
vendored
@@ -2,4 +2,4 @@
|
||||
/// <reference types="next/image-types/global" />
|
||||
|
||||
// NOTE: This file should not be edited
|
||||
// see https://nextjs.org/docs/basic-features/typescript for more information.
|
||||
// see https://nextjs.org/docs/app/building-your-application/configuring/typescript for more information.
|
||||
|
||||
@@ -2,6 +2,7 @@
|
||||
const nextConfig = {
|
||||
// Enable standalone output for optimized Docker builds
|
||||
output: 'standalone',
|
||||
// Note: Body size limits are handled by nginx and backend, not Next.js frontend
|
||||
// Removed Next.js rewrites since nginx handles all API routing
|
||||
webpack: (config, { isServer }) => {
|
||||
// Exclude cheerio and its dependencies from client-side bundling
|
||||
|
||||
613
frontend/package-lock.json
generated
613
frontend/package-lock.json
generated
File diff suppressed because it is too large
Load Diff
@@ -10,18 +10,23 @@
|
||||
"type-check": "tsc --noEmit"
|
||||
},
|
||||
"dependencies": {
|
||||
"@heroicons/react": "^2.2.0",
|
||||
"autoprefixer": "^10.4.16",
|
||||
"axios": "^1.11.0",
|
||||
"cheerio": "^1.0.0-rc.12",
|
||||
"dompurify": "^3.2.6",
|
||||
"next": "14.0.0",
|
||||
"postcss": "^8.4.31",
|
||||
"react": "^18",
|
||||
"react-dom": "^18",
|
||||
"react-dropzone": "^14.2.3",
|
||||
"server-only": "^0.0.1",
|
||||
"tailwindcss": "^3.3.0"
|
||||
"@heroicons/react": "^2.2.0",
|
||||
"autoprefixer": "^10.4.16",
|
||||
"axios": "^1.7.7",
|
||||
"cheerio": "^1.0.0-rc.12",
|
||||
"dompurify": "^3.2.6",
|
||||
"next": "^14.2.32",
|
||||
"postcss": "^8.4.31",
|
||||
"react": "^18",
|
||||
"react-dom": "^18",
|
||||
"react-dropzone": "^14.2.3",
|
||||
"rxjs": "^7.8.1",
|
||||
"server-only": "^0.0.1",
|
||||
"slate": "^0.118.1",
|
||||
"slate-react": "^0.117.4",
|
||||
"slate-history": "^0.113.1",
|
||||
"slate-dom": "^0.117.0",
|
||||
"tailwindcss": "^3.3.0"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@types/dompurify": "^3.0.5",
|
||||
|
||||
37
frontend/package.json.with-portabletext
Normal file
37
frontend/package.json.with-portabletext
Normal file
@@ -0,0 +1,37 @@
|
||||
{
|
||||
"name": "storycove-frontend",
|
||||
"version": "0.1.0",
|
||||
"private": true,
|
||||
"scripts": {
|
||||
"dev": "next dev",
|
||||
"build": "next build",
|
||||
"start": "next start",
|
||||
"lint": "next lint",
|
||||
"type-check": "tsc --noEmit"
|
||||
},
|
||||
"dependencies": {
|
||||
"@heroicons/react": "^2.2.0",
|
||||
"@portabletext/react": "4.0.3",
|
||||
"@portabletext/types": "2.0.14",
|
||||
"autoprefixer": "^10.4.16",
|
||||
"axios": "^1.11.0",
|
||||
"cheerio": "^1.0.0-rc.12",
|
||||
"dompurify": "^3.2.6",
|
||||
"next": "14.0.0",
|
||||
"postcss": "^8.4.31",
|
||||
"react": "^18",
|
||||
"react-dom": "^18",
|
||||
"react-dropzone": "^14.2.3",
|
||||
"server-only": "^0.0.1",
|
||||
"tailwindcss": "^3.3.0"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@types/dompurify": "^3.0.5",
|
||||
"@types/node": "^20",
|
||||
"@types/react": "^18",
|
||||
"@types/react-dom": "^18",
|
||||
"eslint": "^8",
|
||||
"eslint-config-next": "14.0.0",
|
||||
"typescript": "^5"
|
||||
}
|
||||
}
|
||||
550
frontend/src/app/add-story/AddStoryContent.tsx
Normal file
550
frontend/src/app/add-story/AddStoryContent.tsx
Normal file
@@ -0,0 +1,550 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useEffect } from 'react';
|
||||
import { useRouter, useSearchParams } from 'next/navigation';
|
||||
import { useAuth } from '../../contexts/AuthContext';
|
||||
import { Input, Textarea } from '../../components/ui/Input';
|
||||
import Button from '../../components/ui/Button';
|
||||
import TagInput from '../../components/stories/TagInput';
|
||||
import SlateEditor from '../../components/stories/SlateEditor';
|
||||
import ImageUpload from '../../components/ui/ImageUpload';
|
||||
import AuthorSelector from '../../components/stories/AuthorSelector';
|
||||
import SeriesSelector from '../../components/stories/SeriesSelector';
|
||||
import { storyApi, authorApi } from '../../lib/api';
|
||||
|
||||
export default function AddStoryContent() {
|
||||
const [formData, setFormData] = useState({
|
||||
title: '',
|
||||
summary: '',
|
||||
authorName: '',
|
||||
authorId: undefined as string | undefined,
|
||||
contentHtml: '',
|
||||
sourceUrl: '',
|
||||
tags: [] as string[],
|
||||
seriesName: '',
|
||||
seriesId: undefined as string | undefined,
|
||||
volume: '',
|
||||
});
|
||||
|
||||
const [coverImage, setCoverImage] = useState<File | null>(null);
|
||||
const [loading, setLoading] = useState(false);
|
||||
const [processingImages, setProcessingImages] = useState(false);
|
||||
const [errors, setErrors] = useState<Record<string, string>>({});
|
||||
const [duplicateWarning, setDuplicateWarning] = useState<{
|
||||
show: boolean;
|
||||
count: number;
|
||||
duplicates: Array<{
|
||||
id: string;
|
||||
title: string;
|
||||
authorName: string;
|
||||
createdAt: string;
|
||||
}>;
|
||||
}>({ show: false, count: 0, duplicates: [] });
|
||||
const [checkingDuplicates, setCheckingDuplicates] = useState(false);
|
||||
|
||||
const router = useRouter();
|
||||
const searchParams = useSearchParams();
|
||||
const { isAuthenticated } = useAuth();
|
||||
|
||||
// Handle URL parameters
|
||||
useEffect(() => {
|
||||
const authorId = searchParams.get('authorId');
|
||||
const from = searchParams.get('from');
|
||||
|
||||
// Pre-fill author if authorId is provided in URL
|
||||
if (authorId) {
|
||||
const loadAuthor = async () => {
|
||||
try {
|
||||
const author = await authorApi.getAuthor(authorId);
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
authorName: author.name,
|
||||
authorId: author.id
|
||||
}));
|
||||
} catch (error) {
|
||||
console.error('Failed to load author:', error);
|
||||
}
|
||||
};
|
||||
loadAuthor();
|
||||
}
|
||||
|
||||
// Handle URL import data
|
||||
if (from === 'url-import') {
|
||||
const title = searchParams.get('title') || '';
|
||||
const summary = searchParams.get('summary') || '';
|
||||
const author = searchParams.get('author') || '';
|
||||
const sourceUrl = searchParams.get('sourceUrl') || '';
|
||||
const tagsParam = searchParams.get('tags');
|
||||
const content = searchParams.get('content') || '';
|
||||
|
||||
let tags: string[] = [];
|
||||
try {
|
||||
tags = tagsParam ? JSON.parse(tagsParam) : [];
|
||||
} catch (error) {
|
||||
console.error('Failed to parse tags:', error);
|
||||
tags = [];
|
||||
}
|
||||
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
title,
|
||||
summary,
|
||||
authorName: author,
|
||||
authorId: undefined, // Reset author ID when importing from URL
|
||||
contentHtml: content,
|
||||
sourceUrl,
|
||||
tags
|
||||
}));
|
||||
|
||||
// Show success message
|
||||
setErrors({ success: 'Story data imported successfully! Review and edit as needed before saving.' });
|
||||
}
|
||||
}, [searchParams]);
|
||||
|
||||
// Load pending story data from bulk combine operation
|
||||
useEffect(() => {
|
||||
const fromBulkCombine = searchParams.get('from') === 'bulk-combine';
|
||||
if (fromBulkCombine) {
|
||||
const pendingStoryData = localStorage.getItem('pendingStory');
|
||||
if (pendingStoryData) {
|
||||
try {
|
||||
const storyData = JSON.parse(pendingStoryData);
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
title: storyData.title || '',
|
||||
authorName: storyData.author || '',
|
||||
authorId: undefined, // Reset author ID for bulk combined stories
|
||||
contentHtml: storyData.content || '',
|
||||
sourceUrl: storyData.sourceUrl || '',
|
||||
summary: storyData.summary || '',
|
||||
tags: storyData.tags || []
|
||||
}));
|
||||
// Clear the pending data
|
||||
localStorage.removeItem('pendingStory');
|
||||
} catch (error) {
|
||||
console.error('Failed to load pending story data:', error);
|
||||
}
|
||||
}
|
||||
}
|
||||
}, [searchParams]);
|
||||
|
||||
// Check for duplicates when title and author are both present
|
||||
useEffect(() => {
|
||||
const checkDuplicates = async () => {
|
||||
const title = formData.title.trim();
|
||||
const authorName = formData.authorName.trim();
|
||||
|
||||
// Don't check if user isn't authenticated or if title/author are empty
|
||||
if (!isAuthenticated || !title || !authorName) {
|
||||
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
|
||||
return;
|
||||
}
|
||||
|
||||
// Debounce the check to avoid too many API calls
|
||||
const timeoutId = setTimeout(async () => {
|
||||
try {
|
||||
setCheckingDuplicates(true);
|
||||
const result = await storyApi.checkDuplicate(title, authorName);
|
||||
|
||||
if (result.hasDuplicates) {
|
||||
setDuplicateWarning({
|
||||
show: true,
|
||||
count: result.count,
|
||||
duplicates: result.duplicates
|
||||
});
|
||||
} else {
|
||||
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Failed to check for duplicates:', error);
|
||||
// Clear any existing duplicate warnings on error
|
||||
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
|
||||
// Don't show error to user as this is just a helpful warning
|
||||
// Authentication errors will be handled by the API interceptor
|
||||
} finally {
|
||||
setCheckingDuplicates(false);
|
||||
}
|
||||
}, 500); // 500ms debounce
|
||||
|
||||
return () => clearTimeout(timeoutId);
|
||||
};
|
||||
|
||||
checkDuplicates();
|
||||
}, [formData.title, formData.authorName, isAuthenticated]);
|
||||
|
||||
const handleInputChange = (field: string) => (
|
||||
e: React.ChangeEvent<HTMLInputElement | HTMLTextAreaElement>
|
||||
) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
[field]: e.target.value
|
||||
}));
|
||||
|
||||
// Clear error when user starts typing
|
||||
if (errors[field]) {
|
||||
setErrors(prev => ({ ...prev, [field]: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const handleContentChange = (html: string) => {
|
||||
setFormData(prev => ({ ...prev, contentHtml: html }));
|
||||
if (errors.contentHtml) {
|
||||
setErrors(prev => ({ ...prev, contentHtml: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const handleTagsChange = (tags: string[]) => {
|
||||
setFormData(prev => ({ ...prev, tags }));
|
||||
};
|
||||
|
||||
const handleAuthorChange = (authorName: string, authorId?: string) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
authorName,
|
||||
authorId: authorId // This will be undefined if creating new author, which clears the existing ID
|
||||
}));
|
||||
|
||||
// Clear error when user changes author
|
||||
if (errors.authorName) {
|
||||
setErrors(prev => ({ ...prev, authorName: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const handleSeriesChange = (seriesName: string, seriesId?: string) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
seriesName,
|
||||
seriesId: seriesId // This will be undefined if creating new series, which clears the existing ID
|
||||
}));
|
||||
|
||||
// Clear error when user changes series
|
||||
if (errors.seriesName) {
|
||||
setErrors(prev => ({ ...prev, seriesName: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const validateForm = () => {
|
||||
const newErrors: Record<string, string> = {};
|
||||
|
||||
if (!formData.title.trim()) {
|
||||
newErrors.title = 'Title is required';
|
||||
}
|
||||
|
||||
if (!formData.authorName.trim()) {
|
||||
newErrors.authorName = 'Author name is required';
|
||||
}
|
||||
|
||||
if (!formData.contentHtml.trim()) {
|
||||
newErrors.contentHtml = 'Story content is required';
|
||||
}
|
||||
|
||||
if (formData.seriesName && !formData.volume) {
|
||||
newErrors.volume = 'Volume number is required when series is specified';
|
||||
}
|
||||
|
||||
if (formData.volume && !formData.seriesName.trim()) {
|
||||
newErrors.seriesName = 'Series name is required when volume is specified';
|
||||
}
|
||||
|
||||
setErrors(newErrors);
|
||||
return Object.keys(newErrors).length === 0;
|
||||
};
|
||||
|
||||
// Helper function to detect external images in HTML content
|
||||
const hasExternalImages = (htmlContent: string): boolean => {
|
||||
if (!htmlContent) return false;
|
||||
|
||||
// Create a temporary DOM element to parse HTML
|
||||
const tempDiv = document.createElement('div');
|
||||
tempDiv.innerHTML = htmlContent;
|
||||
|
||||
const images = tempDiv.querySelectorAll('img');
|
||||
for (let i = 0; i < images.length; i++) {
|
||||
const img = images[i];
|
||||
const src = img.getAttribute('src');
|
||||
if (src && (src.startsWith('http://') || src.startsWith('https://'))) {
|
||||
return true;
|
||||
}
|
||||
}
|
||||
return false;
|
||||
};
|
||||
|
||||
const handleSubmit = async (e: React.FormEvent) => {
|
||||
e.preventDefault();
|
||||
|
||||
if (!validateForm()) {
|
||||
return;
|
||||
}
|
||||
|
||||
setLoading(true);
|
||||
|
||||
try {
|
||||
// First, create the story with JSON data
|
||||
const storyData = {
|
||||
title: formData.title,
|
||||
summary: formData.summary || undefined,
|
||||
contentHtml: formData.contentHtml,
|
||||
sourceUrl: formData.sourceUrl || undefined,
|
||||
volume: formData.seriesName ? parseInt(formData.volume) : undefined,
|
||||
// Send seriesId if we have it (existing series), otherwise send seriesName (new series)
|
||||
...(formData.seriesId ? { seriesId: formData.seriesId } : { seriesName: formData.seriesName || undefined }),
|
||||
// Send authorId if we have it (existing author), otherwise send authorName (new author)
|
||||
...(formData.authorId ? { authorId: formData.authorId } : { authorName: formData.authorName }),
|
||||
tagNames: formData.tags.length > 0 ? formData.tags : undefined,
|
||||
};
|
||||
|
||||
const story = await storyApi.createStory(storyData);
|
||||
|
||||
// Process images if there are external images in the content
|
||||
if (hasExternalImages(formData.contentHtml)) {
|
||||
try {
|
||||
setProcessingImages(true);
|
||||
const imageResult = await storyApi.processContentImages(story.id, formData.contentHtml);
|
||||
|
||||
// If images were processed and content was updated, save the updated content
|
||||
if (imageResult.processedContent !== formData.contentHtml) {
|
||||
await storyApi.updateStory(story.id, {
|
||||
title: formData.title,
|
||||
summary: formData.summary || undefined,
|
||||
contentHtml: imageResult.processedContent,
|
||||
sourceUrl: formData.sourceUrl || undefined,
|
||||
volume: formData.seriesName ? parseInt(formData.volume) : undefined,
|
||||
...(formData.seriesId ? { seriesId: formData.seriesId } : { seriesName: formData.seriesName || undefined }),
|
||||
...(formData.authorId ? { authorId: formData.authorId } : { authorName: formData.authorName }),
|
||||
tagNames: formData.tags.length > 0 ? formData.tags : undefined,
|
||||
});
|
||||
|
||||
// Show success message with image processing info
|
||||
if (imageResult.downloadedImages.length > 0) {
|
||||
console.log(`Successfully processed ${imageResult.downloadedImages.length} images`);
|
||||
}
|
||||
if (imageResult.warnings && imageResult.warnings.length > 0) {
|
||||
console.warn('Image processing warnings:', imageResult.warnings);
|
||||
}
|
||||
}
|
||||
} catch (imageError) {
|
||||
console.error('Failed to process images:', imageError);
|
||||
// Don't fail the entire operation if image processing fails
|
||||
// The story was created successfully, just without processed images
|
||||
} finally {
|
||||
setProcessingImages(false);
|
||||
}
|
||||
}
|
||||
|
||||
// If there's a cover image, upload it separately
|
||||
if (coverImage) {
|
||||
await storyApi.uploadCover(story.id, coverImage);
|
||||
}
|
||||
|
||||
router.push(`/stories/${story.id}/detail`);
|
||||
} catch (error: any) {
|
||||
console.error('Failed to create story:', error);
|
||||
const errorMessage = error.response?.data?.message || 'Failed to create story';
|
||||
setErrors({ submit: errorMessage });
|
||||
} finally {
|
||||
setLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
return (
|
||||
<>
|
||||
{/* Success Message */}
|
||||
{errors.success && (
|
||||
<div className="p-4 bg-green-50 dark:bg-green-900/20 border border-green-200 dark:border-green-800 rounded-lg mb-6">
|
||||
<p className="text-green-800 dark:text-green-200">{errors.success}</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
<form onSubmit={handleSubmit} className="space-y-6">
|
||||
{/* Title */}
|
||||
<Input
|
||||
label="Title *"
|
||||
value={formData.title}
|
||||
onChange={handleInputChange('title')}
|
||||
placeholder="Enter the story title"
|
||||
error={errors.title}
|
||||
required
|
||||
/>
|
||||
|
||||
{/* Author Selector */}
|
||||
<AuthorSelector
|
||||
label="Author *"
|
||||
value={formData.authorName}
|
||||
onChange={handleAuthorChange}
|
||||
placeholder="Select or enter author name"
|
||||
error={errors.authorName}
|
||||
required
|
||||
/>
|
||||
|
||||
{/* Duplicate Warning */}
|
||||
{duplicateWarning.show && (
|
||||
<div className="p-4 bg-yellow-50 dark:bg-yellow-900/20 border border-yellow-200 dark:border-yellow-800 rounded-lg">
|
||||
<div className="flex items-start gap-3">
|
||||
<div className="text-yellow-600 dark:text-yellow-400 mt-0.5">
|
||||
⚠️
|
||||
</div>
|
||||
<div>
|
||||
<h4 className="font-medium text-yellow-800 dark:text-yellow-200">
|
||||
Potential Duplicate Detected
|
||||
</h4>
|
||||
<p className="text-sm text-yellow-700 dark:text-yellow-300 mt-1">
|
||||
Found {duplicateWarning.count} existing {duplicateWarning.count === 1 ? 'story' : 'stories'} with the same title and author:
|
||||
</p>
|
||||
<ul className="mt-2 space-y-1">
|
||||
{duplicateWarning.duplicates.map((duplicate, index) => (
|
||||
<li key={duplicate.id} className="text-sm text-yellow-700 dark:text-yellow-300">
|
||||
• <span className="font-medium">{duplicate.title}</span> by {duplicate.authorName}
|
||||
<span className="text-xs ml-2">
|
||||
(added {new Date(duplicate.createdAt).toLocaleDateString()})
|
||||
</span>
|
||||
</li>
|
||||
))}
|
||||
</ul>
|
||||
<p className="text-xs text-yellow-600 dark:text-yellow-400 mt-2">
|
||||
You can still create this story if it's different from the existing ones.
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Checking indicator */}
|
||||
{checkingDuplicates && (
|
||||
<div className="flex items-center gap-2 text-sm theme-text">
|
||||
<div className="animate-spin w-4 h-4 border-2 border-theme-accent border-t-transparent rounded-full"></div>
|
||||
Checking for duplicates...
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Summary */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Summary
|
||||
</label>
|
||||
<Textarea
|
||||
value={formData.summary}
|
||||
onChange={handleInputChange('summary')}
|
||||
placeholder="Brief summary or description of the story..."
|
||||
rows={3}
|
||||
/>
|
||||
<p className="text-sm theme-text mt-1">
|
||||
Optional summary that will be displayed on the story detail page
|
||||
</p>
|
||||
</div>
|
||||
|
||||
{/* Cover Image Upload */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Cover Image
|
||||
</label>
|
||||
<ImageUpload
|
||||
onImageSelect={setCoverImage}
|
||||
accept="image/jpeg,image/png"
|
||||
maxSizeMB={5}
|
||||
aspectRatio="3:4"
|
||||
placeholder="Drop a cover image here or click to select"
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Content */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Story Content *
|
||||
</label>
|
||||
<SlateEditor
|
||||
value={formData.contentHtml}
|
||||
onChange={handleContentChange}
|
||||
placeholder="Write or paste your story content here..."
|
||||
error={errors.contentHtml}
|
||||
enableImageProcessing={false}
|
||||
/>
|
||||
<p className="text-sm theme-text mt-2">
|
||||
💡 <strong>Tip:</strong> If you paste content with images, they'll be automatically downloaded and stored locally when you save the story.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
{/* Tags */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Tags
|
||||
</label>
|
||||
<TagInput
|
||||
tags={formData.tags}
|
||||
onChange={handleTagsChange}
|
||||
placeholder="Add tags to categorize your story..."
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Series and Volume */}
|
||||
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
|
||||
<SeriesSelector
|
||||
label="Series (optional)"
|
||||
value={formData.seriesName}
|
||||
onChange={handleSeriesChange}
|
||||
placeholder="Select or enter series name if part of a series"
|
||||
error={errors.seriesName}
|
||||
authorId={formData.authorId}
|
||||
/>
|
||||
|
||||
<Input
|
||||
label="Volume/Part (optional)"
|
||||
type="number"
|
||||
min="1"
|
||||
value={formData.volume}
|
||||
onChange={handleInputChange('volume')}
|
||||
placeholder="Enter volume/part number"
|
||||
error={errors.volume}
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Source URL */}
|
||||
<Input
|
||||
label="Source URL (optional)"
|
||||
type="url"
|
||||
value={formData.sourceUrl}
|
||||
onChange={handleInputChange('sourceUrl')}
|
||||
placeholder="https://example.com/original-story-url"
|
||||
/>
|
||||
|
||||
{/* Image Processing Indicator */}
|
||||
{processingImages && (
|
||||
<div className="p-4 bg-blue-50 dark:bg-blue-900/20 border border-blue-200 dark:border-blue-800 rounded-lg">
|
||||
<div className="flex items-center gap-3">
|
||||
<div className="animate-spin w-4 h-4 border-2 border-blue-500 border-t-transparent rounded-full"></div>
|
||||
<p className="text-blue-800 dark:text-blue-200">
|
||||
Processing and downloading images...
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Submit Error */}
|
||||
{errors.submit && (
|
||||
<div className="p-4 bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-lg">
|
||||
<p className="text-red-800 dark:text-red-200">{errors.submit}</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Actions */}
|
||||
<div className="flex justify-end gap-4 pt-6">
|
||||
<Button
|
||||
type="button"
|
||||
variant="ghost"
|
||||
onClick={() => router.back()}
|
||||
disabled={loading}
|
||||
>
|
||||
Cancel
|
||||
</Button>
|
||||
|
||||
<Button
|
||||
type="submit"
|
||||
loading={loading}
|
||||
disabled={!formData.title || !formData.authorName || !formData.contentHtml}
|
||||
>
|
||||
{processingImages ? 'Processing Images...' : 'Add Story'}
|
||||
</Button>
|
||||
</div>
|
||||
</form>
|
||||
</>
|
||||
);
|
||||
}
|
||||
@@ -1,554 +1,23 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useEffect } from 'react';
|
||||
import { useRouter, useSearchParams } from 'next/navigation';
|
||||
import { useAuth } from '../../contexts/AuthContext';
|
||||
import { Suspense } from 'react';
|
||||
import ImportLayout from '../../components/layout/ImportLayout';
|
||||
import { Input, Textarea } from '../../components/ui/Input';
|
||||
import Button from '../../components/ui/Button';
|
||||
import TagInput from '../../components/stories/TagInput';
|
||||
import RichTextEditor from '../../components/stories/RichTextEditor';
|
||||
import ImageUpload from '../../components/ui/ImageUpload';
|
||||
import AuthorSelector from '../../components/stories/AuthorSelector';
|
||||
import SeriesSelector from '../../components/stories/SeriesSelector';
|
||||
import { storyApi, authorApi } from '../../lib/api';
|
||||
import LoadingSpinner from '../../components/ui/LoadingSpinner';
|
||||
import AddStoryContent from './AddStoryContent';
|
||||
|
||||
export default function AddStoryPage() {
|
||||
const [formData, setFormData] = useState({
|
||||
title: '',
|
||||
summary: '',
|
||||
authorName: '',
|
||||
authorId: undefined as string | undefined,
|
||||
contentHtml: '',
|
||||
sourceUrl: '',
|
||||
tags: [] as string[],
|
||||
seriesName: '',
|
||||
seriesId: undefined as string | undefined,
|
||||
volume: '',
|
||||
});
|
||||
|
||||
const [coverImage, setCoverImage] = useState<File | null>(null);
|
||||
const [loading, setLoading] = useState(false);
|
||||
const [processingImages, setProcessingImages] = useState(false);
|
||||
const [errors, setErrors] = useState<Record<string, string>>({});
|
||||
const [duplicateWarning, setDuplicateWarning] = useState<{
|
||||
show: boolean;
|
||||
count: number;
|
||||
duplicates: Array<{
|
||||
id: string;
|
||||
title: string;
|
||||
authorName: string;
|
||||
createdAt: string;
|
||||
}>;
|
||||
}>({ show: false, count: 0, duplicates: [] });
|
||||
const [checkingDuplicates, setCheckingDuplicates] = useState(false);
|
||||
|
||||
const router = useRouter();
|
||||
const searchParams = useSearchParams();
|
||||
const { isAuthenticated } = useAuth();
|
||||
|
||||
// Handle URL parameters
|
||||
useEffect(() => {
|
||||
const authorId = searchParams.get('authorId');
|
||||
const from = searchParams.get('from');
|
||||
|
||||
// Pre-fill author if authorId is provided in URL
|
||||
if (authorId) {
|
||||
const loadAuthor = async () => {
|
||||
try {
|
||||
const author = await authorApi.getAuthor(authorId);
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
authorName: author.name,
|
||||
authorId: author.id
|
||||
}));
|
||||
} catch (error) {
|
||||
console.error('Failed to load author:', error);
|
||||
}
|
||||
};
|
||||
loadAuthor();
|
||||
}
|
||||
|
||||
// Handle URL import data
|
||||
if (from === 'url-import') {
|
||||
const title = searchParams.get('title') || '';
|
||||
const summary = searchParams.get('summary') || '';
|
||||
const author = searchParams.get('author') || '';
|
||||
const sourceUrl = searchParams.get('sourceUrl') || '';
|
||||
const tagsParam = searchParams.get('tags');
|
||||
const content = searchParams.get('content') || '';
|
||||
|
||||
let tags: string[] = [];
|
||||
try {
|
||||
tags = tagsParam ? JSON.parse(tagsParam) : [];
|
||||
} catch (error) {
|
||||
console.error('Failed to parse tags:', error);
|
||||
tags = [];
|
||||
}
|
||||
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
title,
|
||||
summary,
|
||||
authorName: author,
|
||||
authorId: undefined, // Reset author ID when importing from URL
|
||||
contentHtml: content,
|
||||
sourceUrl,
|
||||
tags
|
||||
}));
|
||||
|
||||
// Show success message
|
||||
setErrors({ success: 'Story data imported successfully! Review and edit as needed before saving.' });
|
||||
}
|
||||
}, [searchParams]);
|
||||
|
||||
// Load pending story data from bulk combine operation
|
||||
useEffect(() => {
|
||||
const fromBulkCombine = searchParams.get('from') === 'bulk-combine';
|
||||
if (fromBulkCombine) {
|
||||
const pendingStoryData = localStorage.getItem('pendingStory');
|
||||
if (pendingStoryData) {
|
||||
try {
|
||||
const storyData = JSON.parse(pendingStoryData);
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
title: storyData.title || '',
|
||||
authorName: storyData.author || '',
|
||||
authorId: undefined, // Reset author ID for bulk combined stories
|
||||
contentHtml: storyData.content || '',
|
||||
sourceUrl: storyData.sourceUrl || '',
|
||||
summary: storyData.summary || '',
|
||||
tags: storyData.tags || []
|
||||
}));
|
||||
// Clear the pending data
|
||||
localStorage.removeItem('pendingStory');
|
||||
} catch (error) {
|
||||
console.error('Failed to load pending story data:', error);
|
||||
}
|
||||
}
|
||||
}
|
||||
}, [searchParams]);
|
||||
|
||||
// Check for duplicates when title and author are both present
|
||||
useEffect(() => {
|
||||
const checkDuplicates = async () => {
|
||||
const title = formData.title.trim();
|
||||
const authorName = formData.authorName.trim();
|
||||
|
||||
// Don't check if user isn't authenticated or if title/author are empty
|
||||
if (!isAuthenticated || !title || !authorName) {
|
||||
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
|
||||
return;
|
||||
}
|
||||
|
||||
// Debounce the check to avoid too many API calls
|
||||
const timeoutId = setTimeout(async () => {
|
||||
try {
|
||||
setCheckingDuplicates(true);
|
||||
const result = await storyApi.checkDuplicate(title, authorName);
|
||||
|
||||
if (result.hasDuplicates) {
|
||||
setDuplicateWarning({
|
||||
show: true,
|
||||
count: result.count,
|
||||
duplicates: result.duplicates
|
||||
});
|
||||
} else {
|
||||
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Failed to check for duplicates:', error);
|
||||
// Clear any existing duplicate warnings on error
|
||||
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
|
||||
// Don't show error to user as this is just a helpful warning
|
||||
// Authentication errors will be handled by the API interceptor
|
||||
} finally {
|
||||
setCheckingDuplicates(false);
|
||||
}
|
||||
}, 500); // 500ms debounce
|
||||
|
||||
return () => clearTimeout(timeoutId);
|
||||
};
|
||||
|
||||
checkDuplicates();
|
||||
}, [formData.title, formData.authorName, isAuthenticated]);
|
||||
|
||||
const handleInputChange = (field: string) => (
|
||||
e: React.ChangeEvent<HTMLInputElement | HTMLTextAreaElement>
|
||||
) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
[field]: e.target.value
|
||||
}));
|
||||
|
||||
// Clear error when user starts typing
|
||||
if (errors[field]) {
|
||||
setErrors(prev => ({ ...prev, [field]: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const handleContentChange = (html: string) => {
|
||||
setFormData(prev => ({ ...prev, contentHtml: html }));
|
||||
if (errors.contentHtml) {
|
||||
setErrors(prev => ({ ...prev, contentHtml: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const handleTagsChange = (tags: string[]) => {
|
||||
setFormData(prev => ({ ...prev, tags }));
|
||||
};
|
||||
|
||||
const handleAuthorChange = (authorName: string, authorId?: string) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
authorName,
|
||||
authorId: authorId // This will be undefined if creating new author, which clears the existing ID
|
||||
}));
|
||||
|
||||
// Clear error when user changes author
|
||||
if (errors.authorName) {
|
||||
setErrors(prev => ({ ...prev, authorName: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const handleSeriesChange = (seriesName: string, seriesId?: string) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
seriesName,
|
||||
seriesId: seriesId // This will be undefined if creating new series, which clears the existing ID
|
||||
}));
|
||||
|
||||
// Clear error when user changes series
|
||||
if (errors.seriesName) {
|
||||
setErrors(prev => ({ ...prev, seriesName: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const validateForm = () => {
|
||||
const newErrors: Record<string, string> = {};
|
||||
|
||||
if (!formData.title.trim()) {
|
||||
newErrors.title = 'Title is required';
|
||||
}
|
||||
|
||||
if (!formData.authorName.trim()) {
|
||||
newErrors.authorName = 'Author name is required';
|
||||
}
|
||||
|
||||
if (!formData.contentHtml.trim()) {
|
||||
newErrors.contentHtml = 'Story content is required';
|
||||
}
|
||||
|
||||
if (formData.seriesName && !formData.volume) {
|
||||
newErrors.volume = 'Volume number is required when series is specified';
|
||||
}
|
||||
|
||||
if (formData.volume && !formData.seriesName.trim()) {
|
||||
newErrors.seriesName = 'Series name is required when volume is specified';
|
||||
}
|
||||
|
||||
setErrors(newErrors);
|
||||
return Object.keys(newErrors).length === 0;
|
||||
};
|
||||
|
||||
// Helper function to detect external images in HTML content
|
||||
const hasExternalImages = (htmlContent: string): boolean => {
|
||||
if (!htmlContent) return false;
|
||||
|
||||
// Create a temporary DOM element to parse HTML
|
||||
const tempDiv = document.createElement('div');
|
||||
tempDiv.innerHTML = htmlContent;
|
||||
|
||||
const images = tempDiv.querySelectorAll('img');
|
||||
for (let i = 0; i < images.length; i++) {
|
||||
const img = images[i];
|
||||
const src = img.getAttribute('src');
|
||||
if (src && (src.startsWith('http://') || src.startsWith('https://'))) {
|
||||
return true;
|
||||
}
|
||||
}
|
||||
return false;
|
||||
};
|
||||
|
||||
const handleSubmit = async (e: React.FormEvent) => {
|
||||
e.preventDefault();
|
||||
|
||||
if (!validateForm()) {
|
||||
return;
|
||||
}
|
||||
|
||||
setLoading(true);
|
||||
|
||||
try {
|
||||
// First, create the story with JSON data
|
||||
const storyData = {
|
||||
title: formData.title,
|
||||
summary: formData.summary || undefined,
|
||||
contentHtml: formData.contentHtml,
|
||||
sourceUrl: formData.sourceUrl || undefined,
|
||||
volume: formData.seriesName ? parseInt(formData.volume) : undefined,
|
||||
// Send seriesId if we have it (existing series), otherwise send seriesName (new series)
|
||||
...(formData.seriesId ? { seriesId: formData.seriesId } : { seriesName: formData.seriesName || undefined }),
|
||||
// Send authorId if we have it (existing author), otherwise send authorName (new author)
|
||||
...(formData.authorId ? { authorId: formData.authorId } : { authorName: formData.authorName }),
|
||||
tagNames: formData.tags.length > 0 ? formData.tags : undefined,
|
||||
};
|
||||
|
||||
const story = await storyApi.createStory(storyData);
|
||||
|
||||
// Process images if there are external images in the content
|
||||
if (hasExternalImages(formData.contentHtml)) {
|
||||
try {
|
||||
setProcessingImages(true);
|
||||
const imageResult = await storyApi.processContentImages(story.id, formData.contentHtml);
|
||||
|
||||
// If images were processed and content was updated, save the updated content
|
||||
if (imageResult.processedContent !== formData.contentHtml) {
|
||||
await storyApi.updateStory(story.id, {
|
||||
title: formData.title,
|
||||
summary: formData.summary || undefined,
|
||||
contentHtml: imageResult.processedContent,
|
||||
sourceUrl: formData.sourceUrl || undefined,
|
||||
volume: formData.seriesName ? parseInt(formData.volume) : undefined,
|
||||
...(formData.seriesId ? { seriesId: formData.seriesId } : { seriesName: formData.seriesName || undefined }),
|
||||
...(formData.authorId ? { authorId: formData.authorId } : { authorName: formData.authorName }),
|
||||
tagNames: formData.tags.length > 0 ? formData.tags : undefined,
|
||||
});
|
||||
|
||||
// Show success message with image processing info
|
||||
if (imageResult.downloadedImages.length > 0) {
|
||||
console.log(`Successfully processed ${imageResult.downloadedImages.length} images`);
|
||||
}
|
||||
if (imageResult.warnings && imageResult.warnings.length > 0) {
|
||||
console.warn('Image processing warnings:', imageResult.warnings);
|
||||
}
|
||||
}
|
||||
} catch (imageError) {
|
||||
console.error('Failed to process images:', imageError);
|
||||
// Don't fail the entire operation if image processing fails
|
||||
// The story was created successfully, just without processed images
|
||||
} finally {
|
||||
setProcessingImages(false);
|
||||
}
|
||||
}
|
||||
|
||||
// If there's a cover image, upload it separately
|
||||
if (coverImage) {
|
||||
await storyApi.uploadCover(story.id, coverImage);
|
||||
}
|
||||
|
||||
router.push(`/stories/${story.id}/detail`);
|
||||
} catch (error: any) {
|
||||
console.error('Failed to create story:', error);
|
||||
const errorMessage = error.response?.data?.message || 'Failed to create story';
|
||||
setErrors({ submit: errorMessage });
|
||||
} finally {
|
||||
setLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
return (
|
||||
<ImportLayout
|
||||
title="Add New Story"
|
||||
description="Add a story to your personal collection"
|
||||
>
|
||||
{/* Success Message */}
|
||||
{errors.success && (
|
||||
<div className="p-4 bg-green-50 dark:bg-green-900/20 border border-green-200 dark:border-green-800 rounded-lg mb-6">
|
||||
<p className="text-green-800 dark:text-green-200">{errors.success}</p>
|
||||
<Suspense fallback={
|
||||
<div className="flex items-center justify-center py-20">
|
||||
<LoadingSpinner size="lg" />
|
||||
</div>
|
||||
)}
|
||||
|
||||
<form onSubmit={handleSubmit} className="space-y-6">
|
||||
{/* Title */}
|
||||
<Input
|
||||
label="Title *"
|
||||
value={formData.title}
|
||||
onChange={handleInputChange('title')}
|
||||
placeholder="Enter the story title"
|
||||
error={errors.title}
|
||||
required
|
||||
/>
|
||||
|
||||
{/* Author Selector */}
|
||||
<AuthorSelector
|
||||
label="Author *"
|
||||
value={formData.authorName}
|
||||
onChange={handleAuthorChange}
|
||||
placeholder="Select or enter author name"
|
||||
error={errors.authorName}
|
||||
required
|
||||
/>
|
||||
|
||||
{/* Duplicate Warning */}
|
||||
{duplicateWarning.show && (
|
||||
<div className="p-4 bg-yellow-50 dark:bg-yellow-900/20 border border-yellow-200 dark:border-yellow-800 rounded-lg">
|
||||
<div className="flex items-start gap-3">
|
||||
<div className="text-yellow-600 dark:text-yellow-400 mt-0.5">
|
||||
⚠️
|
||||
</div>
|
||||
<div>
|
||||
<h4 className="font-medium text-yellow-800 dark:text-yellow-200">
|
||||
Potential Duplicate Detected
|
||||
</h4>
|
||||
<p className="text-sm text-yellow-700 dark:text-yellow-300 mt-1">
|
||||
Found {duplicateWarning.count} existing {duplicateWarning.count === 1 ? 'story' : 'stories'} with the same title and author:
|
||||
</p>
|
||||
<ul className="mt-2 space-y-1">
|
||||
{duplicateWarning.duplicates.map((duplicate, index) => (
|
||||
<li key={duplicate.id} className="text-sm text-yellow-700 dark:text-yellow-300">
|
||||
• <span className="font-medium">{duplicate.title}</span> by {duplicate.authorName}
|
||||
<span className="text-xs ml-2">
|
||||
(added {new Date(duplicate.createdAt).toLocaleDateString()})
|
||||
</span>
|
||||
</li>
|
||||
))}
|
||||
</ul>
|
||||
<p className="text-xs text-yellow-600 dark:text-yellow-400 mt-2">
|
||||
You can still create this story if it's different from the existing ones.
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Checking indicator */}
|
||||
{checkingDuplicates && (
|
||||
<div className="flex items-center gap-2 text-sm theme-text">
|
||||
<div className="animate-spin w-4 h-4 border-2 border-theme-accent border-t-transparent rounded-full"></div>
|
||||
Checking for duplicates...
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Summary */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Summary
|
||||
</label>
|
||||
<Textarea
|
||||
value={formData.summary}
|
||||
onChange={handleInputChange('summary')}
|
||||
placeholder="Brief summary or description of the story..."
|
||||
rows={3}
|
||||
/>
|
||||
<p className="text-sm theme-text mt-1">
|
||||
Optional summary that will be displayed on the story detail page
|
||||
</p>
|
||||
</div>
|
||||
|
||||
{/* Cover Image Upload */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Cover Image
|
||||
</label>
|
||||
<ImageUpload
|
||||
onImageSelect={setCoverImage}
|
||||
accept="image/jpeg,image/png"
|
||||
maxSizeMB={5}
|
||||
aspectRatio="3:4"
|
||||
placeholder="Drop a cover image here or click to select"
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Content */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Story Content *
|
||||
</label>
|
||||
<RichTextEditor
|
||||
value={formData.contentHtml}
|
||||
onChange={handleContentChange}
|
||||
placeholder="Write or paste your story content here..."
|
||||
error={errors.contentHtml}
|
||||
enableImageProcessing={false}
|
||||
/>
|
||||
<p className="text-sm theme-text mt-2">
|
||||
💡 <strong>Tip:</strong> If you paste content with images, they'll be automatically downloaded and stored locally when you save the story.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
{/* Tags */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Tags
|
||||
</label>
|
||||
<TagInput
|
||||
tags={formData.tags}
|
||||
onChange={handleTagsChange}
|
||||
placeholder="Add tags to categorize your story..."
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Series and Volume */}
|
||||
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
|
||||
<SeriesSelector
|
||||
label="Series (optional)"
|
||||
value={formData.seriesName}
|
||||
onChange={handleSeriesChange}
|
||||
placeholder="Select or enter series name if part of a series"
|
||||
error={errors.seriesName}
|
||||
authorId={formData.authorId}
|
||||
/>
|
||||
|
||||
<Input
|
||||
label="Volume/Part (optional)"
|
||||
type="number"
|
||||
min="1"
|
||||
value={formData.volume}
|
||||
onChange={handleInputChange('volume')}
|
||||
placeholder="Enter volume/part number"
|
||||
error={errors.volume}
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Source URL */}
|
||||
<Input
|
||||
label="Source URL (optional)"
|
||||
type="url"
|
||||
value={formData.sourceUrl}
|
||||
onChange={handleInputChange('sourceUrl')}
|
||||
placeholder="https://example.com/original-story-url"
|
||||
/>
|
||||
|
||||
{/* Image Processing Indicator */}
|
||||
{processingImages && (
|
||||
<div className="p-4 bg-blue-50 dark:bg-blue-900/20 border border-blue-200 dark:border-blue-800 rounded-lg">
|
||||
<div className="flex items-center gap-3">
|
||||
<div className="animate-spin w-4 h-4 border-2 border-blue-500 border-t-transparent rounded-full"></div>
|
||||
<p className="text-blue-800 dark:text-blue-200">
|
||||
Processing and downloading images...
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Submit Error */}
|
||||
{errors.submit && (
|
||||
<div className="p-4 bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-lg">
|
||||
<p className="text-red-800 dark:text-red-200">{errors.submit}</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Actions */}
|
||||
<div className="flex justify-end gap-4 pt-6">
|
||||
<Button
|
||||
type="button"
|
||||
variant="ghost"
|
||||
onClick={() => router.back()}
|
||||
disabled={loading}
|
||||
>
|
||||
Cancel
|
||||
</Button>
|
||||
|
||||
<Button
|
||||
type="submit"
|
||||
loading={loading}
|
||||
disabled={!formData.title || !formData.authorName || !formData.contentHtml}
|
||||
>
|
||||
{processingImages ? 'Processing Images...' : 'Add Story'}
|
||||
</Button>
|
||||
</div>
|
||||
</form>
|
||||
}>
|
||||
<AddStoryContent />
|
||||
</Suspense>
|
||||
</ImportLayout>
|
||||
);
|
||||
}
|
||||
@@ -22,7 +22,7 @@ export default function AuthorsPage() {
|
||||
const [currentPage, setCurrentPage] = useState(0);
|
||||
const [totalHits, setTotalHits] = useState(0);
|
||||
const [hasMore, setHasMore] = useState(false);
|
||||
const ITEMS_PER_PAGE = 50; // Safe limit under Typesense's 250 limit
|
||||
const ITEMS_PER_PAGE = 50;
|
||||
|
||||
useEffect(() => {
|
||||
const debounceTimer = setTimeout(() => {
|
||||
@@ -35,12 +35,14 @@ export default function AuthorsPage() {
|
||||
} else {
|
||||
setSearchLoading(true);
|
||||
}
|
||||
const searchResults = await authorApi.searchAuthorsTypesense({
|
||||
q: searchQuery || '*',
|
||||
|
||||
// Use Solr search for all queries (including empty search)
|
||||
const searchResults = await authorApi.searchAuthors({
|
||||
query: searchQuery || '*', // Use '*' for all authors when no search query
|
||||
page: currentPage,
|
||||
size: ITEMS_PER_PAGE,
|
||||
sortBy: sortBy,
|
||||
sortOrder: sortOrder
|
||||
sortDir: sortOrder
|
||||
});
|
||||
|
||||
if (currentPage === 0) {
|
||||
@@ -53,23 +55,11 @@ export default function AuthorsPage() {
|
||||
setFilteredAuthors(prev => [...prev, ...(searchResults.results || [])]);
|
||||
}
|
||||
|
||||
setTotalHits(searchResults.totalHits);
|
||||
setHasMore(searchResults.results.length === ITEMS_PER_PAGE && (currentPage + 1) * ITEMS_PER_PAGE < searchResults.totalHits);
|
||||
setTotalHits(searchResults.totalHits || 0);
|
||||
setHasMore((searchResults.results || []).length === ITEMS_PER_PAGE && (currentPage + 1) * ITEMS_PER_PAGE < (searchResults.totalHits || 0));
|
||||
|
||||
} catch (error) {
|
||||
console.error('Failed to load authors:', error);
|
||||
// Fallback to regular API if Typesense fails (only for first page)
|
||||
if (currentPage === 0) {
|
||||
try {
|
||||
const authorsResult = await authorApi.getAuthors({ page: 0, size: ITEMS_PER_PAGE });
|
||||
setAuthors(authorsResult.content || []);
|
||||
setFilteredAuthors(authorsResult.content || []);
|
||||
setTotalHits(authorsResult.totalElements || 0);
|
||||
setHasMore(authorsResult.content.length === ITEMS_PER_PAGE);
|
||||
} catch (fallbackError) {
|
||||
console.error('Fallback also failed:', fallbackError);
|
||||
}
|
||||
}
|
||||
console.error('Failed to search authors:', error);
|
||||
} finally {
|
||||
setLoading(false);
|
||||
setSearchLoading(false);
|
||||
@@ -95,7 +85,7 @@ export default function AuthorsPage() {
|
||||
}
|
||||
};
|
||||
|
||||
// Client-side filtering no longer needed since we use Typesense
|
||||
// No longer needed - Solr search handles filtering directly
|
||||
|
||||
// Note: We no longer have individual story ratings in the author list
|
||||
// Average rating would need to be calculated on backend if needed
|
||||
@@ -118,8 +108,7 @@ export default function AuthorsPage() {
|
||||
<div>
|
||||
<h1 className="text-3xl font-bold theme-header">Authors</h1>
|
||||
<p className="theme-text mt-1">
|
||||
{filteredAuthors.length} of {totalHits} {totalHits === 1 ? 'author' : 'authors'}
|
||||
{searchQuery ? ` found` : ` in your library`}
|
||||
{searchQuery ? `${totalHits} authors found` : `${totalHits} authors in your library`}
|
||||
{hasMore && ` (showing first ${filteredAuthors.length})`}
|
||||
</p>
|
||||
</div>
|
||||
@@ -218,7 +207,7 @@ export default function AuthorsPage() {
|
||||
)}
|
||||
|
||||
{/* Load More Button */}
|
||||
{hasMore && (
|
||||
{hasMore && !searchQuery && (
|
||||
<div className="flex justify-center pt-8">
|
||||
<Button
|
||||
onClick={loadMore}
|
||||
|
||||
@@ -139,6 +139,15 @@
|
||||
@apply max-w-full h-auto mx-auto my-6 rounded-lg shadow-sm;
|
||||
max-height: 80vh; /* Prevent images from being too tall */
|
||||
display: block;
|
||||
/* Optimize for performance and prevent reloading */
|
||||
will-change: auto;
|
||||
transform: translateZ(0); /* Force hardware acceleration */
|
||||
backface-visibility: hidden;
|
||||
image-rendering: optimizeQuality;
|
||||
/* Prevent layout shifts that might trigger reloads */
|
||||
box-sizing: border-box;
|
||||
/* Ensure stable dimensions */
|
||||
min-height: 1px;
|
||||
}
|
||||
|
||||
.reading-content img[align="left"] {
|
||||
|
||||
341
frontend/src/app/library/LibraryContent.tsx
Normal file
341
frontend/src/app/library/LibraryContent.tsx
Normal file
@@ -0,0 +1,341 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useEffect } from 'react';
|
||||
import { useRouter, useSearchParams } from 'next/navigation';
|
||||
import { searchApi, storyApi, tagApi } from '../../lib/api';
|
||||
import { Story, Tag, FacetCount, AdvancedFilters } from '../../types/api';
|
||||
import { Input } from '../../components/ui/Input';
|
||||
import Button from '../../components/ui/Button';
|
||||
import StoryMultiSelect from '../../components/stories/StoryMultiSelect';
|
||||
import TagFilter from '../../components/stories/TagFilter';
|
||||
import LoadingSpinner from '../../components/ui/LoadingSpinner';
|
||||
import SidebarLayout from '../../components/library/SidebarLayout';
|
||||
import ToolbarLayout from '../../components/library/ToolbarLayout';
|
||||
import MinimalLayout from '../../components/library/MinimalLayout';
|
||||
import { useLibraryLayout } from '../../hooks/useLibraryLayout';
|
||||
|
||||
type ViewMode = 'grid' | 'list';
|
||||
type SortOption = 'createdAt' | 'title' | 'authorName' | 'rating' | 'wordCount' | 'lastReadAt';
|
||||
|
||||
export default function LibraryContent() {
|
||||
const router = useRouter();
|
||||
const searchParams = useSearchParams();
|
||||
const { layout } = useLibraryLayout();
|
||||
const [stories, setStories] = useState<Story[]>([]);
|
||||
const [tags, setTags] = useState<Tag[]>([]);
|
||||
const [loading, setLoading] = useState(false);
|
||||
const [searchLoading, setSearchLoading] = useState(false);
|
||||
const [randomLoading, setRandomLoading] = useState(false);
|
||||
const [searchQuery, setSearchQuery] = useState('');
|
||||
const [selectedTags, setSelectedTags] = useState<string[]>([]);
|
||||
const [viewMode, setViewMode] = useState<ViewMode>('list');
|
||||
const [sortOption, setSortOption] = useState<SortOption>('lastReadAt');
|
||||
const [sortDirection, setSortDirection] = useState<'asc' | 'desc'>('desc');
|
||||
const [page, setPage] = useState(0);
|
||||
const [totalPages, setTotalPages] = useState(1);
|
||||
const [totalElements, setTotalElements] = useState(0);
|
||||
const [refreshTrigger, setRefreshTrigger] = useState(0);
|
||||
const [urlParamsProcessed, setUrlParamsProcessed] = useState(false);
|
||||
const [advancedFilters, setAdvancedFilters] = useState<AdvancedFilters>({});
|
||||
|
||||
// Initialize filters from URL parameters
|
||||
useEffect(() => {
|
||||
const tagsParam = searchParams.get('tags');
|
||||
if (tagsParam) {
|
||||
console.log('URL tag filter detected:', tagsParam);
|
||||
// Use functional updates to ensure all state changes happen together
|
||||
setSelectedTags([tagsParam]);
|
||||
setPage(0); // Reset to first page when applying URL filter
|
||||
}
|
||||
setUrlParamsProcessed(true);
|
||||
}, [searchParams]);
|
||||
|
||||
// Convert facet counts to Tag objects for the UI, enriched with full tag data
|
||||
const [fullTags, setFullTags] = useState<Tag[]>([]);
|
||||
|
||||
// Fetch full tag data for enrichment
|
||||
useEffect(() => {
|
||||
const fetchFullTags = async () => {
|
||||
try {
|
||||
const result = await tagApi.getTags({ size: 1000 }); // Get all tags
|
||||
setFullTags(result.content || []);
|
||||
} catch (error) {
|
||||
console.error('Failed to fetch full tag data:', error);
|
||||
setFullTags([]);
|
||||
}
|
||||
};
|
||||
|
||||
fetchFullTags();
|
||||
}, []);
|
||||
|
||||
const convertFacetsToTags = (facets?: Record<string, FacetCount[]>): Tag[] => {
|
||||
if (!facets || !facets.tagNames_facet) {
|
||||
return [];
|
||||
}
|
||||
|
||||
return facets.tagNames_facet.map(facet => {
|
||||
// Find the full tag data by name
|
||||
const fullTag = fullTags.find(tag => tag.name.toLowerCase() === facet.value.toLowerCase());
|
||||
|
||||
return {
|
||||
id: fullTag?.id || facet.value, // Use actual ID if available, fallback to name
|
||||
name: facet.value,
|
||||
storyCount: facet.count,
|
||||
// Include color and other metadata from the full tag data
|
||||
color: fullTag?.color,
|
||||
description: fullTag?.description,
|
||||
aliasCount: fullTag?.aliasCount,
|
||||
createdAt: fullTag?.createdAt,
|
||||
aliases: fullTag?.aliases
|
||||
};
|
||||
});
|
||||
};
|
||||
|
||||
// Enrich existing tags when fullTags are loaded
|
||||
useEffect(() => {
|
||||
if (fullTags.length > 0) {
|
||||
// Use functional update to get the current tags state
|
||||
setTags(currentTags => {
|
||||
if (currentTags.length > 0) {
|
||||
// Check if tags already have color data to avoid infinite loops
|
||||
const hasColors = currentTags.some(tag => tag.color);
|
||||
if (!hasColors) {
|
||||
// Re-enrich existing tags with color data
|
||||
return currentTags.map(tag => {
|
||||
const fullTag = fullTags.find(ft => ft.name.toLowerCase() === tag.name.toLowerCase());
|
||||
return {
|
||||
...tag,
|
||||
color: fullTag?.color,
|
||||
description: fullTag?.description,
|
||||
aliasCount: fullTag?.aliasCount,
|
||||
createdAt: fullTag?.createdAt,
|
||||
aliases: fullTag?.aliases,
|
||||
id: fullTag?.id || tag.id
|
||||
};
|
||||
});
|
||||
}
|
||||
}
|
||||
return currentTags; // Return unchanged if no enrichment needed
|
||||
});
|
||||
}
|
||||
}, [fullTags]); // Only run when fullTags change
|
||||
|
||||
// Debounce search to avoid too many API calls
|
||||
useEffect(() => {
|
||||
// Don't run search until URL parameters have been processed
|
||||
if (!urlParamsProcessed) return;
|
||||
|
||||
const debounceTimer = setTimeout(() => {
|
||||
const performSearch = async () => {
|
||||
try {
|
||||
// Use searchLoading for background search, loading only for initial load
|
||||
const isInitialLoad = stories.length === 0 && !searchQuery;
|
||||
if (isInitialLoad) {
|
||||
setLoading(true);
|
||||
} else {
|
||||
setSearchLoading(true);
|
||||
}
|
||||
|
||||
// Always use search API for consistency - use '*' for match-all when no query
|
||||
const apiParams = {
|
||||
query: searchQuery.trim() || '*',
|
||||
page: page, // Use 0-based pagination consistently
|
||||
size: 20,
|
||||
tags: selectedTags.length > 0 ? selectedTags : undefined,
|
||||
sortBy: sortOption,
|
||||
sortDir: sortDirection,
|
||||
facetBy: ['tagNames'], // Request tag facets for the filter UI
|
||||
// Advanced filters
|
||||
...advancedFilters
|
||||
};
|
||||
|
||||
console.log('Performing search with params:', apiParams);
|
||||
const result = await searchApi.search(apiParams);
|
||||
|
||||
const currentStories = result?.results || [];
|
||||
setStories(currentStories);
|
||||
setTotalPages(Math.ceil((result?.totalHits || 0) / 20));
|
||||
setTotalElements(result?.totalHits || 0);
|
||||
|
||||
// Update tags from facets - these represent all matching stories, not just current page
|
||||
const resultTags = convertFacetsToTags(result?.facets);
|
||||
setTags(resultTags);
|
||||
|
||||
} catch (error) {
|
||||
console.error('Failed to load stories:', error);
|
||||
setStories([]);
|
||||
setTags([]);
|
||||
} finally {
|
||||
setLoading(false);
|
||||
setSearchLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
performSearch();
|
||||
}, searchQuery ? 500 : 0); // Debounce search queries, but load immediately for filters/pagination
|
||||
|
||||
return () => clearTimeout(debounceTimer);
|
||||
}, [searchQuery, selectedTags, sortOption, sortDirection, page, refreshTrigger, urlParamsProcessed, advancedFilters]);
|
||||
|
||||
const handleSearchChange = (e: React.ChangeEvent<HTMLInputElement>) => {
|
||||
setSearchQuery(e.target.value);
|
||||
setPage(0);
|
||||
};
|
||||
|
||||
const handleStoryUpdate = () => {
|
||||
setRefreshTrigger(prev => prev + 1);
|
||||
};
|
||||
|
||||
const handleRandomStory = async () => {
|
||||
if (totalElements === 0) return;
|
||||
|
||||
try {
|
||||
setRandomLoading(true);
|
||||
const randomStory = await storyApi.getRandomStory({
|
||||
searchQuery: searchQuery || undefined,
|
||||
tags: selectedTags.length > 0 ? selectedTags : undefined,
|
||||
...advancedFilters
|
||||
});
|
||||
if (randomStory) {
|
||||
router.push(`/stories/${randomStory.id}`);
|
||||
} else {
|
||||
alert('No stories available. Please add some stories first.');
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Failed to get random story:', error);
|
||||
alert('Failed to get a random story. Please try again.');
|
||||
} finally {
|
||||
setRandomLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
const clearFilters = () => {
|
||||
setSearchQuery('');
|
||||
setSelectedTags([]);
|
||||
setAdvancedFilters({});
|
||||
setPage(0);
|
||||
setRefreshTrigger(prev => prev + 1);
|
||||
};
|
||||
|
||||
const handleTagToggle = (tagName: string) => {
|
||||
setSelectedTags(prev =>
|
||||
prev.includes(tagName)
|
||||
? prev.filter(t => t !== tagName)
|
||||
: [...prev, tagName]
|
||||
);
|
||||
setPage(0);
|
||||
setRefreshTrigger(prev => prev + 1);
|
||||
};
|
||||
|
||||
const handleSortDirectionToggle = () => {
|
||||
setSortDirection(prev => prev === 'asc' ? 'desc' : 'asc');
|
||||
};
|
||||
|
||||
const handleAdvancedFiltersChange = (filters: AdvancedFilters) => {
|
||||
setAdvancedFilters(filters);
|
||||
setPage(0);
|
||||
setRefreshTrigger(prev => prev + 1);
|
||||
};
|
||||
|
||||
if (loading) {
|
||||
return (
|
||||
<div className="flex items-center justify-center py-20">
|
||||
<LoadingSpinner size="lg" />
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
const handleSortChange = (option: string) => {
|
||||
setSortOption(option as SortOption);
|
||||
};
|
||||
|
||||
const layoutProps = {
|
||||
stories,
|
||||
tags,
|
||||
totalElements,
|
||||
searchQuery,
|
||||
selectedTags,
|
||||
viewMode,
|
||||
sortOption,
|
||||
sortDirection,
|
||||
advancedFilters,
|
||||
onSearchChange: handleSearchChange,
|
||||
onTagToggle: handleTagToggle,
|
||||
onViewModeChange: setViewMode,
|
||||
onSortChange: handleSortChange,
|
||||
onSortDirectionToggle: handleSortDirectionToggle,
|
||||
onAdvancedFiltersChange: handleAdvancedFiltersChange,
|
||||
onRandomStory: handleRandomStory,
|
||||
onClearFilters: clearFilters,
|
||||
};
|
||||
|
||||
const renderContent = () => {
|
||||
if (stories.length === 0 && !loading) {
|
||||
return (
|
||||
<div className="text-center py-12 theme-card theme-shadow rounded-lg">
|
||||
<p className="theme-text text-lg mb-4">
|
||||
{searchQuery || selectedTags.length > 0 || Object.values(advancedFilters).some(v => v !== undefined && v !== '' && v !== 'all' && v !== false)
|
||||
? 'No stories match your search criteria.'
|
||||
: 'Your library is empty.'
|
||||
}
|
||||
</p>
|
||||
{searchQuery || selectedTags.length > 0 || Object.values(advancedFilters).some(v => v !== undefined && v !== '' && v !== 'all' && v !== false) ? (
|
||||
<Button variant="ghost" onClick={clearFilters}>
|
||||
Clear Filters
|
||||
</Button>
|
||||
) : (
|
||||
<Button href="/add-story">
|
||||
Add Your First Story
|
||||
</Button>
|
||||
)}
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
return (
|
||||
<>
|
||||
<StoryMultiSelect
|
||||
stories={stories}
|
||||
viewMode={viewMode}
|
||||
onUpdate={handleStoryUpdate}
|
||||
allowMultiSelect={true}
|
||||
/>
|
||||
|
||||
{/* Pagination */}
|
||||
{totalPages > 1 && (
|
||||
<div className="flex justify-center gap-2 mt-8">
|
||||
<Button
|
||||
variant="ghost"
|
||||
onClick={() => setPage(page - 1)}
|
||||
disabled={page === 0}
|
||||
>
|
||||
Previous
|
||||
</Button>
|
||||
|
||||
<span className="flex items-center px-4 py-2 theme-text">
|
||||
Page {page + 1} of {totalPages}
|
||||
</span>
|
||||
|
||||
<Button
|
||||
variant="ghost"
|
||||
onClick={() => setPage(page + 1)}
|
||||
disabled={page >= totalPages - 1}
|
||||
>
|
||||
Next
|
||||
</Button>
|
||||
</div>
|
||||
)}
|
||||
</>
|
||||
);
|
||||
};
|
||||
|
||||
const LayoutComponent = layout === 'sidebar' ? SidebarLayout :
|
||||
layout === 'toolbar' ? ToolbarLayout :
|
||||
MinimalLayout;
|
||||
|
||||
return (
|
||||
<LayoutComponent {...layoutProps}>
|
||||
{renderContent()}
|
||||
</LayoutComponent>
|
||||
);
|
||||
}
|
||||
@@ -1,346 +1,20 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useEffect } from 'react';
|
||||
import { useRouter, useSearchParams } from 'next/navigation';
|
||||
import { searchApi, storyApi, tagApi } from '../../lib/api';
|
||||
import { Story, Tag, FacetCount, AdvancedFilters } from '../../types/api';
|
||||
import { Suspense } from 'react';
|
||||
import AppLayout from '../../components/layout/AppLayout';
|
||||
import { Input } from '../../components/ui/Input';
|
||||
import Button from '../../components/ui/Button';
|
||||
import StoryMultiSelect from '../../components/stories/StoryMultiSelect';
|
||||
import TagFilter from '../../components/stories/TagFilter';
|
||||
import LoadingSpinner from '../../components/ui/LoadingSpinner';
|
||||
import SidebarLayout from '../../components/library/SidebarLayout';
|
||||
import ToolbarLayout from '../../components/library/ToolbarLayout';
|
||||
import MinimalLayout from '../../components/library/MinimalLayout';
|
||||
import { useLibraryLayout } from '../../hooks/useLibraryLayout';
|
||||
|
||||
type ViewMode = 'grid' | 'list';
|
||||
type SortOption = 'createdAt' | 'title' | 'authorName' | 'rating' | 'wordCount' | 'lastRead';
|
||||
import LibraryContent from './LibraryContent';
|
||||
|
||||
export default function LibraryPage() {
|
||||
const router = useRouter();
|
||||
const searchParams = useSearchParams();
|
||||
const { layout } = useLibraryLayout();
|
||||
const [stories, setStories] = useState<Story[]>([]);
|
||||
const [tags, setTags] = useState<Tag[]>([]);
|
||||
const [loading, setLoading] = useState(false);
|
||||
const [searchLoading, setSearchLoading] = useState(false);
|
||||
const [randomLoading, setRandomLoading] = useState(false);
|
||||
const [searchQuery, setSearchQuery] = useState('');
|
||||
const [selectedTags, setSelectedTags] = useState<string[]>([]);
|
||||
const [viewMode, setViewMode] = useState<ViewMode>('list');
|
||||
const [sortOption, setSortOption] = useState<SortOption>('lastRead');
|
||||
const [sortDirection, setSortDirection] = useState<'asc' | 'desc'>('desc');
|
||||
const [page, setPage] = useState(0);
|
||||
const [totalPages, setTotalPages] = useState(1);
|
||||
const [totalElements, setTotalElements] = useState(0);
|
||||
const [refreshTrigger, setRefreshTrigger] = useState(0);
|
||||
const [urlParamsProcessed, setUrlParamsProcessed] = useState(false);
|
||||
const [advancedFilters, setAdvancedFilters] = useState<AdvancedFilters>({});
|
||||
|
||||
// Initialize filters from URL parameters
|
||||
useEffect(() => {
|
||||
const tagsParam = searchParams.get('tags');
|
||||
if (tagsParam) {
|
||||
console.log('URL tag filter detected:', tagsParam);
|
||||
// Use functional updates to ensure all state changes happen together
|
||||
setSelectedTags([tagsParam]);
|
||||
setPage(0); // Reset to first page when applying URL filter
|
||||
}
|
||||
setUrlParamsProcessed(true);
|
||||
}, [searchParams]);
|
||||
|
||||
// Convert facet counts to Tag objects for the UI, enriched with full tag data
|
||||
const [fullTags, setFullTags] = useState<Tag[]>([]);
|
||||
|
||||
// Fetch full tag data for enrichment
|
||||
useEffect(() => {
|
||||
const fetchFullTags = async () => {
|
||||
try {
|
||||
const result = await tagApi.getTags({ size: 1000 }); // Get all tags
|
||||
setFullTags(result.content || []);
|
||||
} catch (error) {
|
||||
console.error('Failed to fetch full tag data:', error);
|
||||
setFullTags([]);
|
||||
}
|
||||
};
|
||||
|
||||
fetchFullTags();
|
||||
}, []);
|
||||
|
||||
const convertFacetsToTags = (facets?: Record<string, FacetCount[]>): Tag[] => {
|
||||
if (!facets || !facets.tagNames) {
|
||||
return [];
|
||||
}
|
||||
|
||||
return facets.tagNames.map(facet => {
|
||||
// Find the full tag data by name
|
||||
const fullTag = fullTags.find(tag => tag.name.toLowerCase() === facet.value.toLowerCase());
|
||||
|
||||
return {
|
||||
id: fullTag?.id || facet.value, // Use actual ID if available, fallback to name
|
||||
name: facet.value,
|
||||
storyCount: facet.count,
|
||||
// Include color and other metadata from the full tag data
|
||||
color: fullTag?.color,
|
||||
description: fullTag?.description,
|
||||
aliasCount: fullTag?.aliasCount,
|
||||
createdAt: fullTag?.createdAt,
|
||||
aliases: fullTag?.aliases
|
||||
};
|
||||
});
|
||||
};
|
||||
|
||||
// Enrich existing tags when fullTags are loaded
|
||||
useEffect(() => {
|
||||
if (fullTags.length > 0) {
|
||||
// Use functional update to get the current tags state
|
||||
setTags(currentTags => {
|
||||
if (currentTags.length > 0) {
|
||||
// Check if tags already have color data to avoid infinite loops
|
||||
const hasColors = currentTags.some(tag => tag.color);
|
||||
if (!hasColors) {
|
||||
// Re-enrich existing tags with color data
|
||||
return currentTags.map(tag => {
|
||||
const fullTag = fullTags.find(ft => ft.name.toLowerCase() === tag.name.toLowerCase());
|
||||
return {
|
||||
...tag,
|
||||
color: fullTag?.color,
|
||||
description: fullTag?.description,
|
||||
aliasCount: fullTag?.aliasCount,
|
||||
createdAt: fullTag?.createdAt,
|
||||
aliases: fullTag?.aliases,
|
||||
id: fullTag?.id || tag.id
|
||||
};
|
||||
});
|
||||
}
|
||||
}
|
||||
return currentTags; // Return unchanged if no enrichment needed
|
||||
});
|
||||
}
|
||||
}, [fullTags]); // Only run when fullTags change
|
||||
|
||||
// Debounce search to avoid too many API calls
|
||||
useEffect(() => {
|
||||
// Don't run search until URL parameters have been processed
|
||||
if (!urlParamsProcessed) return;
|
||||
|
||||
const debounceTimer = setTimeout(() => {
|
||||
const performSearch = async () => {
|
||||
try {
|
||||
// Use searchLoading for background search, loading only for initial load
|
||||
const isInitialLoad = stories.length === 0 && !searchQuery;
|
||||
if (isInitialLoad) {
|
||||
setLoading(true);
|
||||
} else {
|
||||
setSearchLoading(true);
|
||||
}
|
||||
|
||||
// Always use search API for consistency - use '*' for match-all when no query
|
||||
const apiParams = {
|
||||
query: searchQuery.trim() || '*',
|
||||
page: page, // Use 0-based pagination consistently
|
||||
size: 20,
|
||||
tags: selectedTags.length > 0 ? selectedTags : undefined,
|
||||
sortBy: sortOption,
|
||||
sortDir: sortDirection,
|
||||
facetBy: ['tagNames'], // Request tag facets for the filter UI
|
||||
// Advanced filters
|
||||
...advancedFilters
|
||||
};
|
||||
|
||||
console.log('Performing search with params:', apiParams);
|
||||
const result = await searchApi.search(apiParams);
|
||||
|
||||
const currentStories = result?.results || [];
|
||||
setStories(currentStories);
|
||||
setTotalPages(Math.ceil((result?.totalHits || 0) / 20));
|
||||
setTotalElements(result?.totalHits || 0);
|
||||
|
||||
// Update tags from facets - these represent all matching stories, not just current page
|
||||
const resultTags = convertFacetsToTags(result?.facets);
|
||||
setTags(resultTags);
|
||||
|
||||
} catch (error) {
|
||||
console.error('Failed to load stories:', error);
|
||||
setStories([]);
|
||||
setTags([]);
|
||||
} finally {
|
||||
setLoading(false);
|
||||
setSearchLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
performSearch();
|
||||
}, searchQuery ? 500 : 0); // Debounce search queries, but load immediately for filters/pagination
|
||||
|
||||
return () => clearTimeout(debounceTimer);
|
||||
}, [searchQuery, selectedTags, sortOption, sortDirection, page, refreshTrigger, urlParamsProcessed, advancedFilters]);
|
||||
|
||||
const handleSearchChange = (e: React.ChangeEvent<HTMLInputElement>) => {
|
||||
setSearchQuery(e.target.value);
|
||||
setPage(0);
|
||||
};
|
||||
|
||||
const handleStoryUpdate = () => {
|
||||
setRefreshTrigger(prev => prev + 1);
|
||||
};
|
||||
|
||||
const handleRandomStory = async () => {
|
||||
if (totalElements === 0) return;
|
||||
|
||||
try {
|
||||
setRandomLoading(true);
|
||||
const randomStory = await storyApi.getRandomStory({
|
||||
searchQuery: searchQuery || undefined,
|
||||
tags: selectedTags.length > 0 ? selectedTags : undefined,
|
||||
...advancedFilters
|
||||
});
|
||||
if (randomStory) {
|
||||
router.push(`/stories/${randomStory.id}`);
|
||||
} else {
|
||||
alert('No stories available. Please add some stories first.');
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Failed to get random story:', error);
|
||||
alert('Failed to get a random story. Please try again.');
|
||||
} finally {
|
||||
setRandomLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
const clearFilters = () => {
|
||||
setSearchQuery('');
|
||||
setSelectedTags([]);
|
||||
setAdvancedFilters({});
|
||||
setPage(0);
|
||||
setRefreshTrigger(prev => prev + 1);
|
||||
};
|
||||
|
||||
const handleTagToggle = (tagName: string) => {
|
||||
setSelectedTags(prev =>
|
||||
prev.includes(tagName)
|
||||
? prev.filter(t => t !== tagName)
|
||||
: [...prev, tagName]
|
||||
);
|
||||
setPage(0);
|
||||
setRefreshTrigger(prev => prev + 1);
|
||||
};
|
||||
|
||||
const handleSortDirectionToggle = () => {
|
||||
setSortDirection(prev => prev === 'asc' ? 'desc' : 'asc');
|
||||
};
|
||||
|
||||
const handleAdvancedFiltersChange = (filters: AdvancedFilters) => {
|
||||
setAdvancedFilters(filters);
|
||||
setPage(0);
|
||||
setRefreshTrigger(prev => prev + 1);
|
||||
};
|
||||
|
||||
if (loading) {
|
||||
return (
|
||||
<AppLayout>
|
||||
return (
|
||||
<AppLayout>
|
||||
<Suspense fallback={
|
||||
<div className="flex items-center justify-center py-20">
|
||||
<LoadingSpinner size="lg" />
|
||||
</div>
|
||||
</AppLayout>
|
||||
);
|
||||
}
|
||||
|
||||
const handleSortChange = (option: string) => {
|
||||
setSortOption(option as SortOption);
|
||||
};
|
||||
|
||||
const layoutProps = {
|
||||
stories,
|
||||
tags,
|
||||
totalElements,
|
||||
searchQuery,
|
||||
selectedTags,
|
||||
viewMode,
|
||||
sortOption,
|
||||
sortDirection,
|
||||
advancedFilters,
|
||||
onSearchChange: handleSearchChange,
|
||||
onTagToggle: handleTagToggle,
|
||||
onViewModeChange: setViewMode,
|
||||
onSortChange: handleSortChange,
|
||||
onSortDirectionToggle: handleSortDirectionToggle,
|
||||
onAdvancedFiltersChange: handleAdvancedFiltersChange,
|
||||
onRandomStory: handleRandomStory,
|
||||
onClearFilters: clearFilters,
|
||||
};
|
||||
|
||||
const renderContent = () => {
|
||||
if (stories.length === 0 && !loading) {
|
||||
return (
|
||||
<div className="text-center py-12 theme-card theme-shadow rounded-lg">
|
||||
<p className="theme-text text-lg mb-4">
|
||||
{searchQuery || selectedTags.length > 0 || Object.values(advancedFilters).some(v => v !== undefined && v !== '' && v !== 'all' && v !== false)
|
||||
? 'No stories match your search criteria.'
|
||||
: 'Your library is empty.'
|
||||
}
|
||||
</p>
|
||||
{searchQuery || selectedTags.length > 0 || Object.values(advancedFilters).some(v => v !== undefined && v !== '' && v !== 'all' && v !== false) ? (
|
||||
<Button variant="ghost" onClick={clearFilters}>
|
||||
Clear Filters
|
||||
</Button>
|
||||
) : (
|
||||
<Button href="/add-story">
|
||||
Add Your First Story
|
||||
</Button>
|
||||
)}
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
return (
|
||||
<>
|
||||
<StoryMultiSelect
|
||||
stories={stories}
|
||||
viewMode={viewMode}
|
||||
onUpdate={handleStoryUpdate}
|
||||
allowMultiSelect={true}
|
||||
/>
|
||||
|
||||
{/* Pagination */}
|
||||
{totalPages > 1 && (
|
||||
<div className="flex justify-center gap-2 mt-8">
|
||||
<Button
|
||||
variant="ghost"
|
||||
onClick={() => setPage(page - 1)}
|
||||
disabled={page === 0}
|
||||
>
|
||||
Previous
|
||||
</Button>
|
||||
|
||||
<span className="flex items-center px-4 py-2 theme-text">
|
||||
Page {page + 1} of {totalPages}
|
||||
</span>
|
||||
|
||||
<Button
|
||||
variant="ghost"
|
||||
onClick={() => setPage(page + 1)}
|
||||
disabled={page >= totalPages - 1}
|
||||
>
|
||||
Next
|
||||
</Button>
|
||||
</div>
|
||||
)}
|
||||
</>
|
||||
);
|
||||
};
|
||||
|
||||
const LayoutComponent = layout === 'sidebar' ? SidebarLayout :
|
||||
layout === 'toolbar' ? ToolbarLayout :
|
||||
MinimalLayout;
|
||||
|
||||
return (
|
||||
<AppLayout>
|
||||
<LayoutComponent {...layoutProps}>
|
||||
{renderContent()}
|
||||
</LayoutComponent>
|
||||
}>
|
||||
<LibraryContent />
|
||||
</Suspense>
|
||||
</AppLayout>
|
||||
);
|
||||
}
|
||||
@@ -1,27 +1,9 @@
|
||||
import { NextRequest } from 'next/server';
|
||||
import { progressStore, type ProgressUpdate } from '../../../../lib/progress';
|
||||
|
||||
// Configure route timeout for long-running progress streams
|
||||
export const maxDuration = 900; // 15 minutes (900 seconds)
|
||||
|
||||
interface ProgressUpdate {
|
||||
type: 'progress' | 'completed' | 'error';
|
||||
current: number;
|
||||
total: number;
|
||||
message: string;
|
||||
url?: string;
|
||||
title?: string;
|
||||
author?: string;
|
||||
wordCount?: number;
|
||||
totalWordCount?: number;
|
||||
error?: string;
|
||||
combinedStory?: any;
|
||||
results?: any[];
|
||||
summary?: any;
|
||||
}
|
||||
|
||||
// Global progress storage (in production, use Redis or database)
|
||||
const progressStore = new Map<string, ProgressUpdate[]>();
|
||||
|
||||
export async function GET(request: NextRequest) {
|
||||
const searchParams = request.nextUrl.searchParams;
|
||||
const sessionId = searchParams.get('sessionId');
|
||||
@@ -81,13 +63,3 @@ export async function GET(request: NextRequest) {
|
||||
});
|
||||
}
|
||||
|
||||
// Helper function for other routes to send progress updates
|
||||
export function sendProgressUpdate(sessionId: string, update: ProgressUpdate) {
|
||||
if (!progressStore.has(sessionId)) {
|
||||
progressStore.set(sessionId, []);
|
||||
}
|
||||
progressStore.get(sessionId)!.push(update);
|
||||
}
|
||||
|
||||
// Export the helper for other modules to use
|
||||
export { progressStore };
|
||||
@@ -4,15 +4,7 @@ import { NextRequest, NextResponse } from 'next/server';
|
||||
export const maxDuration = 900; // 15 minutes (900 seconds)
|
||||
|
||||
// Import progress tracking helper
|
||||
async function sendProgressUpdate(sessionId: string, update: any) {
|
||||
try {
|
||||
// Dynamic import to avoid circular dependency
|
||||
const { sendProgressUpdate: sendUpdate } = await import('./progress/route');
|
||||
sendUpdate(sessionId, update);
|
||||
} catch (error) {
|
||||
console.warn('Failed to send progress update:', error);
|
||||
}
|
||||
}
|
||||
import { sendProgressUpdate } from '../../../lib/progress';
|
||||
|
||||
interface BulkImportRequest {
|
||||
urls: string[];
|
||||
@@ -193,24 +185,42 @@ async function processCombinedMode(
|
||||
console.log(`Combined content character length: ${combinedContentString.length}`);
|
||||
console.log(`Combined content parts count: ${combinedContent.length}`);
|
||||
|
||||
// Handle content truncation if needed
|
||||
let finalContent = contentSizeInMB > 10 ?
|
||||
combinedContentString.substring(0, Math.floor(combinedContentString.length * (10 / contentSizeInMB))) + '\n\n<!-- Content truncated due to size limit -->' :
|
||||
combinedContentString;
|
||||
|
||||
let finalSummary = contentSizeInMB > 10 ? baseSummary + ' (Content truncated due to size limit)' : baseSummary;
|
||||
|
||||
// Check if combined content has images and mark for processing
|
||||
const hasImages = /<img[^>]+src=['"'][^'"']*['"][^>]*>/i.test(finalContent);
|
||||
if (hasImages) {
|
||||
finalSummary += ' (Contains embedded images - will be processed after story creation)';
|
||||
console.log(`Combined story contains embedded images - will need processing after creation`);
|
||||
}
|
||||
|
||||
// Return the combined story data via progress update
|
||||
const combinedStory = {
|
||||
title: baseTitle,
|
||||
author: baseAuthor,
|
||||
content: contentSizeInMB > 10 ?
|
||||
combinedContentString.substring(0, Math.floor(combinedContentString.length * (10 / contentSizeInMB))) + '\n\n<!-- Content truncated due to size limit -->' :
|
||||
combinedContentString,
|
||||
summary: contentSizeInMB > 10 ? baseSummary + ' (Content truncated due to size limit)' : baseSummary,
|
||||
content: finalContent,
|
||||
summary: finalSummary,
|
||||
sourceUrl: baseSourceUrl,
|
||||
tags: Array.from(combinedTags)
|
||||
tags: Array.from(combinedTags),
|
||||
hasImages: hasImages
|
||||
};
|
||||
|
||||
// Send completion notification for combine mode
|
||||
let completionMessage = `Combined scraping completed: ${totalWordCount.toLocaleString()} words from ${importedCount} stories`;
|
||||
if (hasImages) {
|
||||
completionMessage += ` (embedded images will be processed when story is created)`;
|
||||
}
|
||||
|
||||
await sendProgressUpdate(sessionId, {
|
||||
type: 'completed',
|
||||
current: urls.length,
|
||||
total: urls.length,
|
||||
message: `Combined scraping completed: ${totalWordCount.toLocaleString()} words from ${importedCount} stories`,
|
||||
message: completionMessage,
|
||||
totalWordCount: totalWordCount,
|
||||
combinedStory: combinedStory
|
||||
});
|
||||
@@ -347,6 +357,61 @@ async function processIndividualMode(
|
||||
|
||||
const createdStory = await createResponse.json();
|
||||
|
||||
// Process embedded images if content contains images
|
||||
let imageProcessingWarnings: string[] = [];
|
||||
const hasImages = /<img[^>]+src=['"'][^'"']*['"][^>]*>/i.test(scrapedStory.content);
|
||||
|
||||
if (hasImages) {
|
||||
try {
|
||||
console.log(`Processing embedded images for story: ${createdStory.id}`);
|
||||
const imageProcessUrl = `http://backend:8080/api/stories/${createdStory.id}/process-content-images`;
|
||||
const imageProcessResponse = await fetch(imageProcessUrl, {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Authorization': authorization,
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
body: JSON.stringify({ htmlContent: scrapedStory.content }),
|
||||
});
|
||||
|
||||
if (imageProcessResponse.ok) {
|
||||
const imageResult = await imageProcessResponse.json();
|
||||
if (imageResult.hasWarnings && imageResult.warnings) {
|
||||
imageProcessingWarnings = imageResult.warnings;
|
||||
console.log(`Image processing completed with warnings for story ${createdStory.id}:`, imageResult.warnings);
|
||||
} else {
|
||||
console.log(`Image processing completed successfully for story ${createdStory.id}. Downloaded ${imageResult.downloadedImages?.length || 0} images.`);
|
||||
}
|
||||
|
||||
// Update story content with processed images
|
||||
if (imageResult.processedContent && imageResult.processedContent !== scrapedStory.content) {
|
||||
const updateUrl = `http://backend:8080/api/stories/${createdStory.id}`;
|
||||
const updateResponse = await fetch(updateUrl, {
|
||||
method: 'PUT',
|
||||
headers: {
|
||||
'Authorization': authorization,
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
body: JSON.stringify({
|
||||
contentHtml: imageResult.processedContent
|
||||
}),
|
||||
});
|
||||
|
||||
if (!updateResponse.ok) {
|
||||
console.warn(`Failed to update story content after image processing for ${createdStory.id}`);
|
||||
imageProcessingWarnings.push('Failed to update story content with processed images');
|
||||
}
|
||||
}
|
||||
} else {
|
||||
console.warn(`Image processing failed for story ${createdStory.id}:`, imageProcessResponse.status);
|
||||
imageProcessingWarnings.push('Image processing failed');
|
||||
}
|
||||
} catch (error) {
|
||||
console.error(`Error processing images for story ${createdStory.id}:`, error);
|
||||
imageProcessingWarnings.push(`Image processing error: ${error instanceof Error ? error.message : 'Unknown error'}`);
|
||||
}
|
||||
}
|
||||
|
||||
results.push({
|
||||
url: trimmedUrl,
|
||||
status: 'imported',
|
||||
@@ -356,17 +421,24 @@ async function processIndividualMode(
|
||||
});
|
||||
importedCount++;
|
||||
|
||||
console.log(`Successfully imported: ${scrapedStory.title} by ${scrapedStory.author} (ID: ${createdStory.id})`);
|
||||
console.log(`Successfully imported: ${scrapedStory.title} by ${scrapedStory.author} (ID: ${createdStory.id})${hasImages ? ` with ${imageProcessingWarnings.length > 0 ? 'warnings' : 'successful image processing'}` : ''}`);
|
||||
|
||||
// Send progress update for successful import
|
||||
let progressMessage = `Imported "${scrapedStory.title}" by ${scrapedStory.author}`;
|
||||
if (hasImages) {
|
||||
progressMessage += imageProcessingWarnings.length > 0 ? ' (with image warnings)' : ' (with images)';
|
||||
}
|
||||
|
||||
await sendProgressUpdate(sessionId, {
|
||||
type: 'progress',
|
||||
current: i + 1,
|
||||
total: urls.length,
|
||||
message: `Imported "${scrapedStory.title}" by ${scrapedStory.author}`,
|
||||
message: progressMessage,
|
||||
url: trimmedUrl,
|
||||
title: scrapedStory.title,
|
||||
author: scrapedStory.author
|
||||
author: scrapedStory.author,
|
||||
hasImages: hasImages,
|
||||
imageWarnings: imageProcessingWarnings
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
@@ -421,11 +493,11 @@ async function processIndividualMode(
|
||||
|
||||
console.log(`Bulk import completed: ${importedCount} imported, ${skippedCount} skipped, ${errorCount} errors`);
|
||||
|
||||
// Trigger Typesense reindex if any stories were imported
|
||||
// Trigger Solr reindex if any stories were imported
|
||||
if (importedCount > 0) {
|
||||
try {
|
||||
console.log('Triggering Typesense reindex after bulk import...');
|
||||
const reindexUrl = `http://backend:8080/api/stories/reindex-typesense`;
|
||||
console.log('Triggering Solr reindex after bulk import...');
|
||||
const reindexUrl = `http://backend:8080/api/admin/search/solr/reindex`;
|
||||
const reindexResponse = await fetch(reindexUrl, {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
@@ -436,12 +508,12 @@ async function processIndividualMode(
|
||||
|
||||
if (reindexResponse.ok) {
|
||||
const reindexResult = await reindexResponse.json();
|
||||
console.log('Typesense reindex completed:', reindexResult);
|
||||
console.log('Solr reindex completed:', reindexResult);
|
||||
} else {
|
||||
console.warn('Typesense reindex failed:', reindexResponse.status);
|
||||
console.warn('Solr reindex failed:', reindexResponse.status);
|
||||
}
|
||||
} catch (error) {
|
||||
console.warn('Failed to trigger Typesense reindex:', error);
|
||||
console.warn('Failed to trigger Solr reindex:', error);
|
||||
// Don't fail the whole request if reindex fails
|
||||
}
|
||||
}
|
||||
|
||||
@@ -19,6 +19,9 @@ export async function POST(request: NextRequest) {
|
||||
const scraper = new StoryScraper();
|
||||
const story = await scraper.scrapeStory(url);
|
||||
|
||||
// Check if scraped content contains embedded images
|
||||
const hasImages = story.content ? /<img[^>]+src=['"'][^'"']*['"][^>]*>/i.test(story.content) : false;
|
||||
|
||||
// Debug logging
|
||||
console.log('Scraped story data:', {
|
||||
url: url,
|
||||
@@ -28,10 +31,15 @@ export async function POST(request: NextRequest) {
|
||||
contentLength: story.content?.length || 0,
|
||||
contentPreview: story.content?.substring(0, 200) + '...',
|
||||
tags: story.tags,
|
||||
coverImage: story.coverImage
|
||||
coverImage: story.coverImage,
|
||||
hasEmbeddedImages: hasImages
|
||||
});
|
||||
|
||||
return NextResponse.json(story);
|
||||
// Add image processing flag to response for frontend handling
|
||||
return NextResponse.json({
|
||||
...story,
|
||||
hasEmbeddedImages: hasImages
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Story scraping error:', error);
|
||||
|
||||
|
||||
183
frontend/src/app/settings/SettingsContent.tsx
Normal file
183
frontend/src/app/settings/SettingsContent.tsx
Normal file
@@ -0,0 +1,183 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useEffect } from 'react';
|
||||
import { useRouter, useSearchParams } from 'next/navigation';
|
||||
import TabNavigation from '../../components/ui/TabNavigation';
|
||||
import AppearanceSettings from '../../components/settings/AppearanceSettings';
|
||||
import ContentSettings from '../../components/settings/ContentSettings';
|
||||
import SystemSettings from '../../components/settings/SystemSettings';
|
||||
import Button from '../../components/ui/Button';
|
||||
import { useTheme } from '../../lib/theme';
|
||||
|
||||
type FontFamily = 'serif' | 'sans' | 'mono';
|
||||
type FontSize = 'small' | 'medium' | 'large' | 'extra-large';
|
||||
type ReadingWidth = 'narrow' | 'medium' | 'wide';
|
||||
|
||||
interface Settings {
|
||||
theme: 'light' | 'dark';
|
||||
fontFamily: FontFamily;
|
||||
fontSize: FontSize;
|
||||
readingWidth: ReadingWidth;
|
||||
readingSpeed: number; // words per minute
|
||||
}
|
||||
|
||||
const defaultSettings: Settings = {
|
||||
theme: 'light',
|
||||
fontFamily: 'serif',
|
||||
fontSize: 'medium',
|
||||
readingWidth: 'medium',
|
||||
readingSpeed: 200,
|
||||
};
|
||||
|
||||
const tabs = [
|
||||
{ id: 'appearance', label: 'Appearance', icon: '🎨' },
|
||||
{ id: 'content', label: 'Content', icon: '🏷️' },
|
||||
{ id: 'system', label: 'System', icon: '🔧' },
|
||||
];
|
||||
|
||||
export default function SettingsContent() {
|
||||
const router = useRouter();
|
||||
const searchParams = useSearchParams();
|
||||
const { theme, setTheme } = useTheme();
|
||||
const [settings, setSettings] = useState<Settings>(defaultSettings);
|
||||
const [saved, setSaved] = useState(false);
|
||||
const [activeTab, setActiveTab] = useState('appearance');
|
||||
|
||||
// Initialize tab from URL parameter
|
||||
useEffect(() => {
|
||||
const tabFromUrl = searchParams.get('tab');
|
||||
if (tabFromUrl && tabs.some(tab => tab.id === tabFromUrl)) {
|
||||
setActiveTab(tabFromUrl);
|
||||
}
|
||||
}, [searchParams]);
|
||||
|
||||
// Load settings from localStorage on mount
|
||||
useEffect(() => {
|
||||
const savedSettings = localStorage.getItem('storycove-settings');
|
||||
if (savedSettings) {
|
||||
try {
|
||||
const parsed = JSON.parse(savedSettings);
|
||||
setSettings({ ...defaultSettings, ...parsed, theme });
|
||||
} catch (error) {
|
||||
console.error('Failed to parse saved settings:', error);
|
||||
setSettings({ ...defaultSettings, theme });
|
||||
}
|
||||
} else {
|
||||
setSettings({ ...defaultSettings, theme });
|
||||
}
|
||||
}, [theme]);
|
||||
|
||||
// Update URL when tab changes
|
||||
const handleTabChange = (tabId: string) => {
|
||||
setActiveTab(tabId);
|
||||
const newUrl = `/settings?tab=${tabId}`;
|
||||
router.replace(newUrl, { scroll: false });
|
||||
};
|
||||
|
||||
// Save settings to localStorage
|
||||
const saveSettings = () => {
|
||||
localStorage.setItem('storycove-settings', JSON.stringify(settings));
|
||||
|
||||
// Apply theme change
|
||||
setTheme(settings.theme);
|
||||
|
||||
// Apply font settings to CSS custom properties
|
||||
const root = document.documentElement;
|
||||
|
||||
const fontFamilyMap = {
|
||||
serif: 'Georgia, Times, serif',
|
||||
sans: 'Inter, system-ui, sans-serif',
|
||||
mono: 'Monaco, Consolas, monospace',
|
||||
};
|
||||
|
||||
const fontSizeMap = {
|
||||
small: '14px',
|
||||
medium: '16px',
|
||||
large: '18px',
|
||||
'extra-large': '20px',
|
||||
};
|
||||
|
||||
const readingWidthMap = {
|
||||
narrow: '600px',
|
||||
medium: '800px',
|
||||
wide: '1000px',
|
||||
};
|
||||
|
||||
root.style.setProperty('--reading-font-family', fontFamilyMap[settings.fontFamily]);
|
||||
root.style.setProperty('--reading-font-size', fontSizeMap[settings.fontSize]);
|
||||
root.style.setProperty('--reading-max-width', readingWidthMap[settings.readingWidth]);
|
||||
|
||||
setSaved(true);
|
||||
setTimeout(() => setSaved(false), 2000);
|
||||
};
|
||||
|
||||
const updateSetting = <K extends keyof Settings>(key: K, value: Settings[K]) => {
|
||||
setSettings(prev => ({ ...prev, [key]: value }));
|
||||
};
|
||||
|
||||
const resetToDefaults = () => {
|
||||
setSettings({ ...defaultSettings, theme });
|
||||
};
|
||||
|
||||
const renderTabContent = () => {
|
||||
switch (activeTab) {
|
||||
case 'appearance':
|
||||
return (
|
||||
<AppearanceSettings
|
||||
settings={settings}
|
||||
onSettingChange={updateSetting}
|
||||
/>
|
||||
);
|
||||
case 'content':
|
||||
return <ContentSettings />;
|
||||
case 'system':
|
||||
return <SystemSettings />;
|
||||
default:
|
||||
return <AppearanceSettings settings={settings} onSettingChange={updateSetting} />;
|
||||
}
|
||||
};
|
||||
|
||||
return (
|
||||
<div className="max-w-4xl mx-auto space-y-6">
|
||||
{/* Header */}
|
||||
<div>
|
||||
<h1 className="text-3xl font-bold theme-header">Settings</h1>
|
||||
<p className="theme-text mt-2">
|
||||
Customize your StoryCove experience and manage system settings
|
||||
</p>
|
||||
</div>
|
||||
|
||||
{/* Tab Navigation */}
|
||||
<TabNavigation
|
||||
tabs={tabs}
|
||||
activeTab={activeTab}
|
||||
onTabChange={handleTabChange}
|
||||
className="mb-6"
|
||||
/>
|
||||
|
||||
{/* Tab Content */}
|
||||
<div className="min-h-[400px]">
|
||||
{renderTabContent()}
|
||||
</div>
|
||||
|
||||
{/* Save Actions - Only show for Appearance tab */}
|
||||
{activeTab === 'appearance' && (
|
||||
<div className="flex justify-end gap-4 pt-6 border-t theme-border">
|
||||
<Button
|
||||
variant="ghost"
|
||||
onClick={resetToDefaults}
|
||||
>
|
||||
Reset to Defaults
|
||||
</Button>
|
||||
|
||||
<Button
|
||||
onClick={saveSettings}
|
||||
className={saved ? 'bg-green-600 hover:bg-green-700' : ''}
|
||||
>
|
||||
{saved ? '✓ Saved!' : 'Save Settings'}
|
||||
</Button>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
);
|
||||
}
|
||||
@@ -1,818 +1,20 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useEffect } from 'react';
|
||||
import { Suspense } from 'react';
|
||||
import AppLayout from '../../components/layout/AppLayout';
|
||||
import { useTheme } from '../../lib/theme';
|
||||
import Button from '../../components/ui/Button';
|
||||
import { storyApi, authorApi, databaseApi } from '../../lib/api';
|
||||
import { useLibraryLayout, LibraryLayoutType } from '../../hooks/useLibraryLayout';
|
||||
import LibrarySettings from '../../components/library/LibrarySettings';
|
||||
|
||||
type FontFamily = 'serif' | 'sans' | 'mono';
|
||||
type FontSize = 'small' | 'medium' | 'large' | 'extra-large';
|
||||
type ReadingWidth = 'narrow' | 'medium' | 'wide';
|
||||
|
||||
interface Settings {
|
||||
theme: 'light' | 'dark';
|
||||
fontFamily: FontFamily;
|
||||
fontSize: FontSize;
|
||||
readingWidth: ReadingWidth;
|
||||
readingSpeed: number; // words per minute
|
||||
}
|
||||
|
||||
const defaultSettings: Settings = {
|
||||
theme: 'light',
|
||||
fontFamily: 'serif',
|
||||
fontSize: 'medium',
|
||||
readingWidth: 'medium',
|
||||
readingSpeed: 200,
|
||||
};
|
||||
import LoadingSpinner from '../../components/ui/LoadingSpinner';
|
||||
import SettingsContent from './SettingsContent';
|
||||
|
||||
export default function SettingsPage() {
|
||||
const { theme, setTheme } = useTheme();
|
||||
const { layout, setLayout } = useLibraryLayout();
|
||||
const [settings, setSettings] = useState<Settings>(defaultSettings);
|
||||
const [saved, setSaved] = useState(false);
|
||||
const [typesenseStatus, setTypesenseStatus] = useState<{
|
||||
stories: { loading: boolean; message: string; success?: boolean };
|
||||
authors: { loading: boolean; message: string; success?: boolean };
|
||||
}>({
|
||||
stories: { loading: false, message: '' },
|
||||
authors: { loading: false, message: '' }
|
||||
});
|
||||
const [authorsSchema, setAuthorsSchema] = useState<any>(null);
|
||||
const [showSchema, setShowSchema] = useState(false);
|
||||
const [databaseStatus, setDatabaseStatus] = useState<{
|
||||
completeBackup: { loading: boolean; message: string; success?: boolean };
|
||||
completeRestore: { loading: boolean; message: string; success?: boolean };
|
||||
completeClear: { loading: boolean; message: string; success?: boolean };
|
||||
}>({
|
||||
completeBackup: { loading: false, message: '' },
|
||||
completeRestore: { loading: false, message: '' },
|
||||
completeClear: { loading: false, message: '' }
|
||||
});
|
||||
|
||||
// Load settings from localStorage on mount
|
||||
useEffect(() => {
|
||||
const savedSettings = localStorage.getItem('storycove-settings');
|
||||
if (savedSettings) {
|
||||
try {
|
||||
const parsed = JSON.parse(savedSettings);
|
||||
setSettings({ ...defaultSettings, ...parsed, theme });
|
||||
} catch (error) {
|
||||
console.error('Failed to parse saved settings:', error);
|
||||
setSettings({ ...defaultSettings, theme });
|
||||
}
|
||||
} else {
|
||||
setSettings({ ...defaultSettings, theme });
|
||||
}
|
||||
}, [theme]);
|
||||
|
||||
// Save settings to localStorage
|
||||
const saveSettings = () => {
|
||||
localStorage.setItem('storycove-settings', JSON.stringify(settings));
|
||||
|
||||
// Apply theme change
|
||||
setTheme(settings.theme);
|
||||
|
||||
// Apply font settings to CSS custom properties
|
||||
const root = document.documentElement;
|
||||
|
||||
const fontFamilyMap = {
|
||||
serif: 'Georgia, Times, serif',
|
||||
sans: 'Inter, system-ui, sans-serif',
|
||||
mono: 'Monaco, Consolas, monospace',
|
||||
};
|
||||
|
||||
const fontSizeMap = {
|
||||
small: '14px',
|
||||
medium: '16px',
|
||||
large: '18px',
|
||||
'extra-large': '20px',
|
||||
};
|
||||
|
||||
const readingWidthMap = {
|
||||
narrow: '600px',
|
||||
medium: '800px',
|
||||
wide: '1000px',
|
||||
};
|
||||
|
||||
root.style.setProperty('--reading-font-family', fontFamilyMap[settings.fontFamily]);
|
||||
root.style.setProperty('--reading-font-size', fontSizeMap[settings.fontSize]);
|
||||
root.style.setProperty('--reading-max-width', readingWidthMap[settings.readingWidth]);
|
||||
|
||||
setSaved(true);
|
||||
setTimeout(() => setSaved(false), 2000);
|
||||
};
|
||||
|
||||
const updateSetting = <K extends keyof Settings>(key: K, value: Settings[K]) => {
|
||||
setSettings(prev => ({ ...prev, [key]: value }));
|
||||
};
|
||||
|
||||
const handleTypesenseOperation = async (
|
||||
type: 'stories' | 'authors',
|
||||
operation: 'reindex' | 'recreate',
|
||||
apiCall: () => Promise<{ success: boolean; message: string; count?: number; error?: string }>
|
||||
) => {
|
||||
setTypesenseStatus(prev => ({
|
||||
...prev,
|
||||
[type]: { loading: true, message: 'Processing...', success: undefined }
|
||||
}));
|
||||
|
||||
try {
|
||||
const result = await apiCall();
|
||||
setTypesenseStatus(prev => ({
|
||||
...prev,
|
||||
[type]: {
|
||||
loading: false,
|
||||
message: result.success ? result.message : result.error || 'Operation failed',
|
||||
success: result.success
|
||||
}
|
||||
}));
|
||||
|
||||
// Clear message after 5 seconds
|
||||
setTimeout(() => {
|
||||
setTypesenseStatus(prev => ({
|
||||
...prev,
|
||||
[type]: { loading: false, message: '', success: undefined }
|
||||
}));
|
||||
}, 5000);
|
||||
} catch (error) {
|
||||
setTypesenseStatus(prev => ({
|
||||
...prev,
|
||||
[type]: {
|
||||
loading: false,
|
||||
message: 'Network error occurred',
|
||||
success: false
|
||||
}
|
||||
}));
|
||||
|
||||
setTimeout(() => {
|
||||
setTypesenseStatus(prev => ({
|
||||
...prev,
|
||||
[type]: { loading: false, message: '', success: undefined }
|
||||
}));
|
||||
}, 5000);
|
||||
}
|
||||
};
|
||||
|
||||
const fetchAuthorsSchema = async () => {
|
||||
try {
|
||||
const result = await authorApi.getTypesenseSchema();
|
||||
if (result.success) {
|
||||
setAuthorsSchema(result.schema);
|
||||
} else {
|
||||
setAuthorsSchema({ error: result.error });
|
||||
}
|
||||
} catch (error) {
|
||||
setAuthorsSchema({ error: 'Failed to fetch schema' });
|
||||
}
|
||||
};
|
||||
|
||||
|
||||
const handleCompleteBackup = async () => {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeBackup: { loading: true, message: 'Creating complete backup...', success: undefined }
|
||||
}));
|
||||
|
||||
try {
|
||||
const backupBlob = await databaseApi.backupComplete();
|
||||
|
||||
// Create download link
|
||||
const url = window.URL.createObjectURL(backupBlob);
|
||||
const link = document.createElement('a');
|
||||
link.href = url;
|
||||
|
||||
const timestamp = new Date().toISOString().replace(/[:.]/g, '-').slice(0, 19);
|
||||
link.download = `storycove_complete_backup_${timestamp}.zip`;
|
||||
|
||||
document.body.appendChild(link);
|
||||
link.click();
|
||||
document.body.removeChild(link);
|
||||
window.URL.revokeObjectURL(url);
|
||||
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeBackup: { loading: false, message: 'Complete backup downloaded successfully', success: true }
|
||||
}));
|
||||
} catch (error: any) {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeBackup: { loading: false, message: error.message || 'Complete backup failed', success: false }
|
||||
}));
|
||||
}
|
||||
|
||||
// Clear message after 5 seconds
|
||||
setTimeout(() => {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeBackup: { loading: false, message: '', success: undefined }
|
||||
}));
|
||||
}, 5000);
|
||||
};
|
||||
|
||||
const handleCompleteRestore = async (event: React.ChangeEvent<HTMLInputElement>) => {
|
||||
const file = event.target.files?.[0];
|
||||
if (!file) return;
|
||||
|
||||
// Reset the input so the same file can be selected again
|
||||
event.target.value = '';
|
||||
|
||||
if (!file.name.endsWith('.zip')) {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeRestore: { loading: false, message: 'Please select a .zip file', success: false }
|
||||
}));
|
||||
return;
|
||||
}
|
||||
|
||||
const confirmed = window.confirm(
|
||||
'Are you sure you want to restore the complete backup? This will PERMANENTLY DELETE all current data AND files (cover images, avatars) and replace them with the backup data. This action cannot be undone!'
|
||||
);
|
||||
|
||||
if (!confirmed) return;
|
||||
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeRestore: { loading: true, message: 'Restoring complete backup...', success: undefined }
|
||||
}));
|
||||
|
||||
try {
|
||||
const result = await databaseApi.restoreComplete(file);
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeRestore: {
|
||||
loading: false,
|
||||
message: result.success ? result.message : result.message,
|
||||
success: result.success
|
||||
}
|
||||
}));
|
||||
} catch (error: any) {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeRestore: { loading: false, message: error.message || 'Complete restore failed', success: false }
|
||||
}));
|
||||
}
|
||||
|
||||
// Clear message after 10 seconds for restore (longer because it's important)
|
||||
setTimeout(() => {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeRestore: { loading: false, message: '', success: undefined }
|
||||
}));
|
||||
}, 10000);
|
||||
};
|
||||
|
||||
const handleCompleteClear = async () => {
|
||||
const confirmed = window.confirm(
|
||||
'Are you ABSOLUTELY SURE you want to clear the entire database AND all files? This will PERMANENTLY DELETE ALL stories, authors, series, tags, collections, AND all uploaded images (covers, avatars). This action cannot be undone!'
|
||||
);
|
||||
|
||||
if (!confirmed) return;
|
||||
|
||||
const doubleConfirmed = window.confirm(
|
||||
'This is your final warning! Clicking OK will DELETE EVERYTHING in your StoryCove database AND all uploaded files. Are you completely certain you want to proceed?'
|
||||
);
|
||||
|
||||
if (!doubleConfirmed) return;
|
||||
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeClear: { loading: true, message: 'Clearing database and files...', success: undefined }
|
||||
}));
|
||||
|
||||
try {
|
||||
const result = await databaseApi.clearComplete();
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeClear: {
|
||||
loading: false,
|
||||
message: result.success
|
||||
? `Database and files cleared successfully. Deleted ${result.deletedRecords} records.`
|
||||
: result.message,
|
||||
success: result.success
|
||||
}
|
||||
}));
|
||||
} catch (error: any) {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeClear: { loading: false, message: error.message || 'Clear operation failed', success: false }
|
||||
}));
|
||||
}
|
||||
|
||||
// Clear message after 10 seconds for clear (longer because it's important)
|
||||
setTimeout(() => {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeClear: { loading: false, message: '', success: undefined }
|
||||
}));
|
||||
}, 10000);
|
||||
};
|
||||
|
||||
return (
|
||||
<AppLayout>
|
||||
<div className="max-w-2xl mx-auto space-y-8">
|
||||
<div>
|
||||
<h1 className="text-3xl font-bold theme-header">Settings</h1>
|
||||
<p className="theme-text mt-2">
|
||||
Customize your StoryCove reading experience
|
||||
</p>
|
||||
<Suspense fallback={
|
||||
<div className="flex items-center justify-center py-20">
|
||||
<LoadingSpinner size="lg" />
|
||||
</div>
|
||||
|
||||
<div className="space-y-6">
|
||||
{/* Theme Settings */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<h2 className="text-xl font-semibold theme-header mb-4">Appearance</h2>
|
||||
|
||||
<div className="space-y-4">
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Theme
|
||||
</label>
|
||||
<div className="flex gap-4">
|
||||
<button
|
||||
onClick={() => updateSetting('theme', 'light')}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors ${
|
||||
settings.theme === 'light'
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
☀️ Light
|
||||
</button>
|
||||
<button
|
||||
onClick={() => updateSetting('theme', 'dark')}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors ${
|
||||
settings.theme === 'dark'
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
🌙 Dark
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Library Layout */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Library Layout
|
||||
</label>
|
||||
<div className="space-y-3">
|
||||
<div className="flex gap-4 flex-wrap">
|
||||
<button
|
||||
onClick={() => setLayout('sidebar')}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors ${
|
||||
layout === 'sidebar'
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
📋 Sidebar Layout
|
||||
</button>
|
||||
<button
|
||||
onClick={() => setLayout('toolbar')}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors ${
|
||||
layout === 'toolbar'
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
🛠️ Toolbar Layout
|
||||
</button>
|
||||
<button
|
||||
onClick={() => setLayout('minimal')}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors ${
|
||||
layout === 'minimal'
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
✨ Minimal Layout
|
||||
</button>
|
||||
</div>
|
||||
<div className="text-sm theme-text">
|
||||
<div className="grid grid-cols-1 md:grid-cols-3 gap-3 mt-3">
|
||||
<div className="text-xs">
|
||||
<strong>Sidebar:</strong> Filters and controls in a side panel, maximum space for stories
|
||||
</div>
|
||||
<div className="text-xs">
|
||||
<strong>Toolbar:</strong> Everything visible at once with integrated search and tag filters
|
||||
</div>
|
||||
<div className="text-xs">
|
||||
<strong>Minimal:</strong> Clean, content-focused design with floating controls
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Reading Settings */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<h2 className="text-xl font-semibold theme-header mb-4">Reading Experience</h2>
|
||||
|
||||
<div className="space-y-6">
|
||||
{/* Font Family */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Font Family
|
||||
</label>
|
||||
<div className="flex gap-4 flex-wrap">
|
||||
<button
|
||||
onClick={() => updateSetting('fontFamily', 'serif')}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors font-serif ${
|
||||
settings.fontFamily === 'serif'
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
Serif
|
||||
</button>
|
||||
<button
|
||||
onClick={() => updateSetting('fontFamily', 'sans')}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors font-sans ${
|
||||
settings.fontFamily === 'sans'
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
Sans Serif
|
||||
</button>
|
||||
<button
|
||||
onClick={() => updateSetting('fontFamily', 'mono')}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors font-mono ${
|
||||
settings.fontFamily === 'mono'
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
Monospace
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Font Size */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Font Size
|
||||
</label>
|
||||
<div className="flex gap-4 flex-wrap">
|
||||
{(['small', 'medium', 'large', 'extra-large'] as FontSize[]).map((size) => (
|
||||
<button
|
||||
key={size}
|
||||
onClick={() => updateSetting('fontSize', size)}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors capitalize ${
|
||||
settings.fontSize === size
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
{size.replace('-', ' ')}
|
||||
</button>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Reading Width */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Reading Width
|
||||
</label>
|
||||
<div className="flex gap-4">
|
||||
{(['narrow', 'medium', 'wide'] as ReadingWidth[]).map((width) => (
|
||||
<button
|
||||
key={width}
|
||||
onClick={() => updateSetting('readingWidth', width)}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors capitalize ${
|
||||
settings.readingWidth === width
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
{width}
|
||||
</button>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Reading Speed */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Reading Speed (words per minute)
|
||||
</label>
|
||||
<div className="flex items-center gap-4">
|
||||
<input
|
||||
type="range"
|
||||
min="100"
|
||||
max="400"
|
||||
step="25"
|
||||
value={settings.readingSpeed}
|
||||
onChange={(e) => updateSetting('readingSpeed', parseInt(e.target.value))}
|
||||
className="flex-1 h-2 bg-gray-200 rounded-lg appearance-none cursor-pointer dark:bg-gray-700"
|
||||
/>
|
||||
<div className="min-w-[80px] text-center">
|
||||
<span className="text-lg font-medium theme-header">{settings.readingSpeed}</span>
|
||||
<div className="text-xs theme-text">WPM</div>
|
||||
</div>
|
||||
</div>
|
||||
<div className="flex justify-between text-xs theme-text mt-1">
|
||||
<span>Slow (100)</span>
|
||||
<span>Average (200)</span>
|
||||
<span>Fast (400)</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Preview */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<h2 className="text-xl font-semibold theme-header mb-4">Preview</h2>
|
||||
|
||||
<div
|
||||
className="p-4 theme-card border theme-border rounded-lg"
|
||||
style={{
|
||||
fontFamily: settings.fontFamily === 'serif' ? 'Georgia, Times, serif'
|
||||
: settings.fontFamily === 'sans' ? 'Inter, system-ui, sans-serif'
|
||||
: 'Monaco, Consolas, monospace',
|
||||
fontSize: settings.fontSize === 'small' ? '14px'
|
||||
: settings.fontSize === 'medium' ? '16px'
|
||||
: settings.fontSize === 'large' ? '18px'
|
||||
: '20px',
|
||||
maxWidth: settings.readingWidth === 'narrow' ? '600px'
|
||||
: settings.readingWidth === 'medium' ? '800px'
|
||||
: '1000px',
|
||||
}}
|
||||
>
|
||||
<h3 className="text-xl font-bold theme-header mb-2">Sample Story Title</h3>
|
||||
<p className="theme-text mb-4">by Sample Author</p>
|
||||
<p className="theme-text leading-relaxed">
|
||||
This is how your story text will look with the current settings.
|
||||
The quick brown fox jumps over the lazy dog. Lorem ipsum dolor sit amet,
|
||||
consectetur adipiscing elit. Sed do eiusmod tempor incididunt ut labore
|
||||
et dolore magna aliqua.
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Typesense Search Management */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<h2 className="text-xl font-semibold theme-header mb-4">Search Index Management</h2>
|
||||
<p className="theme-text mb-6">
|
||||
Manage the Typesense search indexes for stories and authors. Use these tools if search functionality isn't working properly.
|
||||
</p>
|
||||
|
||||
<div className="space-y-6">
|
||||
{/* Stories Section */}
|
||||
<div className="border theme-border rounded-lg p-4">
|
||||
<h3 className="text-lg font-semibold theme-header mb-3">Stories Index</h3>
|
||||
<div className="flex flex-col sm:flex-row gap-3 mb-3">
|
||||
<Button
|
||||
onClick={() => handleTypesenseOperation('stories', 'reindex', storyApi.reindexTypesense)}
|
||||
disabled={typesenseStatus.stories.loading}
|
||||
loading={typesenseStatus.stories.loading}
|
||||
variant="ghost"
|
||||
className="flex-1"
|
||||
>
|
||||
{typesenseStatus.stories.loading ? 'Reindexing...' : 'Reindex Stories'}
|
||||
</Button>
|
||||
<Button
|
||||
onClick={() => handleTypesenseOperation('stories', 'recreate', storyApi.recreateTypesenseCollection)}
|
||||
disabled={typesenseStatus.stories.loading}
|
||||
loading={typesenseStatus.stories.loading}
|
||||
variant="secondary"
|
||||
className="flex-1"
|
||||
>
|
||||
{typesenseStatus.stories.loading ? 'Recreating...' : 'Recreate Collection'}
|
||||
</Button>
|
||||
</div>
|
||||
{typesenseStatus.stories.message && (
|
||||
<div className={`text-sm p-2 rounded ${
|
||||
typesenseStatus.stories.success
|
||||
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
||||
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
||||
}`}>
|
||||
{typesenseStatus.stories.message}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Authors Section */}
|
||||
<div className="border theme-border rounded-lg p-4">
|
||||
<h3 className="text-lg font-semibold theme-header mb-3">Authors Index</h3>
|
||||
<div className="flex flex-col sm:flex-row gap-3 mb-3">
|
||||
<Button
|
||||
onClick={() => handleTypesenseOperation('authors', 'reindex', authorApi.reindexTypesense)}
|
||||
disabled={typesenseStatus.authors.loading}
|
||||
loading={typesenseStatus.authors.loading}
|
||||
variant="ghost"
|
||||
className="flex-1"
|
||||
>
|
||||
{typesenseStatus.authors.loading ? 'Reindexing...' : 'Reindex Authors'}
|
||||
</Button>
|
||||
<Button
|
||||
onClick={() => handleTypesenseOperation('authors', 'recreate', authorApi.recreateTypesenseCollection)}
|
||||
disabled={typesenseStatus.authors.loading}
|
||||
loading={typesenseStatus.authors.loading}
|
||||
variant="secondary"
|
||||
className="flex-1"
|
||||
>
|
||||
{typesenseStatus.authors.loading ? 'Recreating...' : 'Recreate Collection'}
|
||||
</Button>
|
||||
</div>
|
||||
{typesenseStatus.authors.message && (
|
||||
<div className={`text-sm p-2 rounded ${
|
||||
typesenseStatus.authors.success
|
||||
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
||||
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
||||
}`}>
|
||||
{typesenseStatus.authors.message}
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Debug Schema Section */}
|
||||
<div className="border-t theme-border pt-3">
|
||||
<div className="flex items-center gap-2 mb-2">
|
||||
<Button
|
||||
onClick={fetchAuthorsSchema}
|
||||
variant="ghost"
|
||||
className="text-xs"
|
||||
>
|
||||
Inspect Schema
|
||||
</Button>
|
||||
<Button
|
||||
onClick={() => setShowSchema(!showSchema)}
|
||||
variant="ghost"
|
||||
className="text-xs"
|
||||
disabled={!authorsSchema}
|
||||
>
|
||||
{showSchema ? 'Hide' : 'Show'} Schema
|
||||
</Button>
|
||||
</div>
|
||||
|
||||
{showSchema && authorsSchema && (
|
||||
<div className="text-xs theme-text bg-gray-50 dark:bg-gray-800 p-3 rounded border overflow-auto max-h-48">
|
||||
<pre>{JSON.stringify(authorsSchema, null, 2)}</pre>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div className="text-sm theme-text bg-blue-50 dark:bg-blue-900/20 p-3 rounded-lg">
|
||||
<p className="font-medium mb-1">When to use these tools:</p>
|
||||
<ul className="text-xs space-y-1 ml-4">
|
||||
<li>• <strong>Reindex:</strong> Refresh search data while keeping the existing schema</li>
|
||||
<li>• <strong>Recreate Collection:</strong> Delete and rebuild the entire search index (fixes schema issues)</li>
|
||||
</ul>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Database Management */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<h2 className="text-xl font-semibold theme-header mb-4">Database Management</h2>
|
||||
<p className="theme-text mb-6">
|
||||
Backup, restore, or clear your StoryCove database and files. These comprehensive operations include both your data and uploaded images.
|
||||
</p>
|
||||
|
||||
<div className="space-y-6">
|
||||
{/* Complete Backup Section */}
|
||||
<div className="border theme-border rounded-lg p-4 border-blue-200 dark:border-blue-800">
|
||||
<h3 className="text-lg font-semibold theme-header mb-3">📦 Create Backup</h3>
|
||||
<p className="text-sm theme-text mb-3">
|
||||
Download a complete backup as a ZIP file. This includes your database AND all uploaded files (cover images, avatars). This is a comprehensive backup of your entire StoryCove installation.
|
||||
</p>
|
||||
<Button
|
||||
onClick={handleCompleteBackup}
|
||||
disabled={databaseStatus.completeBackup.loading}
|
||||
loading={databaseStatus.completeBackup.loading}
|
||||
variant="primary"
|
||||
className="w-full sm:w-auto"
|
||||
>
|
||||
{databaseStatus.completeBackup.loading ? 'Creating Backup...' : 'Download Backup'}
|
||||
</Button>
|
||||
{databaseStatus.completeBackup.message && (
|
||||
<div className={`text-sm p-2 rounded mt-3 ${
|
||||
databaseStatus.completeBackup.success
|
||||
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
||||
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
||||
}`}>
|
||||
{databaseStatus.completeBackup.message}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Restore Section */}
|
||||
<div className="border theme-border rounded-lg p-4 border-orange-200 dark:border-orange-800">
|
||||
<h3 className="text-lg font-semibold theme-header mb-3">📥 Restore Backup</h3>
|
||||
<p className="text-sm theme-text mb-3">
|
||||
<strong className="text-orange-600 dark:text-orange-400">⚠️ Warning:</strong> This will completely replace your current database AND all files with the backup. All existing data and uploaded files will be permanently deleted.
|
||||
</p>
|
||||
<div className="flex items-center gap-3">
|
||||
<input
|
||||
type="file"
|
||||
accept=".zip"
|
||||
onChange={handleCompleteRestore}
|
||||
disabled={databaseStatus.completeRestore.loading}
|
||||
className="flex-1 text-sm theme-text file:mr-4 file:py-2 file:px-4 file:rounded-lg file:border-0 file:text-sm file:font-medium file:theme-accent-bg file:text-white hover:file:bg-opacity-90 file:cursor-pointer"
|
||||
/>
|
||||
</div>
|
||||
{databaseStatus.completeRestore.message && (
|
||||
<div className={`text-sm p-2 rounded mt-3 ${
|
||||
databaseStatus.completeRestore.success
|
||||
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
||||
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
||||
}`}>
|
||||
{databaseStatus.completeRestore.message}
|
||||
</div>
|
||||
)}
|
||||
{databaseStatus.completeRestore.loading && (
|
||||
<div className="text-sm theme-text mt-3 flex items-center gap-2">
|
||||
<div className="animate-spin w-4 h-4 border-2 border-blue-500 border-t-transparent rounded-full"></div>
|
||||
Restoring backup...
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Clear Everything Section */}
|
||||
<div className="border theme-border rounded-lg p-4 border-red-200 dark:border-red-800 bg-red-50 dark:bg-red-900/10">
|
||||
<h3 className="text-lg font-semibold theme-header mb-3">🗑️ Clear Everything</h3>
|
||||
<p className="text-sm theme-text mb-3">
|
||||
<strong className="text-red-600 dark:text-red-400">⚠️ Danger Zone:</strong> This will permanently delete ALL data from your database AND all uploaded files (cover images, avatars). Everything will be completely removed. This action cannot be undone!
|
||||
</p>
|
||||
<Button
|
||||
onClick={handleCompleteClear}
|
||||
disabled={databaseStatus.completeClear.loading}
|
||||
loading={databaseStatus.completeClear.loading}
|
||||
variant="secondary"
|
||||
className="w-full sm:w-auto bg-red-700 hover:bg-red-800 text-white border-red-700"
|
||||
>
|
||||
{databaseStatus.completeClear.loading ? 'Clearing Everything...' : 'Clear Everything'}
|
||||
</Button>
|
||||
{databaseStatus.completeClear.message && (
|
||||
<div className={`text-sm p-2 rounded mt-3 ${
|
||||
databaseStatus.completeClear.success
|
||||
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
||||
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
||||
}`}>
|
||||
{databaseStatus.completeClear.message}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
<div className="text-sm theme-text bg-blue-50 dark:bg-blue-900/20 p-3 rounded-lg">
|
||||
<p className="font-medium mb-1">💡 Best Practices:</p>
|
||||
<ul className="text-xs space-y-1 ml-4">
|
||||
<li>• <strong>Always backup</strong> before performing restore or clear operations</li>
|
||||
<li>• <strong>Store backups safely</strong> in multiple locations for important data</li>
|
||||
<li>• <strong>Test restores</strong> in a development environment when possible</li>
|
||||
<li>• <strong>Backup files (.zip)</strong> contain both database and all uploaded files</li>
|
||||
<li>• <strong>Verify backup files</strong> are complete before relying on them</li>
|
||||
</ul>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Library Settings */}
|
||||
<LibrarySettings />
|
||||
|
||||
{/* Tag Management */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<h2 className="text-xl font-semibold theme-header mb-4">Tag Management</h2>
|
||||
<p className="theme-text mb-6">
|
||||
Manage your story tags with colors, descriptions, and aliases. Use the Tag Maintenance page to organize and customize your tags.
|
||||
</p>
|
||||
<Button
|
||||
href="/settings/tag-maintenance"
|
||||
variant="secondary"
|
||||
className="w-full sm:w-auto"
|
||||
>
|
||||
🏷️ Open Tag Maintenance
|
||||
</Button>
|
||||
</div>
|
||||
|
||||
{/* Actions */}
|
||||
<div className="flex justify-end gap-4">
|
||||
<Button
|
||||
variant="ghost"
|
||||
onClick={() => {
|
||||
setSettings({ ...defaultSettings, theme });
|
||||
}}
|
||||
>
|
||||
Reset to Defaults
|
||||
</Button>
|
||||
|
||||
<Button
|
||||
onClick={saveSettings}
|
||||
className={saved ? 'bg-green-600 hover:bg-green-700' : ''}
|
||||
>
|
||||
{saved ? '✓ Saved!' : 'Save Settings'}
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
}>
|
||||
<SettingsContent />
|
||||
</Suspense>
|
||||
</AppLayout>
|
||||
);
|
||||
}
|
||||
@@ -7,7 +7,7 @@ import { Input, Textarea } from '../../../../components/ui/Input';
|
||||
import Button from '../../../../components/ui/Button';
|
||||
import TagInput from '../../../../components/stories/TagInput';
|
||||
import TagSuggestions from '../../../../components/tags/TagSuggestions';
|
||||
import RichTextEditor from '../../../../components/stories/RichTextEditor';
|
||||
import SlateEditor from '../../../../components/stories/SlateEditor';
|
||||
import ImageUpload from '../../../../components/ui/ImageUpload';
|
||||
import AuthorSelector from '../../../../components/stories/AuthorSelector';
|
||||
import SeriesSelector from '../../../../components/stories/SeriesSelector';
|
||||
@@ -337,7 +337,7 @@ export default function EditStoryPage() {
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Story Content *
|
||||
</label>
|
||||
<RichTextEditor
|
||||
<SlateEditor
|
||||
value={formData.contentHtml}
|
||||
onChange={handleContentChange}
|
||||
placeholder="Edit your story content here..."
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useEffect, useRef, useCallback } from 'react';
|
||||
import { useState, useEffect, useRef, useCallback, useMemo, memo } from 'react';
|
||||
import { useParams, useRouter } from 'next/navigation';
|
||||
import Link from 'next/link';
|
||||
import { storyApi, seriesApi } from '../../../lib/api';
|
||||
@@ -11,6 +11,65 @@ import StoryRating from '../../../components/stories/StoryRating';
|
||||
import TagDisplay from '../../../components/tags/TagDisplay';
|
||||
import TableOfContents from '../../../components/stories/TableOfContents';
|
||||
import { sanitizeHtml, preloadSanitizationConfig } from '../../../lib/sanitization';
|
||||
import { debug } from '../../../lib/debug';
|
||||
|
||||
// Memoized content component that only re-renders when content changes
|
||||
const StoryContent = memo(({
|
||||
content,
|
||||
contentRef
|
||||
}: {
|
||||
content: string;
|
||||
contentRef: React.RefObject<HTMLDivElement>;
|
||||
}) => {
|
||||
const renderTime = Date.now();
|
||||
debug.log('🔄 StoryContent component rendering at', renderTime, 'with content length:', content.length, 'hash:', content.slice(0, 50) + '...');
|
||||
|
||||
// Add observer to track image loading events
|
||||
useEffect(() => {
|
||||
if (!contentRef.current) return;
|
||||
|
||||
const images = contentRef.current.querySelectorAll('img');
|
||||
debug.log('📸 Found', images.length, 'images in content');
|
||||
|
||||
const handleImageLoad = (e: Event) => {
|
||||
const img = e.target as HTMLImageElement;
|
||||
debug.log('🖼️ Image loaded:', img.src);
|
||||
};
|
||||
|
||||
const handleImageError = (e: Event) => {
|
||||
const img = e.target as HTMLImageElement;
|
||||
debug.log('❌ Image error:', img.src);
|
||||
};
|
||||
|
||||
images.forEach(img => {
|
||||
img.addEventListener('load', handleImageLoad);
|
||||
img.addEventListener('error', handleImageError);
|
||||
debug.log('👀 Monitoring image:', img.src);
|
||||
});
|
||||
|
||||
return () => {
|
||||
images.forEach(img => {
|
||||
img.removeEventListener('load', handleImageLoad);
|
||||
img.removeEventListener('error', handleImageError);
|
||||
});
|
||||
};
|
||||
}, [content]);
|
||||
|
||||
return (
|
||||
<div
|
||||
ref={contentRef}
|
||||
className="reading-content"
|
||||
dangerouslySetInnerHTML={{ __html: content }}
|
||||
style={{
|
||||
// Prevent layout shifts that might cause image reloads
|
||||
minHeight: '100vh',
|
||||
contain: 'layout style'
|
||||
}}
|
||||
/>
|
||||
);
|
||||
});
|
||||
|
||||
StoryContent.displayName = 'StoryContent';
|
||||
|
||||
export default function StoryReadingPage() {
|
||||
const params = useParams();
|
||||
@@ -20,6 +79,7 @@ export default function StoryReadingPage() {
|
||||
const [loading, setLoading] = useState(true);
|
||||
const [error, setError] = useState<string | null>(null);
|
||||
const [readingProgress, setReadingProgress] = useState(0);
|
||||
const [readingPercentage, setReadingPercentage] = useState(0);
|
||||
const [sanitizedContent, setSanitizedContent] = useState<string>('');
|
||||
const [hasScrolledToPosition, setHasScrolledToPosition] = useState(false);
|
||||
const [showToc, setShowToc] = useState(false);
|
||||
@@ -48,15 +108,25 @@ export default function StoryReadingPage() {
|
||||
));
|
||||
|
||||
// Convert to character position in the plain text content
|
||||
const textLength = story.contentPlain?.length || story.contentHtml.length;
|
||||
const textLength = story.contentPlain?.length || story.contentHtml?.length || 0;
|
||||
return Math.floor(scrollRatio * textLength);
|
||||
}, [story]);
|
||||
|
||||
// Calculate reading percentage from character position
|
||||
const calculateReadingPercentage = useCallback((currentPosition: number): number => {
|
||||
if (!story) return 0;
|
||||
|
||||
const totalLength = story.contentPlain?.length || story.contentHtml?.length || 0;
|
||||
if (totalLength === 0) return 0;
|
||||
|
||||
return Math.round((currentPosition / totalLength) * 100);
|
||||
}, [story]);
|
||||
|
||||
// Convert character position back to scroll position for auto-scroll
|
||||
const scrollToCharacterPosition = useCallback((position: number) => {
|
||||
if (!contentRef.current || !story || hasScrolledToPosition) return;
|
||||
|
||||
const textLength = story.contentPlain?.length || story.contentHtml.length;
|
||||
const textLength = story.contentPlain?.length || story.contentHtml?.length || 0;
|
||||
if (textLength === 0 || position === 0) return;
|
||||
|
||||
const ratio = position / textLength;
|
||||
@@ -80,14 +150,14 @@ export default function StoryReadingPage() {
|
||||
// Debounced function to save reading position
|
||||
const saveReadingPosition = useCallback(async (position: number) => {
|
||||
if (!story || position === story.readingPosition) {
|
||||
console.log('Skipping save - no story or position unchanged:', { story: !!story, position, current: story?.readingPosition });
|
||||
debug.log('Skipping save - no story or position unchanged:', { story: !!story, position, current: story?.readingPosition });
|
||||
return;
|
||||
}
|
||||
|
||||
console.log('Saving reading position:', position, 'for story:', story.id);
|
||||
debug.log('Saving reading position:', position, 'for story:', story.id);
|
||||
try {
|
||||
const updatedStory = await storyApi.updateReadingProgress(story.id, position);
|
||||
console.log('Reading position saved successfully, updated story:', updatedStory.readingPosition);
|
||||
debug.log('Reading position saved successfully, updated story:', updatedStory.readingPosition);
|
||||
setStory(prev => prev ? { ...prev, readingPosition: position, lastReadAt: updatedStory.lastReadAt } : null);
|
||||
} catch (error) {
|
||||
console.error('Failed to save reading position:', error);
|
||||
@@ -168,12 +238,12 @@ export default function StoryReadingPage() {
|
||||
if (story && sanitizedContent && !hasScrolledToPosition) {
|
||||
// Use a small delay to ensure content is rendered
|
||||
const timeout = setTimeout(() => {
|
||||
console.log('Initializing reading position tracking, saved position:', story.readingPosition);
|
||||
debug.log('Initializing reading position tracking, saved position:', story.readingPosition);
|
||||
|
||||
// Check if there's a hash in the URL (for TOC navigation)
|
||||
const hash = window.location.hash.substring(1);
|
||||
if (hash && hash.startsWith('heading-')) {
|
||||
console.log('Auto-scrolling to heading from URL hash:', hash);
|
||||
debug.log('Auto-scrolling to heading from URL hash:', hash);
|
||||
const element = document.getElementById(hash);
|
||||
if (element) {
|
||||
element.scrollIntoView({
|
||||
@@ -187,71 +257,90 @@ export default function StoryReadingPage() {
|
||||
|
||||
// Otherwise, use saved reading position
|
||||
if (story.readingPosition && story.readingPosition > 0) {
|
||||
console.log('Auto-scrolling to saved position:', story.readingPosition);
|
||||
debug.log('Auto-scrolling to saved position:', story.readingPosition);
|
||||
const initialPercentage = calculateReadingPercentage(story.readingPosition);
|
||||
setReadingPercentage(initialPercentage);
|
||||
scrollToCharacterPosition(story.readingPosition);
|
||||
} else {
|
||||
// Even if there's no saved position, mark as ready for tracking
|
||||
console.log('No saved position, starting fresh tracking');
|
||||
debug.log('No saved position, starting fresh tracking');
|
||||
setReadingPercentage(0);
|
||||
setHasScrolledToPosition(true);
|
||||
}
|
||||
}, 500);
|
||||
|
||||
return () => clearTimeout(timeout);
|
||||
}
|
||||
}, [story, sanitizedContent, scrollToCharacterPosition, hasScrolledToPosition]);
|
||||
}, [story, sanitizedContent, scrollToCharacterPosition, calculateReadingPercentage, hasScrolledToPosition]);
|
||||
|
||||
// Track reading progress and save position
|
||||
useEffect(() => {
|
||||
let ticking = false;
|
||||
let scrollEventCount = 0;
|
||||
|
||||
const handleScroll = () => {
|
||||
const article = document.querySelector('[data-reading-content]') as HTMLElement;
|
||||
if (article) {
|
||||
const scrolled = window.scrollY;
|
||||
const articleTop = article.offsetTop;
|
||||
const articleHeight = article.scrollHeight;
|
||||
const windowHeight = window.innerHeight;
|
||||
scrollEventCount++;
|
||||
if (scrollEventCount % 10 === 0) {
|
||||
debug.log('📜 Scroll event #', scrollEventCount, 'at', Date.now());
|
||||
}
|
||||
|
||||
const progress = Math.min(100, Math.max(0,
|
||||
((scrolled - articleTop + windowHeight) / articleHeight) * 100
|
||||
));
|
||||
if (!ticking) {
|
||||
requestAnimationFrame(() => {
|
||||
const article = document.querySelector('[data-reading-content]') as HTMLElement;
|
||||
if (article) {
|
||||
const scrolled = window.scrollY;
|
||||
const articleTop = article.offsetTop;
|
||||
const articleHeight = article.scrollHeight;
|
||||
const windowHeight = window.innerHeight;
|
||||
|
||||
setReadingProgress(progress);
|
||||
const progress = Math.min(100, Math.max(0,
|
||||
((scrolled - articleTop + windowHeight) / articleHeight) * 100
|
||||
));
|
||||
|
||||
// Multi-method end-of-story detection
|
||||
const documentHeight = document.documentElement.scrollHeight;
|
||||
const windowBottom = scrolled + windowHeight;
|
||||
const distanceFromBottom = documentHeight - windowBottom;
|
||||
setReadingProgress(progress);
|
||||
|
||||
// Method 1: Distance from bottom (most reliable)
|
||||
const nearBottom = distanceFromBottom <= 200;
|
||||
// Multi-method end-of-story detection
|
||||
const documentHeight = document.documentElement.scrollHeight;
|
||||
const windowBottom = scrolled + windowHeight;
|
||||
const distanceFromBottom = documentHeight - windowBottom;
|
||||
|
||||
// Method 2: High progress but only as secondary check
|
||||
const highProgress = progress >= 98;
|
||||
// Method 1: Distance from bottom (most reliable)
|
||||
const nearBottom = distanceFromBottom <= 200;
|
||||
|
||||
// Method 3: Check if story content itself is fully visible
|
||||
const storyContentElement = contentRef.current;
|
||||
let storyContentFullyVisible = false;
|
||||
if (storyContentElement) {
|
||||
const contentRect = storyContentElement.getBoundingClientRect();
|
||||
const contentBottom = scrolled + contentRect.bottom;
|
||||
const documentContentHeight = Math.max(documentHeight - 300, contentBottom); // Account for footer padding
|
||||
storyContentFullyVisible = windowBottom >= documentContentHeight;
|
||||
}
|
||||
// Method 2: High progress but only as secondary check
|
||||
const highProgress = progress >= 98;
|
||||
|
||||
// Trigger end detection if user is near bottom AND (has high progress OR story content is fully visible)
|
||||
if (nearBottom && (highProgress || storyContentFullyVisible) && !hasReachedEnd && hasScrolledToPosition) {
|
||||
console.log('End of story detected:', { nearBottom, highProgress, storyContentFullyVisible, distanceFromBottom, progress });
|
||||
setHasReachedEnd(true);
|
||||
setShowEndOfStoryPopup(true);
|
||||
}
|
||||
// Method 3: Check if story content itself is fully visible
|
||||
const storyContentElement = contentRef.current;
|
||||
let storyContentFullyVisible = false;
|
||||
if (storyContentElement) {
|
||||
const contentRect = storyContentElement.getBoundingClientRect();
|
||||
const contentBottom = scrolled + contentRect.bottom;
|
||||
const documentContentHeight = Math.max(documentHeight - 300, contentBottom); // Account for footer padding
|
||||
storyContentFullyVisible = windowBottom >= documentContentHeight;
|
||||
}
|
||||
|
||||
// Save reading position (debounced)
|
||||
if (hasScrolledToPosition) { // Only save after initial auto-scroll
|
||||
const characterPosition = getCharacterPositionFromScroll();
|
||||
console.log('Scroll detected, character position:', characterPosition);
|
||||
debouncedSavePosition(characterPosition);
|
||||
} else {
|
||||
console.log('Scroll detected but not ready for tracking yet');
|
||||
}
|
||||
// Trigger end detection if user is near bottom AND (has high progress OR story content is fully visible)
|
||||
if (nearBottom && (highProgress || storyContentFullyVisible) && !hasReachedEnd && hasScrolledToPosition) {
|
||||
debug.log('End of story detected:', { nearBottom, highProgress, storyContentFullyVisible, distanceFromBottom, progress });
|
||||
setHasReachedEnd(true);
|
||||
setShowEndOfStoryPopup(true);
|
||||
}
|
||||
|
||||
// Save reading position and update percentage (debounced)
|
||||
if (hasScrolledToPosition) { // Only save after initial auto-scroll
|
||||
const characterPosition = getCharacterPositionFromScroll();
|
||||
const percentage = calculateReadingPercentage(characterPosition);
|
||||
debug.log('Scroll detected, character position:', characterPosition, 'percentage:', percentage);
|
||||
setReadingPercentage(percentage);
|
||||
debouncedSavePosition(characterPosition);
|
||||
} else {
|
||||
debug.log('Scroll detected but not ready for tracking yet');
|
||||
}
|
||||
}
|
||||
ticking = false;
|
||||
});
|
||||
ticking = true;
|
||||
}
|
||||
};
|
||||
|
||||
@@ -263,7 +352,7 @@ export default function StoryReadingPage() {
|
||||
clearTimeout(saveTimeoutRef.current);
|
||||
}
|
||||
};
|
||||
}, [story, hasScrolledToPosition, getCharacterPositionFromScroll, debouncedSavePosition, hasReachedEnd]);
|
||||
}, [story, hasScrolledToPosition, getCharacterPositionFromScroll, calculateReadingPercentage, debouncedSavePosition, hasReachedEnd]);
|
||||
|
||||
const handleRatingUpdate = async (newRating: number) => {
|
||||
if (!story) return;
|
||||
@@ -313,6 +402,11 @@ export default function StoryReadingPage() {
|
||||
const nextStory = findNextStory();
|
||||
const previousStory = findPreviousStory();
|
||||
|
||||
// Memoize the sanitized content to prevent re-processing on scroll
|
||||
const memoizedContent = useMemo(() => {
|
||||
return sanitizedContent;
|
||||
}, [sanitizedContent]);
|
||||
|
||||
if (loading) {
|
||||
return (
|
||||
<div className="min-h-screen theme-bg flex items-center justify-center">
|
||||
@@ -359,6 +453,11 @@ export default function StoryReadingPage() {
|
||||
</div>
|
||||
|
||||
<div className="flex items-center gap-4">
|
||||
{/* Reading percentage indicator */}
|
||||
<div className="text-sm theme-text font-mono bg-gray-100 dark:bg-gray-800 px-2 py-1 rounded">
|
||||
{readingPercentage}%
|
||||
</div>
|
||||
|
||||
{hasHeadings && (
|
||||
<button
|
||||
onClick={() => setShowToc(!showToc)}
|
||||
@@ -514,10 +613,10 @@ export default function StoryReadingPage() {
|
||||
</header>
|
||||
|
||||
{/* Story Content */}
|
||||
<div
|
||||
ref={contentRef}
|
||||
className="reading-content"
|
||||
dangerouslySetInnerHTML={{ __html: sanitizedContent }}
|
||||
<StoryContent
|
||||
key={`story-content-${story?.id || 'loading'}`}
|
||||
content={memoizedContent}
|
||||
contentRef={contentRef}
|
||||
/>
|
||||
</article>
|
||||
|
||||
|
||||
259
frontend/src/components/ImageProcessingProgress.tsx
Normal file
259
frontend/src/components/ImageProcessingProgress.tsx
Normal file
@@ -0,0 +1,259 @@
|
||||
import React, { useState, useEffect } from 'react';
|
||||
import { ImageProcessingProgressTracker, ImageProcessingProgress } from '../utils/imageProcessingProgress';
|
||||
|
||||
interface ImageProcessingProgressProps {
|
||||
storyId: string;
|
||||
autoStart?: boolean;
|
||||
onComplete?: () => void;
|
||||
onError?: (error: string) => void;
|
||||
}
|
||||
|
||||
export const ImageProcessingProgressComponent: React.FC<ImageProcessingProgressProps> = ({
|
||||
storyId,
|
||||
autoStart = false,
|
||||
onComplete,
|
||||
onError
|
||||
}) => {
|
||||
const [progress, setProgress] = useState<ImageProcessingProgress | null>(null);
|
||||
const [isTracking, setIsTracking] = useState(false);
|
||||
const [tracker, setTracker] = useState<ImageProcessingProgressTracker | null>(null);
|
||||
|
||||
const startTracking = () => {
|
||||
if (tracker) {
|
||||
tracker.stop();
|
||||
}
|
||||
|
||||
const newTracker = new ImageProcessingProgressTracker(storyId);
|
||||
|
||||
newTracker.onProgress((progress) => {
|
||||
setProgress(progress);
|
||||
});
|
||||
|
||||
newTracker.onComplete((finalProgress) => {
|
||||
setProgress(finalProgress);
|
||||
setIsTracking(false);
|
||||
onComplete?.();
|
||||
});
|
||||
|
||||
newTracker.onError((error) => {
|
||||
console.error('Image processing error:', error);
|
||||
setIsTracking(false);
|
||||
onError?.(error);
|
||||
});
|
||||
|
||||
setTracker(newTracker);
|
||||
setIsTracking(true);
|
||||
newTracker.start();
|
||||
};
|
||||
|
||||
const stopTracking = () => {
|
||||
if (tracker) {
|
||||
tracker.stop();
|
||||
setIsTracking(false);
|
||||
}
|
||||
};
|
||||
|
||||
useEffect(() => {
|
||||
if (autoStart) {
|
||||
startTracking();
|
||||
}
|
||||
|
||||
return () => {
|
||||
if (tracker) {
|
||||
tracker.stop();
|
||||
}
|
||||
};
|
||||
}, [storyId, autoStart]);
|
||||
|
||||
if (!progress && !isTracking) {
|
||||
return null;
|
||||
}
|
||||
|
||||
if (!progress?.isProcessing && !progress?.completed) {
|
||||
return null;
|
||||
}
|
||||
|
||||
return (
|
||||
<div className="image-processing-progress">
|
||||
<div className="progress-header">
|
||||
<h4>Processing Images</h4>
|
||||
{isTracking && (
|
||||
<button onClick={stopTracking} className="btn btn-sm btn-secondary">
|
||||
Cancel
|
||||
</button>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{progress && (
|
||||
<div className="progress-content">
|
||||
{progress.error ? (
|
||||
<div className="alert alert-danger">
|
||||
<strong>Error:</strong> {progress.error}
|
||||
</div>
|
||||
) : progress.completed ? (
|
||||
<div className="alert alert-success">
|
||||
<strong>Completed:</strong> {progress.status}
|
||||
</div>
|
||||
) : (
|
||||
<div className="progress-info">
|
||||
<div className="status-text">
|
||||
<strong>Status:</strong> {progress.status}
|
||||
</div>
|
||||
|
||||
<div className="progress-stats">
|
||||
Processing {progress.processedImages} of {progress.totalImages} images
|
||||
({progress.progressPercentage.toFixed(1)}%)
|
||||
</div>
|
||||
|
||||
{progress.currentImageUrl && (
|
||||
<div className="current-image">
|
||||
<strong>Current:</strong>
|
||||
<span className="image-url" title={progress.currentImageUrl}>
|
||||
{progress.currentImageUrl.length > 60
|
||||
? `...${progress.currentImageUrl.slice(-60)}`
|
||||
: progress.currentImageUrl
|
||||
}
|
||||
</span>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Progress bar */}
|
||||
<div className="progress-bar-container">
|
||||
<div className="progress-bar">
|
||||
<div
|
||||
className="progress-bar-fill"
|
||||
style={{ width: `${progress.progressPercentage}%` }}
|
||||
/>
|
||||
</div>
|
||||
<span className="progress-percentage">
|
||||
{progress.progressPercentage.toFixed(1)}%
|
||||
</span>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
)}
|
||||
|
||||
<style jsx>{`
|
||||
.image-processing-progress {
|
||||
background: #f8f9fa;
|
||||
border: 1px solid #dee2e6;
|
||||
border-radius: 4px;
|
||||
padding: 1rem;
|
||||
margin: 1rem 0;
|
||||
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', sans-serif;
|
||||
}
|
||||
|
||||
.progress-header {
|
||||
display: flex;
|
||||
justify-content: space-between;
|
||||
align-items: center;
|
||||
margin-bottom: 0.75rem;
|
||||
}
|
||||
|
||||
.progress-header h4 {
|
||||
margin: 0;
|
||||
font-size: 1.1rem;
|
||||
color: #495057;
|
||||
}
|
||||
|
||||
.progress-content {
|
||||
space-y: 0.5rem;
|
||||
}
|
||||
|
||||
.status-text {
|
||||
font-size: 0.9rem;
|
||||
color: #6c757d;
|
||||
margin-bottom: 0.5rem;
|
||||
}
|
||||
|
||||
.progress-stats {
|
||||
font-weight: 500;
|
||||
color: #495057;
|
||||
margin-bottom: 0.5rem;
|
||||
}
|
||||
|
||||
.current-image {
|
||||
font-size: 0.85rem;
|
||||
color: #6c757d;
|
||||
margin-bottom: 0.75rem;
|
||||
}
|
||||
|
||||
.image-url {
|
||||
font-family: 'Monaco', 'Menlo', 'Ubuntu Mono', monospace;
|
||||
background: #e9ecef;
|
||||
padding: 0.1rem 0.3rem;
|
||||
border-radius: 2px;
|
||||
margin-left: 0.5rem;
|
||||
}
|
||||
|
||||
.progress-bar-container {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
gap: 0.75rem;
|
||||
}
|
||||
|
||||
.progress-bar {
|
||||
flex: 1;
|
||||
height: 8px;
|
||||
background: #e9ecef;
|
||||
border-radius: 4px;
|
||||
overflow: hidden;
|
||||
}
|
||||
|
||||
.progress-bar-fill {
|
||||
height: 100%;
|
||||
background: linear-gradient(90deg, #007bff, #0056b3);
|
||||
transition: width 0.3s ease;
|
||||
}
|
||||
|
||||
.progress-percentage {
|
||||
font-size: 0.85rem;
|
||||
font-weight: 500;
|
||||
color: #495057;
|
||||
min-width: 3rem;
|
||||
text-align: right;
|
||||
}
|
||||
|
||||
.btn {
|
||||
padding: 0.25rem 0.5rem;
|
||||
border-radius: 3px;
|
||||
border: 1px solid transparent;
|
||||
cursor: pointer;
|
||||
font-size: 0.8rem;
|
||||
}
|
||||
|
||||
.btn-secondary {
|
||||
background: #6c757d;
|
||||
color: white;
|
||||
border-color: #6c757d;
|
||||
}
|
||||
|
||||
.btn-secondary:hover {
|
||||
background: #5a6268;
|
||||
border-color: #545b62;
|
||||
}
|
||||
|
||||
.alert {
|
||||
padding: 0.75rem;
|
||||
border-radius: 4px;
|
||||
margin-bottom: 0.5rem;
|
||||
}
|
||||
|
||||
.alert-danger {
|
||||
background: #f8d7da;
|
||||
color: #721c24;
|
||||
border: 1px solid #f5c6cb;
|
||||
}
|
||||
|
||||
.alert-success {
|
||||
background: #d4edda;
|
||||
color: #155724;
|
||||
border: 1px solid #c3e6cb;
|
||||
}
|
||||
`}</style>
|
||||
</div>
|
||||
);
|
||||
};
|
||||
|
||||
export default ImageProcessingProgressComponent;
|
||||
@@ -20,6 +20,7 @@ export default function CollectionReadingView({
|
||||
}: CollectionReadingViewProps) {
|
||||
const { story, collection } = data;
|
||||
const [hasScrolledToPosition, setHasScrolledToPosition] = useState(false);
|
||||
const [readingPercentage, setReadingPercentage] = useState(0);
|
||||
const contentRef = useRef<HTMLDivElement>(null);
|
||||
const saveTimeoutRef = useRef<NodeJS.Timeout | null>(null);
|
||||
|
||||
@@ -39,15 +40,25 @@ export default function CollectionReadingView({
|
||||
));
|
||||
|
||||
// Convert to character position in the plain text content
|
||||
const textLength = story.contentPlain?.length || story.contentHtml.length;
|
||||
const textLength = story.contentPlain?.length || story.contentHtml?.length || 0;
|
||||
return Math.floor(scrollRatio * textLength);
|
||||
}, [story]);
|
||||
|
||||
// Calculate reading percentage from character position
|
||||
const calculateReadingPercentage = useCallback((currentPosition: number): number => {
|
||||
if (!story) return 0;
|
||||
|
||||
const totalLength = story.contentPlain?.length || story.contentHtml?.length || 0;
|
||||
if (totalLength === 0) return 0;
|
||||
|
||||
return Math.round((currentPosition / totalLength) * 100);
|
||||
}, [story]);
|
||||
|
||||
// Convert character position back to scroll position for auto-scroll
|
||||
const scrollToCharacterPosition = useCallback((position: number) => {
|
||||
if (!contentRef.current || !story || hasScrolledToPosition) return;
|
||||
|
||||
const textLength = story.contentPlain?.length || story.contentHtml.length;
|
||||
const textLength = story.contentPlain?.length || story.contentHtml?.length || 0;
|
||||
if (textLength === 0 || position === 0) return;
|
||||
|
||||
const ratio = position / textLength;
|
||||
@@ -102,23 +113,28 @@ export default function CollectionReadingView({
|
||||
console.log('Collection view - initializing reading position tracking, saved position:', story.readingPosition);
|
||||
if (story.readingPosition && story.readingPosition > 0) {
|
||||
console.log('Collection view - auto-scrolling to saved position:', story.readingPosition);
|
||||
const initialPercentage = calculateReadingPercentage(story.readingPosition);
|
||||
setReadingPercentage(initialPercentage);
|
||||
scrollToCharacterPosition(story.readingPosition);
|
||||
} else {
|
||||
console.log('Collection view - no saved position, starting fresh tracking');
|
||||
setReadingPercentage(0);
|
||||
setHasScrolledToPosition(true);
|
||||
}
|
||||
}, 500);
|
||||
|
||||
return () => clearTimeout(timeout);
|
||||
}
|
||||
}, [story, scrollToCharacterPosition, hasScrolledToPosition]);
|
||||
}, [story, scrollToCharacterPosition, calculateReadingPercentage, hasScrolledToPosition]);
|
||||
|
||||
// Track reading progress and save position
|
||||
useEffect(() => {
|
||||
const handleScroll = () => {
|
||||
if (hasScrolledToPosition) {
|
||||
const characterPosition = getCharacterPositionFromScroll();
|
||||
console.log('Collection view - scroll detected, character position:', characterPosition);
|
||||
const percentage = calculateReadingPercentage(characterPosition);
|
||||
console.log('Collection view - scroll detected, character position:', characterPosition, 'percentage:', percentage);
|
||||
setReadingPercentage(percentage);
|
||||
debouncedSavePosition(characterPosition);
|
||||
} else {
|
||||
console.log('Collection view - scroll detected but not ready for tracking yet');
|
||||
@@ -132,7 +148,7 @@ export default function CollectionReadingView({
|
||||
clearTimeout(saveTimeoutRef.current);
|
||||
}
|
||||
};
|
||||
}, [hasScrolledToPosition, getCharacterPositionFromScroll, debouncedSavePosition]);
|
||||
}, [hasScrolledToPosition, getCharacterPositionFromScroll, calculateReadingPercentage, debouncedSavePosition]);
|
||||
|
||||
const handlePrevious = () => {
|
||||
if (collection.previousStoryId) {
|
||||
@@ -190,6 +206,11 @@ export default function CollectionReadingView({
|
||||
|
||||
{/* Progress Bar */}
|
||||
<div className="flex items-center gap-4">
|
||||
{/* Reading percentage indicator */}
|
||||
<div className="text-sm text-blue-700 dark:text-blue-300 font-mono bg-blue-100 dark:bg-blue-900 px-2 py-1 rounded">
|
||||
{readingPercentage}%
|
||||
</div>
|
||||
|
||||
<div className="w-32 bg-blue-200 dark:bg-blue-800 rounded-full h-2">
|
||||
<div
|
||||
className="bg-blue-600 dark:bg-blue-400 h-2 rounded-full transition-all duration-300"
|
||||
|
||||
Some files were not shown because too many files have changed in this diff Show More
Reference in New Issue
Block a user