Compare commits

53 Commits: feature/ri ... 4e02cd8eaa

| SHA1 |
|---|
| 4e02cd8eaa |
| 48b0087b01 |
| c291559366 |
| 622cf9ac76 |
| df5e124115 |
| 2b4cb1456f |
| c2e5445196 |
| 360b69effc |
| 3bc8bb9e0c |
| 7ca4823573 |
| 5325169495 |
| 74cdd5dc57 |
| 574f20bfd7 |
| c8249c94d6 |
| 51a1a69b45 |
| 6ee2d67027 |
| 9472210d8b |
| 62f017c4ca |
| 857871273d |
| a9521a9da1 |
| 1f41974208 |
| b68fde71c0 |
| f61be90d5c |
| 87f37567fb |
| 9e684a956b |
| 379ef0d209 |
| b1ff684df6 |
| 0032590030 |
| db38d68399 |
| 48a0865199 |
| 7daed22d2d |
| 6c02b8831f |
| 042f80dd2a |
| a472c11ac8 |
| a037dd92af |
| 634de0b6a5 |
| b4635b56a3 |
| bfb68e81a8 |
| 1247a3420e |
| 6caee8a007 |
| cf93d3b3a6 |
| 53cb296adc |
| f71b70d03b |
| 0bdc3f4731 |
| 345065c03b |
| c50dc618bf |
| 96e6ced8da |
| 4738ae3a75 |
| 591ca5a149 |
| 41ff3a9961 |
| 0101c0ca2c |
| 58bb7f8229 |
| a5628019f8 |
@@ -19,7 +19,7 @@ JWT_SECRET=REPLACE_WITH_SECURE_JWT_SECRET_MINIMUM_32_CHARS
 APP_PASSWORD=REPLACE_WITH_SECURE_APP_PASSWORD
 
 # OpenSearch Configuration
-OPENSEARCH_PASSWORD=REPLACE_WITH_SECURE_OPENSEARCH_PASSWORD
+#OPENSEARCH_PASSWORD=REPLACE_WITH_SECURE_OPENSEARCH_PASSWORD
 SEARCH_ENGINE=opensearch
 
 # Image Storage
ASYNC_IMAGE_PROCESSING.md (new file, 220 lines)

@@ -0,0 +1,220 @@

# Async Image Processing Implementation

## Overview

The image processing system has been updated to handle external images asynchronously, preventing timeouts when processing stories with many images. It also provides real-time progress updates to users, showing which images are currently being processed.

## Backend Components

### 1. `ImageProcessingProgressService`
- Tracks progress for individual story image processing sessions
- Thread-safe with `ConcurrentHashMap` for multi-user support (see the sketch below)
- Provides progress information: total images, processed count, current image, status, errors
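
A minimal sketch of what such a progress service could look like. The field names mirror the progress API documented later in this file, but the class layout, method names, and record type are assumptions, not the actual implementation:

```java
import java.util.Map;
import java.util.Optional;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import org.springframework.stereotype.Service;

// Hypothetical sketch of a ConcurrentHashMap-based progress tracker (not the project's actual code).
@Service
public class ImageProcessingProgressService {

    public record Progress(int totalImages, int processedImages,
                           String currentImageUrl, String status,
                           boolean completed, String error) {}

    // One progress entry per story being processed; safe for concurrent users.
    private final Map<UUID, Progress> progressByStory = new ConcurrentHashMap<>();

    public void start(UUID storyId, int totalImages) {
        progressByStory.put(storyId, new Progress(totalImages, 0, "", "Starting", false, ""));
    }

    public void update(UUID storyId, int processed, String currentUrl, String status) {
        Progress p = progressByStory.get(storyId);
        if (p != null) {
            progressByStory.put(storyId,
                new Progress(p.totalImages(), processed, currentUrl, status, false, ""));
        }
    }

    public void complete(UUID storyId, String status) {
        Progress p = progressByStory.get(storyId);
        if (p != null) {
            progressByStory.put(storyId,
                new Progress(p.totalImages(), p.totalImages(), "", status, true, ""));
        }
    }

    public Optional<Progress> get(UUID storyId) {
        return Optional.ofNullable(progressByStory.get(storyId));
    }

    // Automatic cleanup after completion can simply remove the entry.
    public void clear(UUID storyId) {
        progressByStory.remove(storyId);
    }
}
```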

### 2. `AsyncImageProcessingService`
- Handles asynchronous image processing using Spring's `@Async` annotation (see the sketch below)
- Counts external images before processing
- Provides progress callbacks during processing
- Updates story content when processing completes
- Automatic cleanup of progress data after completion
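
As a rough illustration of how the `@Async` entry point can hand work off to a background thread and report progress, here is a hedged sketch. The collaborator types come from this document, but the helper `countExternalImages()` and the exact callback wiring are assumptions:

```java
import java.util.UUID;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;

// Illustrative sketch only; the real AsyncImageProcessingService may be structured differently.
@Service
public class AsyncImageProcessingService {

    private final ImageService imageService;                      // collaborator described in section 3
    private final ImageProcessingProgressService progressService; // collaborator described in section 1

    public AsyncImageProcessingService(ImageService imageService,
                                       ImageProcessingProgressService progressService) {
        this.imageService = imageService;
        this.progressService = progressService;
    }

    // Runs on Spring's async executor, so the HTTP request that saved the story returns immediately.
    @Async
    public void processStoryImages(UUID storyId, String content) {
        int total = imageService.countExternalImages(content);    // assumed helper method
        progressService.start(storyId, total);
        try {
            imageService.processContentImagesWithProgress(storyId, content,
                (processed, currentUrl) ->
                    progressService.update(storyId, processed, currentUrl,
                        "Processing image " + (processed + 1) + " of " + total));
            progressService.complete(storyId, "Completed: " + total + " images processed");
        } catch (Exception e) {
            // Errors are surfaced through the progress status rather than an HTTP error.
            progressService.complete(storyId, "Failed: " + e.getMessage());
        }
    }
}
```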

### 3. Enhanced `ImageService`
- Added `processContentImagesWithProgress()` method with callback support (a possible callback shape is sketched below)
- Progress callbacks provide real-time updates during image download/processing
- Maintains compatibility with existing synchronous processing
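
The document does not show the callback's signature; one plausible shape, stated purely as an assumption, is a small functional interface passed into the processing method:

```java
// Hypothetical callback shape; the real signature in ImageService may differ.
@FunctionalInterface
public interface ImageProgressCallback {
    // Called after each external image has been downloaded and stored locally.
    void onImageProcessed(int processedCount, String currentImageUrl);
}

// Assumed usage inside ImageService:
// public String processContentImagesWithProgress(UUID storyId, String content,
//                                                ImageProgressCallback callback) { ... }
```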

### 4. Updated `StoryController`
- `POST /api/stories` and `PUT /api/stories/{id}` now trigger async image processing
- `GET /api/stories/{id}/image-processing-progress` endpoint for progress polling (sketched below)
- Processing starts immediately after story save and returns control to the user
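
A hedged sketch of how the polling endpoint could surface the tracked progress. Only the URL and the JSON field names come from this document; the controller and DTO wiring are assumptions:

```java
import java.util.Map;
import java.util.UUID;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Illustrative controller fragment, not the project's actual StoryController.
@RestController
@RequestMapping("/api/stories")
public class StoryProgressController {

    private final ImageProcessingProgressService progressService; // assumed collaborator

    public StoryProgressController(ImageProcessingProgressService progressService) {
        this.progressService = progressService;
    }

    @GetMapping("/{id}/image-processing-progress")
    public ResponseEntity<?> getImageProcessingProgress(@PathVariable UUID id) {
        return progressService.get(id)
                .<ResponseEntity<?>>map(ResponseEntity::ok)
                // Matches the documented "no processing" response shape below.
                .orElseGet(() -> ResponseEntity.ok(
                        Map.of("isProcessing", false, "message", "No active image processing")));
    }
}
```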

## Frontend Components

### 1. `ImageProcessingProgressTracker` (Utility Class)
```typescript
const tracker = new ImageProcessingProgressTracker(storyId);
tracker.onProgress((progress) => {
  console.log(`Processing ${progress.processedImages}/${progress.totalImages}`);
});
tracker.onComplete(() => console.log('Done!'));
tracker.start();
```

### 2. `ImageProcessingProgressComponent` (React Component)
```tsx
<ImageProcessingProgressComponent
  storyId={storyId}
  autoStart={true}
  onComplete={() => refreshStory()}
/>
```

## User Experience

### Before (Synchronous)
1. User saves story with external images
2. Request hangs for 30+ seconds processing images
3. Browser may time out
4. No feedback about progress
5. User doesn't know if it's working

### After (Asynchronous)
1. User saves story with external images
2. Save completes immediately
3. Progress indicator appears: "Processing 5 images. Currently image 2 of 5..."
4. User can continue using the application
5. Progress updates every second
6. Story automatically refreshes when processing completes

## API Endpoints

### Progress Endpoint
```
GET /api/stories/{id}/image-processing-progress
```

**Response when processing:**
```json
{
  "isProcessing": true,
  "totalImages": 5,
  "processedImages": 2,
  "currentImageUrl": "https://example.com/image.jpg",
  "status": "Processing image 3 of 5",
  "progressPercentage": 40.0,
  "completed": false,
  "error": ""
}
```

**Response when completed:**
```json
{
  "isProcessing": false,
  "totalImages": 5,
  "processedImages": 5,
  "currentImageUrl": "",
  "status": "Completed: 5 images processed",
  "progressPercentage": 100.0,
  "completed": true,
  "error": ""
}
```

**Response when no processing:**
```json
{
  "isProcessing": false,
  "message": "No active image processing"
}
```

## Integration Examples

### React Hook Usage
```tsx
import { useImageProcessingProgress } from '../utils/imageProcessingProgress';

function StoryEditor({ storyId }) {
  const { progress, isTracking, startTracking } = useImageProcessingProgress(storyId);

  const handleSave = async () => {
    await saveStory();
    startTracking(); // Start monitoring progress
  };

  return (
    <div>
      {isTracking && progress && (
        <div className="progress-indicator">
          Processing {progress.processedImages}/{progress.totalImages} images...
        </div>
      )}
      <button onClick={handleSave}>Save Story</button>
    </div>
  );
}
```

### Manual Progress Tracking
```typescript
// After saving a story with external images
const tracker = new ImageProcessingProgressTracker(storyId);

tracker.onProgress((progress) => {
  updateProgressBar(progress.progressPercentage);
  showStatus(progress.status);
  if (progress.currentImageUrl) {
    showCurrentImage(progress.currentImageUrl);
  }
});

tracker.onComplete((finalProgress) => {
  hideProgressBar();
  showNotification('Image processing completed!');
  refreshStoryContent(); // Reload story with processed images
});

tracker.onError((error) => {
  hideProgressBar();
  showError(`Image processing failed: ${error}`);
});

tracker.start();
```

## Configuration

### Polling Interval
Default: 1 second (1000ms)
```typescript
const tracker = new ImageProcessingProgressTracker(storyId, 500); // Poll every 500ms
```

### Timeout
Default: 5 minutes (300000ms)
```typescript
const tracker = new ImageProcessingProgressTracker(storyId, 1000, 600000); // 10 minute timeout
```

### Spring Async Configuration
The backend uses Spring's default async executor. For production, consider configuring a custom thread pool in your application properties:

```yaml
spring:
  task:
    execution:
      pool:
        core-size: 4
        max-size: 8
        queue-capacity: 100
```
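
Equivalently, if you prefer Java configuration, a custom executor bean along these lines would be picked up by `@Async`. This is a suggestion assuming standard Spring Boot behaviour, not something defined in this change set:

```java
import java.util.concurrent.Executor;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

// Optional alternative to the YAML settings above (illustrative only).
@Configuration
public class AsyncExecutorConfig {

    @Bean(name = "taskExecutor")
    public Executor taskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(4);      // mirrors core-size above
        executor.setMaxPoolSize(8);       // mirrors max-size above
        executor.setQueueCapacity(100);   // mirrors queue-capacity above
        executor.setThreadNamePrefix("image-processing-");
        executor.initialize();
        return executor;
    }
}
```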

## Error Handling

### Backend Errors
- Network timeouts downloading images
- Invalid image formats
- Disk space issues
- All errors are logged and returned in progress status

### Frontend Errors
- Network failures during progress polling
- Timeout if processing takes too long
- Graceful degradation - user can continue working

## Benefits

1. **No More Timeouts**: Large image processing operations won't time out HTTP requests
2. **Better UX**: Users get real-time feedback about processing progress
3. **Improved Performance**: Users can continue using the app while images process
4. **Error Visibility**: Clear error messages when image processing fails
5. **Scalability**: Multiple users can process images simultaneously without blocking

## Future Enhancements

1. **WebSocket Support**: Replace polling with WebSocket for real-time push updates
2. **Batch Processing**: Queue multiple stories for batch image processing
3. **Retry Logic**: Automatic retry for failed image downloads
4. **Progress Persistence**: Save progress to database for recovery after server restart
5. **Image Optimization**: Automatically resize/compress images during processing

@@ -1,889 +0,0 @@

# StoryCove Search Migration Specification: Typesense to OpenSearch

## Executive Summary

This document specifies the migration from Typesense to OpenSearch for the StoryCove application. The migration will be implemented using a parallel approach, maintaining Typesense functionality while gradually transitioning to OpenSearch, ensuring zero downtime and the ability to rollback if needed.

**Migration Goals:**
- Solve random query reliability issues
- Improve complex filtering performance
- Maintain feature parity during transition
- Zero downtime migration
- Improved developer experience

---

## Current State Analysis

### Typesense Implementation Overview

**Service Architecture:**
- `TypesenseService.java` (~2000 lines) - Primary search service
- 3 search indexes: Stories, Authors, Collections
- Multi-library support with dynamic collection names
- Integration with Spring Boot backend

**Core Functionality:**
1. **Full-text Search**: Stories, Authors with complex query building
2. **Random Story Selection**: `_rand()` function with fallback logic
3. **Advanced Filtering**: 15+ filter conditions with boolean logic
4. **Faceting**: Tag aggregations and counts
5. **Autocomplete**: Search suggestions with typeahead
6. **CRUD Operations**: Index/update/delete for all entity types

**Current Issues Identified:**
- `_rand()` function unreliability requiring complex fallback logic
- Complex filter query building with escaping issues
- Limited aggregation capabilities
- Inconsistent API behavior across query patterns
- Multi-collection management complexity

### Data Models and Schema

**Story Index Fields:**
```java
// Core fields
UUID id, String title, String description, String sourceUrl
Integer wordCount, Integer rating, Integer volume
Boolean isRead, LocalDateTime lastReadAt, Integer readingPosition

// Relationships
UUID authorId, String authorName
UUID seriesId, String seriesName
List<String> tagNames

// Metadata
LocalDateTime createdAt, LocalDateTime updatedAt
String coverPath, String sourceDomain
```

**Author Index Fields:**
```java
UUID id, String name, String notes
Integer authorRating, Double averageStoryRating, Integer storyCount
List<String> urls, String avatarImagePath
LocalDateTime createdAt, LocalDateTime updatedAt
```

**Collection Index Fields:**
```java
UUID id, String name, String description
List<String> tagNames, Boolean archived
LocalDateTime createdAt, LocalDateTime updatedAt
Integer storyCount, Integer currentPosition
```

### API Endpoints Current State

**Search Endpoints Analysis:**

**✅ USED by Frontend (Must Implement):**
- `GET /api/stories/search` - Main story search with complex filtering (CRITICAL)
- `GET /api/stories/random` - Random story selection with filters (CRITICAL)
- `GET /api/authors/search-typesense` - Author search (HIGH)
- `GET /api/tags/autocomplete` - Tag suggestions (MEDIUM)
- `POST /api/stories/reindex-typesense` - Admin reindex operations (MEDIUM)
- `POST /api/authors/reindex-typesense` - Admin reindex operations (MEDIUM)
- `POST /api/stories/recreate-typesense-collection` - Admin recreate (MEDIUM)
- `POST /api/authors/recreate-typesense-collection` - Admin recreate (MEDIUM)

**❌ UNUSED by Frontend (Skip Implementation):**
- `GET /api/stories/search/suggestions` - Not used by frontend
- `GET /api/authors/search` - Superseded by typesense version
- `GET /api/series/search` - Not used by frontend
- `GET /api/tags/search` - Superseded by autocomplete
- `POST /api/search/reindex` - Not used by frontend
- `GET /api/search/health` - Not used by frontend

**Scope Reduction: ~40% fewer endpoints to implement**

**Search Parameters (Stories):**
```
query, page, size, authors[], tags[], minRating, maxRating
sortBy, sortDir, facetBy[]
minWordCount, maxWordCount, createdAfter, createdBefore
lastReadAfter, lastReadBefore, unratedOnly, readingStatus
hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter
minTagCount, popularOnly, hiddenGemsOnly
```

---

## Target OpenSearch Architecture

### Service Layer Design

**New Components:**
```
OpenSearchService.java - Primary search service (mirrors TypesenseService API)
OpenSearchConfig.java - Configuration and client setup
SearchMigrationService.java - Handles parallel operation during migration
SearchServiceAdapter.java - Abstraction layer for service switching
```

**Index Strategy:**
- **Single-node deployment** for development/small installations
- **Index-per-library** approach: `stories-{libraryId}`, `authors-{libraryId}`, `collections-{libraryId}`
- **Index templates** for consistent mapping across libraries
- **Aliases** for easy switching and zero-downtime updates

### OpenSearch Index Mappings

**Stories Index Mapping:**
```json
{
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0,
    "analysis": {
      "analyzer": {
        "story_analyzer": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": ["lowercase", "stop", "snowball"]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "id": {"type": "keyword"},
      "title": {
        "type": "text",
        "analyzer": "story_analyzer",
        "fields": {"keyword": {"type": "keyword"}}
      },
      "description": {
        "type": "text",
        "analyzer": "story_analyzer"
      },
      "authorName": {
        "type": "text",
        "analyzer": "story_analyzer",
        "fields": {"keyword": {"type": "keyword"}}
      },
      "seriesName": {
        "type": "text",
        "fields": {"keyword": {"type": "keyword"}}
      },
      "tagNames": {"type": "keyword"},
      "wordCount": {"type": "integer"},
      "rating": {"type": "integer"},
      "volume": {"type": "integer"},
      "isRead": {"type": "boolean"},
      "readingPosition": {"type": "integer"},
      "lastReadAt": {"type": "date"},
      "createdAt": {"type": "date"},
      "updatedAt": {"type": "date"},
      "coverPath": {"type": "keyword"},
      "sourceUrl": {"type": "keyword"},
      "sourceDomain": {"type": "keyword"}
    }
  }
}
```

**Authors Index Mapping:**
```json
{
  "mappings": {
    "properties": {
      "id": {"type": "keyword"},
      "name": {
        "type": "text",
        "analyzer": "story_analyzer",
        "fields": {"keyword": {"type": "keyword"}}
      },
      "notes": {"type": "text"},
      "authorRating": {"type": "integer"},
      "averageStoryRating": {"type": "float"},
      "storyCount": {"type": "integer"},
      "urls": {"type": "keyword"},
      "avatarImagePath": {"type": "keyword"},
      "createdAt": {"type": "date"},
      "updatedAt": {"type": "date"}
    }
  }
}
```

**Collections Index Mapping:**
```json
{
  "mappings": {
    "properties": {
      "id": {"type": "keyword"},
      "name": {
        "type": "text",
        "fields": {"keyword": {"type": "keyword"}}
      },
      "description": {"type": "text"},
      "tagNames": {"type": "keyword"},
      "archived": {"type": "boolean"},
      "storyCount": {"type": "integer"},
      "currentPosition": {"type": "integer"},
      "createdAt": {"type": "date"},
      "updatedAt": {"type": "date"}
    }
  }
}
```

### Query Translation Strategy

**Random Story Queries:**
```java
// Typesense (problematic)
String sortBy = seed != null ? "_rand(" + seed + ")" : "_rand()";

// OpenSearch (reliable)
QueryBuilder randomQuery = QueryBuilders.functionScoreQuery(
    QueryBuilders.boolQuery().must(filters),
    ScoreFunctionBuilders.randomFunction(seed != null ? seed.intValue() : null)
);
```

**Complex Filtering:**
```java
// Build bool query with multiple filter conditions
BoolQueryBuilder boolQuery = QueryBuilders.boolQuery()
    .must(QueryBuilders.multiMatchQuery(query, "title", "description", "authorName"))
    .filter(QueryBuilders.termsQuery("tagNames", tags))
    .filter(QueryBuilders.rangeQuery("wordCount").gte(minWords).lte(maxWords))
    .filter(QueryBuilders.rangeQuery("rating").gte(minRating).lte(maxRating));
```

**Faceting/Aggregations:**
```java
// Tags aggregation
AggregationBuilder tagsAgg = AggregationBuilders
    .terms("tags")
    .field("tagNames")
    .size(100);

// Rating ranges
AggregationBuilder ratingRanges = AggregationBuilders
    .range("rating_ranges")
    .field("rating")
    .addRange("unrated", 0, 1)
    .addRange("low", 1, 3)
    .addRange("high", 4, 6);
```

---

## Revised Implementation Phases (Scope Reduced by 40%)

### Phase 1: Infrastructure Setup (Week 1)

**Objectives:**
- Add OpenSearch to Docker Compose
- Create basic OpenSearch service
- Establish index templates and mappings
- **Focus**: Only stories, authors, and tags indexes (skip series, collections)

**Deliverables:**
1. **Docker Compose Updates:**
   ```yaml
   opensearch:
     image: opensearchproject/opensearch:2.11.0
     environment:
       - discovery.type=single-node
       - DISABLE_SECURITY_PLUGIN=true
       - OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx1g
     ports:
       - "9200:9200"
     volumes:
       - opensearch_data:/usr/share/opensearch/data
   ```

2. **OpenSearchConfig.java:**
   ```java
   @Configuration
   @ConditionalOnProperty(name = "storycove.opensearch.enabled", havingValue = "true")
   public class OpenSearchConfig {
       @Bean
       public OpenSearchClient openSearchClient() {
           // Client configuration
       }
   }
   ```

3. **Basic Index Creation:**
   - Create index templates for stories, authors, collections
   - Implement index creation with proper mappings
   - Add health check endpoint

**Success Criteria:**
- OpenSearch container starts successfully
- Basic connectivity established
- Index templates created and validated

### Phase 2: Core Service Implementation (Week 2)

**Objectives:**
- Implement OpenSearchService with core functionality
- Create service abstraction layer
- Implement basic search operations
- **Focus**: Only critical endpoints (stories search, random, authors)

**Deliverables:**
1. **OpenSearchService.java** - Core service implementing:
   - `indexStory()`, `updateStory()`, `deleteStory()`
   - `searchStories()` with basic query support (CRITICAL)
   - `getRandomStoryId()` with reliable seed support (CRITICAL)
   - `indexAuthor()`, `updateAuthor()`, `deleteAuthor()`
   - `searchAuthors()` for authors page (HIGH)
   - `bulkIndexStories()`, `bulkIndexAuthors()` for initial data loading

2. **SearchServiceAdapter.java** - Abstraction layer:
   ```java
   @Service
   public class SearchServiceAdapter {
       @Autowired(required = false)
       private TypesenseService typesenseService;

       @Autowired(required = false)
       private OpenSearchService openSearchService;

       @Value("${storycove.search.provider:typesense}")
       private String searchProvider;

       public SearchResultDto<StorySearchDto> searchStories(...) {
           return "opensearch".equals(searchProvider)
               ? openSearchService.searchStories(...)
               : typesenseService.searchStories(...);
       }
   }
   ```

3. **Basic Query Implementation:**
   - Full-text search across title/description/author
   - Basic filtering (tags, rating, word count)
   - Pagination and sorting

**Success Criteria:**
- Basic search functionality working
- Service abstraction layer functional
- Can switch between Typesense and OpenSearch via configuration

### Phase 3: Advanced Features Implementation (Week 3)

**Objectives:**
- Implement complex filtering (all 15+ filter types)
- Add random story functionality
- Implement faceting/aggregations
- Add autocomplete/suggestions

**Deliverables:**
1. **Complex Query Builder:**
   - All filter conditions from original implementation
   - Date range filtering with proper timezone handling
   - Boolean logic for reading status, coverage, series filters

2. **Random Story Implementation:**
   ```java
   public Optional<UUID> getRandomStoryId(String searchQuery, List<String> tags, Long seed, ...) {
       BoolQueryBuilder baseQuery = buildFilterQuery(searchQuery, tags, ...);

       QueryBuilder randomQuery = QueryBuilders.functionScoreQuery(
           baseQuery,
           ScoreFunctionBuilders.randomFunction(seed != null ? seed.intValue() : null)
       );

       SearchRequest request = new SearchRequest("stories-" + getCurrentLibraryId())
           .source(new SearchSourceBuilder()
               .query(randomQuery)
               .size(1)
               .fetchSource(new String[]{"id"}, null));

       // Execute and return result
   }
   ```

3. **Faceting Implementation:**
   - Tag aggregations with counts
   - Rating range aggregations
   - Author aggregations
   - Custom facet builders

4. **Autocomplete Service:**
   - Suggest-based implementation using completion fields
   - Prefix matching for story titles and author names

**Success Criteria:**
- All filter conditions working correctly
- Random story selection reliable with seed support
- Faceting returns accurate counts
- Autocomplete responsive and accurate

### Phase 4: Data Migration & Parallel Operation (Week 4)

**Objectives:**
- Implement bulk data migration from database
- Enable parallel operation (write to both systems)
- Comprehensive testing of OpenSearch functionality

**Deliverables:**
1. **Migration Service:**
   ```java
   @Service
   public class SearchMigrationService {
       public void performFullMigration() {
           // Migrate all libraries
           List<Library> libraries = libraryService.findAll();
           for (Library library : libraries) {
               migrateLibraryData(library);
           }
       }

       private void migrateLibraryData(Library library) {
           // Create indexes for library
           // Bulk load stories, authors, collections
           // Verify data integrity
       }
   }
   ```

2. **Dual-Write Implementation:**
   - Modify all entity update operations to write to both systems
   - Add configuration flag for dual-write mode
   - Error handling for partial failures
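
To make the dual-write idea above concrete, here is a hedged sketch of how writes could fan out to both engines behind a configuration flag; the class, flag name, and method names are assumptions in the spirit of this spec, not code from the repository:

```java
// Illustrative dual-write sketch; the service types come from this spec, the wiring is assumed.
@Service
public class DualWriteSearchIndexer {

    private static final Logger log = LoggerFactory.getLogger(DualWriteSearchIndexer.class);

    @Autowired(required = false) private TypesenseService typesenseService;
    @Autowired(required = false) private OpenSearchService openSearchService;

    @Value("${storycove.search.dual-write:false}")
    private boolean dualWriteEnabled;

    public void indexStory(Story story) {
        // Primary engine first; Typesense stays the source of truth during migration.
        typesenseService.indexStory(story);
        if (dualWriteEnabled && openSearchService != null) {
            try {
                openSearchService.indexStory(story);
            } catch (Exception e) {
                // Partial failure: log and continue so the user-facing write still succeeds.
                log.warn("Dual-write to OpenSearch failed for story {}", story.getId(), e);
            }
        }
    }
}
```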

3. **Data Validation Tools:**
   - Compare search result counts between systems
   - Validate random story selection consistency
   - Check faceting accuracy

**Success Criteria:**
- Complete data migration with 100% accuracy
- Dual-write operations working without errors
- Search result parity between systems verified

### Phase 5: API Integration & Testing (Week 5)

**Objectives:**
- Update controller endpoints to use OpenSearch
- Comprehensive integration testing
- Performance testing and optimization

**Deliverables:**
1. **Controller Updates:**
   - Modify controllers to use SearchServiceAdapter
   - Add migration controls for gradual rollout
   - Implement A/B testing capability

2. **Integration Tests:**
   ```java
   @SpringBootTest
   @TestMethodOrder(OrderAnnotation.class)
   public class OpenSearchIntegrationTest {
       @Test
       @Order(1)
       void testBasicSearch() {
           // Test basic story search functionality
       }

       @Test
       @Order(2)
       void testComplexFiltering() {
           // Test all 15+ filter conditions
       }

       @Test
       @Order(3)
       void testRandomStory() {
           // Test random story with and without seed
       }

       @Test
       @Order(4)
       void testFaceting() {
           // Test aggregation accuracy
       }
   }
   ```

3. **Performance Testing:**
   - Load testing with realistic data volumes
   - Query performance benchmarking
   - Memory usage monitoring

**Success Criteria:**
- All integration tests passing
- Performance meets or exceeds Typesense baseline
- Memory usage within acceptable limits (< 2GB)

### Phase 6: Production Rollout & Monitoring (Week 6)

**Objectives:**
- Production deployment with feature flags
- Gradual user migration with monitoring
- Rollback capability testing

**Deliverables:**
1. **Feature Flag Implementation:**
   ```java
   @Component
   public class SearchFeatureFlags {
       @Value("${storycove.search.opensearch.enabled:false}")
       private boolean openSearchEnabled;

       @Value("${storycove.search.opensearch.percentage:0}")
       private int rolloutPercentage;

       public boolean shouldUseOpenSearch(String userId) {
           if (!openSearchEnabled) return false;
           return userId.hashCode() % 100 < rolloutPercentage;
       }
   }
   ```

2. **Monitoring & Alerting:**
   - Query performance metrics
   - Error rate monitoring
   - Search result accuracy validation
   - User experience metrics

3. **Rollback Procedures:**
   - Immediate rollback to Typesense capability
   - Data consistency verification
   - Performance rollback triggers

**Success Criteria:**
- Successful production deployment
- Zero user-facing issues during rollout
- Monitoring showing improved performance
- Rollback procedures validated

### Phase 7: Cleanup & Documentation (Week 7)

**Objectives:**
- Remove Typesense dependencies
- Update documentation
- Performance optimization

**Deliverables:**
1. **Code Cleanup:**
   - Remove TypesenseService and related classes
   - Clean up Docker Compose configuration
   - Remove unused dependencies

2. **Documentation Updates:**
   - Update deployment documentation
   - Search API documentation
   - Troubleshooting guides

3. **Performance Tuning:**
   - Index optimization
   - Query performance tuning
   - Resource allocation optimization

**Success Criteria:**
- Typesense completely removed
- Documentation up to date
- Optimized performance in production

---

## Data Migration Strategy

### Pre-Migration Validation

**Data Integrity Checks:**
1. Count validation: Ensure all stories/authors/collections are present
2. Field validation: Verify all required fields are populated
3. Relationship validation: Check author-story and series-story relationships
4. Library separation: Ensure proper multi-library data isolation

**Migration Process:**

1. **Index Creation:**
   ```java
   // Create indexes with proper mappings for each library
   for (Library library : libraries) {
       String storiesIndex = "stories-" + library.getId();
       createIndexWithMapping(storiesIndex, getStoriesMapping());
       createIndexWithMapping("authors-" + library.getId(), getAuthorsMapping());
       createIndexWithMapping("collections-" + library.getId(), getCollectionsMapping());
   }
   ```

2. **Bulk Data Loading:**
   ```java
   // Load in batches to manage memory usage
   int batchSize = 1000;
   List<Story> allStories = storyService.findByLibraryId(libraryId);

   for (int i = 0; i < allStories.size(); i += batchSize) {
       List<Story> batch = allStories.subList(i, Math.min(i + batchSize, allStories.size()));
       List<StoryDocument> documents = batch.stream()
           .map(this::convertToSearchDocument)
           .collect(Collectors.toList());

       bulkIndexStories(documents, "stories-" + libraryId);
   }
   ```

3. **Post-Migration Validation:**
   - Count comparison between database and OpenSearch
   - Spot-check random records for field accuracy
   - Test search functionality with known queries
   - Verify faceting counts match expected values

### Rollback Strategy

**Immediate Rollback Triggers:**
- Search error rate > 1%
- Query performance degradation > 50%
- Data inconsistency detected
- Memory usage > 4GB sustained

**Rollback Process:**
1. Update feature flag to disable OpenSearch
2. Verify Typesense still operational
3. Clear OpenSearch indexes to free resources
4. Investigate and document issues

**Data Consistency During Rollback:**
- Continue dual-write during investigation
- Re-sync any missed updates to OpenSearch
- Validate data integrity before retry

---

## Testing Strategy

### Unit Tests

**OpenSearchService Unit Tests:**
```java
@ExtendWith(MockitoExtension.class)
class OpenSearchServiceTest {
    @Mock private OpenSearchClient client;
    @InjectMocks private OpenSearchService service;

    @Test
    void testSearchStoriesBasicQuery() {
        // Mock OpenSearch response
        // Test basic search functionality
    }

    @Test
    void testComplexFilterQuery() {
        // Test complex boolean query building
    }

    @Test
    void testRandomStorySelection() {
        // Test random query with seed
    }
}
```

**Query Builder Tests:**
- Test all 15+ filter conditions
- Validate query structure and parameters
- Test edge cases and null handling

### Integration Tests

**Full Search Integration:**
```java
@SpringBootTest
@Testcontainers
class OpenSearchIntegrationTest {
    @Container
    static OpenSearchContainer opensearch = new OpenSearchContainer("opensearchproject/opensearch:2.11.0");

    @Test
    void testEndToEndStorySearch() {
        // Insert test data
        // Perform search via controller
        // Validate results
    }
}
```

### Performance Tests

**Load Testing Scenarios:**
1. **Concurrent Search Load:**
   - 50 concurrent users performing searches
   - Mixed query complexity
   - Duration: 10 minutes

2. **Bulk Indexing Performance:**
   - Index 10,000 stories in batches
   - Measure throughput and memory usage

3. **Random Query Performance:**
   - 1000 random story requests with different seeds
   - Compare with Typesense baseline

### Acceptance Tests

**Functional Requirements:**
- All existing search functionality preserved
- Random story selection improved reliability
- Faceting accuracy maintained
- Multi-library separation working

**Performance Requirements:**
- Search response time < 100ms for 95th percentile
- Random story selection < 50ms
- Index update operations < 10ms
- Memory usage < 2GB in production

---

## Risk Analysis & Mitigation

### Technical Risks

**Risk: OpenSearch Memory Usage**
- *Probability: Medium*
- *Impact: High*
- *Mitigation: Resource monitoring, index optimization, container limits*

**Risk: Query Performance Regression**
- *Probability: Low*
- *Impact: High*
- *Mitigation: Performance testing, query optimization, caching layer*

**Risk: Data Migration Accuracy**
- *Probability: Low*
- *Impact: Critical*
- *Mitigation: Comprehensive validation, dual-write verification, rollback procedures*

**Risk: Complex Filter Compatibility**
- *Probability: Medium*
- *Impact: Medium*
- *Mitigation: Extensive testing, gradual rollout, feature flags*

### Operational Risks

**Risk: Production Deployment Issues**
- *Probability: Medium*
- *Impact: High*
- *Mitigation: Staging environment testing, gradual rollout, immediate rollback capability*

**Risk: Team Learning Curve**
- *Probability: High*
- *Impact: Low*
- *Mitigation: Documentation, training, gradual responsibility transfer*

### Business Continuity

**Zero-Downtime Requirements:**
- Maintain Typesense during entire migration
- Feature flag-based switching
- Immediate rollback capability
- Health monitoring with automated alerts

---

## Success Criteria

### Functional Requirements ✅
- [ ] All search functionality migrated successfully
- [ ] Random story selection working reliably with seeds
- [ ] Complex filtering (15+ conditions) working accurately
- [ ] Faceting/aggregation results match expected values
- [ ] Multi-library support maintained
- [ ] Autocomplete functionality preserved

### Performance Requirements ✅
- [ ] Search response time ≤ 100ms (95th percentile)
- [ ] Random story selection ≤ 50ms
- [ ] Index operations ≤ 10ms
- [ ] Memory usage ≤ 2GB sustained
- [ ] Zero search downtime during migration

### Technical Requirements ✅
- [ ] Code quality maintained (test coverage ≥ 80%)
- [ ] Documentation updated and comprehensive
- [ ] Monitoring and alerting implemented
- [ ] Rollback procedures tested and validated
- [ ] Typesense dependencies cleanly removed

---

## Timeline Summary

| Phase | Duration | Key Deliverables | Risk Level |
|-------|----------|------------------|------------|
| 1. Infrastructure | 1 week | Docker setup, basic service | Low |
| 2. Core Service | 1 week | Basic search operations | Medium |
| 3. Advanced Features | 1 week | Complex filtering, random queries | High |
| 4. Data Migration | 1 week | Full data migration, dual-write | High |
| 5. API Integration | 1 week | Controller updates, testing | Medium |
| 6. Production Rollout | 1 week | Gradual deployment, monitoring | High |
| 7. Cleanup | 1 week | Remove Typesense, documentation | Low |

**Total Estimated Duration: 7 weeks**

---

## Configuration Management

### Environment Variables

```bash
# OpenSearch Configuration
OPENSEARCH_HOST=opensearch
OPENSEARCH_PORT=9200
OPENSEARCH_USERNAME=admin
OPENSEARCH_PASSWORD=${OPENSEARCH_PASSWORD}

# Feature Flags
STORYCOVE_OPENSEARCH_ENABLED=true
STORYCOVE_SEARCH_PROVIDER=opensearch
STORYCOVE_SEARCH_DUAL_WRITE=true
STORYCOVE_OPENSEARCH_ROLLOUT_PERCENTAGE=100

# Performance Tuning
OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx2g
STORYCOVE_SEARCH_BATCH_SIZE=1000
STORYCOVE_SEARCH_TIMEOUT=30s
```

### Docker Compose Updates

```yaml
# Add to docker-compose.yml
opensearch:
  image: opensearchproject/opensearch:2.11.0
  environment:
    - discovery.type=single-node
    - DISABLE_SECURITY_PLUGIN=true
    - OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx2g
  volumes:
    - opensearch_data:/usr/share/opensearch/data
  networks:
    - storycove-network

volumes:
  opensearch_data:
```

---

## Conclusion

This specification provides a comprehensive roadmap for migrating StoryCove from Typesense to OpenSearch. The phased approach ensures minimal risk while delivering improved reliability and performance, particularly for random story queries.

The parallel implementation strategy allows for thorough validation and provides confidence in the migration while maintaining the ability to rollback if issues arise. Upon successful completion, StoryCove will have a more robust and scalable search infrastructure that better supports its growth and feature requirements.

**Next Steps:**
1. Review and approve this specification
2. Set up development environment with OpenSearch
3. Begin Phase 1 implementation
4. Establish monitoring and success metrics
5. Execute migration according to timeline

---

*Document Version: 1.0*
*Last Updated: 2025-01-17*
*Author: Claude Code Assistant*

SOLR_LIBRARY_MIGRATION.md (new file, 244 lines)

@@ -0,0 +1,244 @@
# Solr Library Separation Migration Guide

This guide explains how to migrate existing StoryCove deployments to support proper library separation in Solr search.

## What Changed

The Solr service has been enhanced to support multi-tenant library separation by:
- Adding a `libraryId` field to all Solr documents
- Filtering all search queries by the current library context (see the sketch below)
- Ensuring complete data isolation between libraries
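
Conceptually, the change boils down to stamping every document with its library and adding a filter query on reads. A hedged SolrJ sketch of that idea follows; the core names match this guide, but the method shapes and how the current library is obtained are assumptions, not the project's actual code:

```java
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.SolrInputDocument;

// Illustrative only: shows the indexing and query sides of library separation.
public class LibraryAwareSolrExample {

    // Index side: every document carries the library it belongs to.
    static void indexStory(SolrClient solr, String storyId, String title, String libraryId) throws Exception {
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", storyId);
        doc.addField("title", title);
        doc.addField("libraryId", libraryId);           // the new field added by this migration
        solr.add("storycove_stories", doc);
        solr.commit("storycove_stories");
    }

    // Query side: every search is restricted to the caller's library.
    static QueryResponse searchStories(SolrClient solr, String userQuery, String libraryId) throws Exception {
        SolrQuery query = new SolrQuery(userQuery);
        query.addFilterQuery("libraryId:" + libraryId); // enforces isolation between libraries
        return solr.query("storycove_stories", query);
    }
}
```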

## Migration Options

### Option 1: Docker Volume Reset (Recommended for Docker)

**Best for**: Development, staging, and Docker-based deployments where data loss is acceptable.

```bash
# Stop the application
docker-compose down

# Remove only the Solr data volume (preserves database and images)
docker volume rm storycove_solr_data

# Restart - Solr will recreate cores with new schema
docker-compose up -d

# Wait for services to start, then trigger reindex via admin panel
```

**Pros**: Clean, simple, guaranteed to work
**Cons**: Requires downtime, loses existing search index

### Option 2: Schema API Migration (Production Safe)

**Best for**: Production environments where you need to preserve uptime.

**Method A: Automatic (Recommended)**
```bash
# Single endpoint that adds field and migrates data
curl -X POST "http://your-app-host/api/admin/search/solr/migrate-library-schema" \
  -H "Authorization: Bearer YOUR_JWT_TOKEN"
```

**Method B: Manual Steps**
```bash
# Step 1: Add libraryId field via app API
curl -X POST "http://your-app-host/api/admin/search/solr/add-library-field" \
  -H "Authorization: Bearer YOUR_JWT_TOKEN"

# Step 2: Run migration
curl -X POST "http://your-app-host/api/admin/search/solr/migrate-library-schema" \
  -H "Authorization: Bearer YOUR_JWT_TOKEN"
```

**Method C: Direct Solr API (if app API fails)**
```bash
# Add libraryId field to stories core
curl -X POST "http://your-solr-host:8983/solr/storycove_stories/schema" \
  -H "Content-Type: application/json" \
  -d '{
    "add-field": {
      "name": "libraryId",
      "type": "string",
      "indexed": true,
      "stored": true,
      "required": false
    }
  }'

# Add libraryId field to authors core
curl -X POST "http://your-solr-host:8983/solr/storycove_authors/schema" \
  -H "Content-Type: application/json" \
  -d '{
    "add-field": {
      "name": "libraryId",
      "type": "string",
      "indexed": true,
      "stored": true,
      "required": false
    }
  }'

# Then run the migration
curl -X POST "http://your-app-host/api/admin/search/solr/migrate-library-schema" \
  -H "Authorization: Bearer YOUR_JWT_TOKEN"
```

**Pros**: No downtime, preserves service availability, automatic field addition
**Cons**: Requires API access

### Option 3: Application-Level Migration (Recommended for Production)

**Best for**: Production environments with proper admin access.

1. **Deploy the code changes** to your environment
2. **Access the admin panel** of your application
3. **Navigate to search settings**
4. **Use the "Migrate Library Schema" button** or API endpoint:
   ```
   POST /api/admin/search/solr/migrate-library-schema
   ```

**Pros**: User-friendly, handles all complexity internally
**Cons**: Requires admin access to application

## Step-by-Step Migration Process

### For Docker Deployments

1. **Backup your data** (optional but recommended):
   ```bash
   # Backup database
   docker-compose exec postgres pg_dump -U storycove storycove > backup.sql
   ```

2. **Pull the latest code** with library separation fixes

3. **Choose migration approach**:
   - **Quick & Clean**: Use Option 1 (volume reset)
   - **Production**: Use Option 2 or 3

4. **Verify migration**:
   - Log in with different library passwords
   - Perform searches to confirm isolation
   - Check that new content gets indexed with library IDs

### For Kubernetes/Production Deployments

1. **Update your deployment** with the new container images

2. **Add the libraryId field** to Solr schema using Option 2

3. **Use the migration endpoint** (Option 3):
   ```bash
   kubectl exec -it deployment/storycove-backend -- \
     curl -X POST http://localhost:8080/api/admin/search/solr/migrate-library-schema
   ```

4. **Monitor logs** for successful migration

## Verification Steps

After migration, verify that library separation is working:

1. **Test with multiple libraries**:
   - Log in with Library A password
   - Add/search content
   - Log in with Library B password
   - Confirm Library A content is not visible

2. **Check Solr directly** (if accessible):
   ```bash
   # Should show documents with libraryId field
   curl "http://solr:8983/solr/storycove_stories/select?q=*:*&fl=id,title,libraryId&rows=5"
   ```

3. **Monitor application logs** for any library separation errors

## Troubleshooting

### "unknown field 'libraryId'" Error

**Problem**: `ERROR: [doc=xxx] unknown field 'libraryId'`

**Cause**: The Solr schema doesn't have the libraryId field yet.

**Solutions**:

1. **Use the automated migration** (adds field automatically):
   ```bash
   curl -X POST "http://your-app/api/admin/search/solr/migrate-library-schema"
   ```

2. **Add field manually first**:
   ```bash
   # Add field via app API
   curl -X POST "http://your-app/api/admin/search/solr/add-library-field"

   # Then run migration
   curl -X POST "http://your-app/api/admin/search/solr/migrate-library-schema"
   ```

3. **Direct Solr API** (if app API fails):
   ```bash
   # Add to both cores
   curl -X POST "http://solr:8983/solr/storycove_stories/schema" \
     -H "Content-Type: application/json" \
     -d '{"add-field":{"name":"libraryId","type":"string","indexed":true,"stored":true}}'

   curl -X POST "http://solr:8983/solr/storycove_authors/schema" \
     -H "Content-Type: application/json" \
     -d '{"add-field":{"name":"libraryId","type":"string","indexed":true,"stored":true}}'
   ```

4. **For development**: Use Option 1 (volume reset) for a clean restart

### Migration Endpoint Returns Error

Common causes:
- Solr is not available (check connectivity)
- No active library context (ensure user is authenticated)
- Insufficient permissions (check JWT token/authentication)

### Search Results Still Mixed

This indicates incomplete migration:
- Clear all Solr data and reindex completely
- Verify that all documents have libraryId field
- Check that search queries include library filters

## Environment-Specific Notes

### Development
- Use Option 1 (volume reset) for simplicity
- Data loss is acceptable in dev environments

### Staging
- Use Option 2 or 3 to test production migration procedures
- Verify migration process before applying to production

### Production
- **Always backup data first**
- Use Option 2 (Schema API) or Option 3 (Admin endpoint)
- Plan for brief performance impact during reindexing
- Monitor system resources during bulk reindexing

## Performance Considerations

- **Reindexing time**: Depends on data size (typically 1000 docs/second)
- **Memory usage**: May increase during bulk indexing
- **Search performance**: Minimal impact from library filtering
- **Storage**: Slight increase due to libraryId field

## Rollback Plan

If issues occur:

1. **Immediate**: Restart Solr to previous state (if using Option 1)
2. **Schema revert**: Remove libraryId field via Schema API
3. **Code rollback**: Deploy previous version without library separation
4. **Data restore**: Restore from backup if necessary

This migration enables proper multi-tenant isolation while maintaining search performance and functionality.

@@ -2,8 +2,13 @@ FROM openjdk:17-jdk-slim
 WORKDIR /app
 
-# Install Maven
-RUN apt-get update && apt-get install -y maven && rm -rf /var/lib/apt/lists/*
+# Install Maven and PostgreSQL 15 client tools
+RUN apt-get update && apt-get install -y wget ca-certificates gnupg maven && \
+    wget --quiet -O - https://www.postgresql.org/media/keys/ACCC4CF8.asc | apt-key add - && \
+    echo "deb http://apt.postgresql.org/pub/repos/apt/ bullseye-pgdg main" > /etc/apt/sources.list.d/pgdg.list && \
+    apt-get update && \
+    apt-get install -y postgresql-client-15 && \
+    rm -rf /var/lib/apt/lists/*
 
 # Copy source code
 COPY . .

@@ -84,9 +84,25 @@
         <artifactId>httpclient5</artifactId>
         </dependency>
         <dependency>
-            <groupId>org.opensearch.client</groupId>
-            <artifactId>opensearch-java</artifactId>
-            <version>3.2.0</version>
+            <groupId>org.apache.solr</groupId>
+            <artifactId>solr-solrj</artifactId>
+            <version>9.9.0</version>
+        </dependency>
+        <dependency>
+            <groupId>org.eclipse.jetty</groupId>
+            <artifactId>jetty-client</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.eclipse.jetty</groupId>
+            <artifactId>jetty-util</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.eclipse.jetty</groupId>
+            <artifactId>jetty-http</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.eclipse.jetty</groupId>
+            <artifactId>jetty-io</artifactId>
         </dependency>
         <dependency>
             <groupId>org.apache.httpcomponents.core5</groupId>
@@ -2,10 +2,12 @@ package com.storycove;
 
 import org.springframework.boot.SpringApplication;
 import org.springframework.boot.autoconfigure.SpringBootApplication;
+import org.springframework.scheduling.annotation.EnableAsync;
 import org.springframework.scheduling.annotation.EnableScheduling;
 
 @SpringBootApplication
 @EnableScheduling
+@EnableAsync
 public class StoryCoveApplication {
 
     public static void main(String[] args) {
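`@EnableAsync` switches on Spring's `@Async` support, which the asynchronous image-processing path relies on. A minimal sketch of the kind of method this enables; the class and method names here are illustrative only, not taken from the change set:

```java
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;

import java.util.concurrent.CompletableFuture;

@Service
public class ExampleAsyncService {

    // With @EnableAsync on the application class, Spring runs this method on a
    // task-executor thread instead of the caller's request thread.
    @Async
    public CompletableFuture<String> doWork(String input) {
        // Long-running work would go here.
        return CompletableFuture.completedFuture("processed: " + input);
    }
}
```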
OpenSearchConfig.java (deleted file)
@@ -1,211 +0,0 @@
package com.storycove.config;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;
import org.apache.hc.client5.http.auth.AuthScope;
import org.apache.hc.client5.http.auth.UsernamePasswordCredentials;
import org.apache.hc.client5.http.impl.auth.BasicCredentialsProvider;
import org.apache.hc.client5.http.impl.nio.PoolingAsyncClientConnectionManager;
import org.apache.hc.client5.http.impl.nio.PoolingAsyncClientConnectionManagerBuilder;
import org.apache.hc.client5.http.ssl.ClientTlsStrategyBuilder;
import org.apache.hc.core5.http.HttpHost;
import org.apache.hc.core5.util.Timeout;
import org.opensearch.client.json.jackson.JacksonJsonpMapper;
import org.opensearch.client.opensearch.OpenSearchClient;
import org.opensearch.client.transport.OpenSearchTransport;
import org.opensearch.client.transport.httpclient5.ApacheHttpClient5TransportBuilder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;
import java.io.FileInputStream;
import java.security.KeyStore;
import java.security.cert.X509Certificate;

@Configuration
public class OpenSearchConfig {

    private static final Logger logger = LoggerFactory.getLogger(OpenSearchConfig.class);

    private final OpenSearchProperties properties;

    public OpenSearchConfig(@Qualifier("openSearchProperties") OpenSearchProperties properties) {
        this.properties = properties;
    }

    @Bean
    public OpenSearchClient openSearchClient() throws Exception {
        logger.info("Initializing OpenSearch client for profile: {}", properties.getProfile());

        // Create credentials provider
        BasicCredentialsProvider credentialsProvider = createCredentialsProvider();

        // Create SSL context based on environment
        SSLContext sslContext = createSSLContext();

        // Create connection manager with pooling
        PoolingAsyncClientConnectionManager connectionManager = createConnectionManager(sslContext);

        // Create custom ObjectMapper for proper date serialization
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.registerModule(new JavaTimeModule());
        objectMapper.disable(com.fasterxml.jackson.databind.SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);

        // Create the transport with all configurations and custom Jackson mapper
        OpenSearchTransport transport = ApacheHttpClient5TransportBuilder
                .builder(new HttpHost(properties.getScheme(), properties.getHost(), properties.getPort()))
                .setMapper(new JacksonJsonpMapper(objectMapper))
                .setHttpClientConfigCallback(httpClientBuilder -> {
                    // Only set credentials provider if authentication is configured
                    if (properties.getUsername() != null && !properties.getUsername().isEmpty() &&
                        properties.getPassword() != null && !properties.getPassword().isEmpty()) {
                        httpClientBuilder.setDefaultCredentialsProvider(credentialsProvider);
                    }

                    httpClientBuilder.setConnectionManager(connectionManager);

                    // Set timeouts
                    httpClientBuilder.setDefaultRequestConfig(
                            org.apache.hc.client5.http.config.RequestConfig.custom()
                                    .setConnectionRequestTimeout(Timeout.ofMilliseconds(properties.getConnection().getTimeout()))
                                    .setResponseTimeout(Timeout.ofMilliseconds(properties.getConnection().getSocketTimeout()))
                                    .build()
                    );

                    return httpClientBuilder;
                })
                .build();

        OpenSearchClient client = new OpenSearchClient(transport);

        // Test connection
        testConnection(client);

        return client;
    }

    private BasicCredentialsProvider createCredentialsProvider() {
        BasicCredentialsProvider credentialsProvider = new BasicCredentialsProvider();

        // Only set credentials if username and password are provided
        if (properties.getUsername() != null && !properties.getUsername().isEmpty() &&
            properties.getPassword() != null && !properties.getPassword().isEmpty()) {
            credentialsProvider.setCredentials(
                    new AuthScope(properties.getHost(), properties.getPort()),
                    new UsernamePasswordCredentials(
                            properties.getUsername(),
                            properties.getPassword().toCharArray()
                    )
            );
            logger.info("OpenSearch credentials configured for user: {}", properties.getUsername());
        } else {
            logger.info("OpenSearch running without authentication (no credentials configured)");
        }

        return credentialsProvider;
    }

    private SSLContext createSSLContext() throws Exception {
        SSLContext sslContext;

        if (isProduction() && !properties.getSecurity().isTrustAllCertificates()) {
            // Production SSL configuration with proper certificate validation
            sslContext = createProductionSSLContext();
        } else {
            // Development SSL configuration (trust all certificates)
            sslContext = createDevelopmentSSLContext();
        }

        return sslContext;
    }

    private SSLContext createProductionSSLContext() throws Exception {
        logger.info("Configuring production SSL context with certificate validation");

        SSLContext sslContext = SSLContext.getInstance("TLS");

        // Load custom keystore/truststore if provided
        if (properties.getSecurity().getTruststorePath() != null) {
            KeyStore trustStore = KeyStore.getInstance("JKS");
            try (FileInputStream fis = new FileInputStream(properties.getSecurity().getTruststorePath())) {
                trustStore.load(fis, properties.getSecurity().getTruststorePassword().toCharArray());
            }

            javax.net.ssl.TrustManagerFactory tmf =
                    javax.net.ssl.TrustManagerFactory.getInstance(javax.net.ssl.TrustManagerFactory.getDefaultAlgorithm());
            tmf.init(trustStore);

            sslContext.init(null, tmf.getTrustManagers(), null);
        } else {
            // Use default system SSL context for production
            sslContext.init(null, null, null);
        }

        return sslContext;
    }

    private SSLContext createDevelopmentSSLContext() throws Exception {
        logger.warn("Configuring development SSL context - TRUSTING ALL CERTIFICATES (not for production!)");

        SSLContext sslContext = SSLContext.getInstance("TLS");
        sslContext.init(null, new TrustManager[] {
                new X509TrustManager() {
                    public X509Certificate[] getAcceptedIssuers() { return null; }
                    public void checkClientTrusted(X509Certificate[] certs, String authType) {}
                    public void checkServerTrusted(X509Certificate[] certs, String authType) {}
                }
        }, null);

        return sslContext;
    }

    private PoolingAsyncClientConnectionManager createConnectionManager(SSLContext sslContext) {
        PoolingAsyncClientConnectionManagerBuilder builder = PoolingAsyncClientConnectionManagerBuilder.create();

        // Configure TLS strategy
        if (properties.getScheme().equals("https")) {
            if (isProduction() && properties.getSecurity().isSslVerification()) {
                // Production TLS with hostname verification
                builder.setTlsStrategy(ClientTlsStrategyBuilder.create()
                        .setSslContext(sslContext)
                        .build());
            } else {
                // Development TLS without hostname verification
                builder.setTlsStrategy(ClientTlsStrategyBuilder.create()
                        .setSslContext(sslContext)
                        .setHostnameVerifier((hostname, session) -> true)
                        .build());
            }
        }

        PoolingAsyncClientConnectionManager connectionManager = builder.build();

        // Configure connection pool settings
        connectionManager.setMaxTotal(properties.getConnection().getMaxConnectionsTotal());
        connectionManager.setDefaultMaxPerRoute(properties.getConnection().getMaxConnectionsPerRoute());

        return connectionManager;
    }

    private boolean isProduction() {
        return "production".equalsIgnoreCase(properties.getProfile());
    }

    private void testConnection(OpenSearchClient client) {
        try {
            var response = client.info();
            logger.info("OpenSearch connection successful - Version: {}, Cluster: {}",
                    response.version().number(),
                    response.clusterName());
        } catch (Exception e) {
            logger.warn("OpenSearch connection test failed during initialization: {}", e.getMessage());
            logger.debug("OpenSearch connection test full error", e);
            // Don't throw exception here - let the client be created and handle failures in service methods
        }
    }
}
OpenSearchProperties.java (deleted file)
@@ -1,164 +0,0 @@
package com.storycove.config;

import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

@Component
@ConfigurationProperties(prefix = "storycove.opensearch")
public class OpenSearchProperties {

    private String host = "localhost";
    private int port = 9200;
    private String scheme = "https";
    private String username = "admin";
    private String password;
    private String profile = "development";

    private Security security = new Security();
    private Connection connection = new Connection();
    private Indices indices = new Indices();
    private Bulk bulk = new Bulk();
    private Health health = new Health();

    // Getters and setters
    public String getHost() { return host; }
    public void setHost(String host) { this.host = host; }

    public int getPort() { return port; }
    public void setPort(int port) { this.port = port; }

    public String getScheme() { return scheme; }
    public void setScheme(String scheme) { this.scheme = scheme; }

    public String getUsername() { return username; }
    public void setUsername(String username) { this.username = username; }

    public String getPassword() { return password; }
    public void setPassword(String password) { this.password = password; }

    public String getProfile() { return profile; }
    public void setProfile(String profile) { this.profile = profile; }

    public Security getSecurity() { return security; }
    public void setSecurity(Security security) { this.security = security; }

    public Connection getConnection() { return connection; }
    public void setConnection(Connection connection) { this.connection = connection; }

    public Indices getIndices() { return indices; }
    public void setIndices(Indices indices) { this.indices = indices; }

    public Bulk getBulk() { return bulk; }
    public void setBulk(Bulk bulk) { this.bulk = bulk; }

    public Health getHealth() { return health; }
    public void setHealth(Health health) { this.health = health; }

    public static class Security {
        private boolean sslVerification = false;
        private boolean trustAllCertificates = true;
        private String keystorePath;
        private String keystorePassword;
        private String truststorePath;
        private String truststorePassword;

        // Getters and setters
        public boolean isSslVerification() { return sslVerification; }
        public void setSslVerification(boolean sslVerification) { this.sslVerification = sslVerification; }

        public boolean isTrustAllCertificates() { return trustAllCertificates; }
        public void setTrustAllCertificates(boolean trustAllCertificates) { this.trustAllCertificates = trustAllCertificates; }

        public String getKeystorePath() { return keystorePath; }
        public void setKeystorePath(String keystorePath) { this.keystorePath = keystorePath; }

        public String getKeystorePassword() { return keystorePassword; }
        public void setKeystorePassword(String keystorePassword) { this.keystorePassword = keystorePassword; }

        public String getTruststorePath() { return truststorePath; }
        public void setTruststorePath(String truststorePath) { this.truststorePath = truststorePath; }

        public String getTruststorePassword() { return truststorePassword; }
        public void setTruststorePassword(String truststorePassword) { this.truststorePassword = truststorePassword; }
    }

    public static class Connection {
        private int timeout = 30000;
        private int socketTimeout = 60000;
        private int maxConnectionsPerRoute = 10;
        private int maxConnectionsTotal = 30;
        private boolean retryOnFailure = true;
        private int maxRetries = 3;

        // Getters and setters
        public int getTimeout() { return timeout; }
        public void setTimeout(int timeout) { this.timeout = timeout; }

        public int getSocketTimeout() { return socketTimeout; }
        public void setSocketTimeout(int socketTimeout) { this.socketTimeout = socketTimeout; }

        public int getMaxConnectionsPerRoute() { return maxConnectionsPerRoute; }
        public void setMaxConnectionsPerRoute(int maxConnectionsPerRoute) { this.maxConnectionsPerRoute = maxConnectionsPerRoute; }

        public int getMaxConnectionsTotal() { return maxConnectionsTotal; }
        public void setMaxConnectionsTotal(int maxConnectionsTotal) { this.maxConnectionsTotal = maxConnectionsTotal; }

        public boolean isRetryOnFailure() { return retryOnFailure; }
        public void setRetryOnFailure(boolean retryOnFailure) { this.retryOnFailure = retryOnFailure; }

        public int getMaxRetries() { return maxRetries; }
        public void setMaxRetries(int maxRetries) { this.maxRetries = maxRetries; }
    }

    public static class Indices {
        private int defaultShards = 1;
        private int defaultReplicas = 0;
        private String refreshInterval = "1s";

        // Getters and setters
        public int getDefaultShards() { return defaultShards; }
        public void setDefaultShards(int defaultShards) { this.defaultShards = defaultShards; }

        public int getDefaultReplicas() { return defaultReplicas; }
        public void setDefaultReplicas(int defaultReplicas) { this.defaultReplicas = defaultReplicas; }

        public String getRefreshInterval() { return refreshInterval; }
        public void setRefreshInterval(String refreshInterval) { this.refreshInterval = refreshInterval; }
    }

    public static class Bulk {
        private int actions = 1000;
        private long size = 5242880; // 5MB
        private int timeout = 10000;
        private int concurrentRequests = 1;

        // Getters and setters
        public int getActions() { return actions; }
        public void setActions(int actions) { this.actions = actions; }

        public long getSize() { return size; }
        public void setSize(long size) { this.size = size; }

        public int getTimeout() { return timeout; }
        public void setTimeout(int timeout) { this.timeout = timeout; }

        public int getConcurrentRequests() { return concurrentRequests; }
        public void setConcurrentRequests(int concurrentRequests) { this.concurrentRequests = concurrentRequests; }
    }

    public static class Health {
        private int checkInterval = 30000;
        private int slowQueryThreshold = 5000;
        private boolean enableMetrics = true;

        // Getters and setters
        public int getCheckInterval() { return checkInterval; }
        public void setCheckInterval(int checkInterval) { this.checkInterval = checkInterval; }

        public int getSlowQueryThreshold() { return slowQueryThreshold; }
        public void setSlowQueryThreshold(int slowQueryThreshold) { this.slowQueryThreshold = slowQueryThreshold; }

        public boolean isEnableMetrics() { return enableMetrics; }
        public void setEnableMetrics(boolean enableMetrics) { this.enableMetrics = enableMetrics; }
    }
}
backend/src/main/java/com/storycove/config/SolrConfig.java (new file, 57 lines)
@@ -0,0 +1,57 @@
package com.storycove.config;

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@ConditionalOnProperty(
        value = "storycove.search.engine",
        havingValue = "solr",
        matchIfMissing = false
)
public class SolrConfig {

    private static final Logger logger = LoggerFactory.getLogger(SolrConfig.class);

    private final SolrProperties properties;

    public SolrConfig(SolrProperties properties) {
        this.properties = properties;
    }

    @Bean
    public SolrClient solrClient() {
        logger.info("Initializing Solr client with URL: {}", properties.getUrl());

        HttpSolrClient.Builder builder = new HttpSolrClient.Builder(properties.getUrl())
                .withConnectionTimeout(properties.getConnection().getTimeout())
                .withSocketTimeout(properties.getConnection().getSocketTimeout());

        SolrClient client = builder.build();

        logger.info("Solr running without authentication");

        // Test connection
        testConnection(client);

        return client;
    }

    private void testConnection(SolrClient client) {
        try {
            // Test connection by pinging the server
            var response = client.ping();
            logger.info("Solr connection successful - Response time: {}ms",
                    response.getElapsedTime());
        } catch (Exception e) {
            logger.warn("Solr connection test failed during initialization: {}", e.getMessage());
            logger.debug("Solr connection test full error", e);
            // Don't throw exception here - let the client be created and handle failures in service methods
        }
    }
}
backend/src/main/java/com/storycove/config/SolrProperties.java (new file, 140 lines)
@@ -0,0 +1,140 @@
package com.storycove.config;

import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

@Component
@ConfigurationProperties(prefix = "storycove.solr")
public class SolrProperties {

    private String url = "http://localhost:8983/solr";
    private String username;
    private String password;

    private Cores cores = new Cores();
    private Connection connection = new Connection();
    private Query query = new Query();
    private Commit commit = new Commit();
    private Health health = new Health();

    // Getters and setters
    public String getUrl() { return url; }
    public void setUrl(String url) { this.url = url; }

    public String getUsername() { return username; }
    public void setUsername(String username) { this.username = username; }

    public String getPassword() { return password; }
    public void setPassword(String password) { this.password = password; }

    public Cores getCores() { return cores; }
    public void setCores(Cores cores) { this.cores = cores; }

    public Connection getConnection() { return connection; }
    public void setConnection(Connection connection) { this.connection = connection; }

    public Query getQuery() { return query; }
    public void setQuery(Query query) { this.query = query; }

    public Commit getCommit() { return commit; }
    public void setCommit(Commit commit) { this.commit = commit; }

    public Health getHealth() { return health; }
    public void setHealth(Health health) { this.health = health; }

    public static class Cores {
        private String stories = "storycove_stories";
        private String authors = "storycove_authors";

        // Getters and setters
        public String getStories() { return stories; }
        public void setStories(String stories) { this.stories = stories; }

        public String getAuthors() { return authors; }
        public void setAuthors(String authors) { this.authors = authors; }
    }

    public static class Connection {
        private int timeout = 30000;
        private int socketTimeout = 60000;
        private int maxConnectionsPerRoute = 10;
        private int maxConnectionsTotal = 30;
        private boolean retryOnFailure = true;
        private int maxRetries = 3;

        // Getters and setters
        public int getTimeout() { return timeout; }
        public void setTimeout(int timeout) { this.timeout = timeout; }

        public int getSocketTimeout() { return socketTimeout; }
        public void setSocketTimeout(int socketTimeout) { this.socketTimeout = socketTimeout; }

        public int getMaxConnectionsPerRoute() { return maxConnectionsPerRoute; }
        public void setMaxConnectionsPerRoute(int maxConnectionsPerRoute) { this.maxConnectionsPerRoute = maxConnectionsPerRoute; }

        public int getMaxConnectionsTotal() { return maxConnectionsTotal; }
        public void setMaxConnectionsTotal(int maxConnectionsTotal) { this.maxConnectionsTotal = maxConnectionsTotal; }

        public boolean isRetryOnFailure() { return retryOnFailure; }
        public void setRetryOnFailure(boolean retryOnFailure) { this.retryOnFailure = retryOnFailure; }

        public int getMaxRetries() { return maxRetries; }
        public void setMaxRetries(int maxRetries) { this.maxRetries = maxRetries; }
    }

    public static class Query {
        private int defaultRows = 10;
        private int maxRows = 1000;
        private String defaultOperator = "AND";
        private boolean highlight = true;
        private boolean facets = true;

        // Getters and setters
        public int getDefaultRows() { return defaultRows; }
        public void setDefaultRows(int defaultRows) { this.defaultRows = defaultRows; }

        public int getMaxRows() { return maxRows; }
        public void setMaxRows(int maxRows) { this.maxRows = maxRows; }

        public String getDefaultOperator() { return defaultOperator; }
        public void setDefaultOperator(String defaultOperator) { this.defaultOperator = defaultOperator; }

        public boolean isHighlight() { return highlight; }
        public void setHighlight(boolean highlight) { this.highlight = highlight; }

        public boolean isFacets() { return facets; }
        public void setFacets(boolean facets) { this.facets = facets; }
    }

    public static class Commit {
        private boolean softCommit = true;
        private int commitWithin = 1000;
        private boolean waitSearcher = false;

        // Getters and setters
        public boolean isSoftCommit() { return softCommit; }
        public void setSoftCommit(boolean softCommit) { this.softCommit = softCommit; }

        public int getCommitWithin() { return commitWithin; }
        public void setCommitWithin(int commitWithin) { this.commitWithin = commitWithin; }

        public boolean isWaitSearcher() { return waitSearcher; }
        public void setWaitSearcher(boolean waitSearcher) { this.waitSearcher = waitSearcher; }
    }

    public static class Health {
        private int checkInterval = 30000;
        private int slowQueryThreshold = 5000;
        private boolean enableMetrics = true;

        // Getters and setters
        public int getCheckInterval() { return checkInterval; }
        public void setCheckInterval(int checkInterval) { this.checkInterval = checkInterval; }

        public int getSlowQueryThreshold() { return slowQueryThreshold; }
        public void setSlowQueryThreshold(int slowQueryThreshold) { this.slowQueryThreshold = slowQueryThreshold; }

        public boolean isEnableMetrics() { return enableMetrics; }
        public void setEnableMetrics(boolean enableMetrics) { this.enableMetrics = enableMetrics; }
    }
}
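These properties bind under the `storycove.solr` prefix. A minimal sketch of the matching Spring Boot configuration, using the class defaults above and the `storycove.search.engine` switch from `SolrConfig`'s `@ConditionalOnProperty` (the exact values in the deployed config are not shown in this change set):

```properties
# Enable the Solr-backed search path
storycove.search.engine=solr

# Connection and core settings bound into SolrProperties
storycove.solr.url=http://localhost:8983/solr
storycove.solr.cores.stories=storycove_stories
storycove.solr.cores.authors=storycove_authors
storycove.solr.connection.timeout=30000
storycove.solr.connection.socket-timeout=60000
```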
@@ -3,7 +3,7 @@ package com.storycove.controller;
 import com.storycove.entity.Author;
 import com.storycove.entity.Story;
 import com.storycove.service.AuthorService;
-import com.storycove.service.OpenSearchService;
+import com.storycove.service.SolrService;
 import com.storycove.service.SearchServiceAdapter;
 import com.storycove.service.StoryService;
 import org.slf4j.Logger;
@@ -16,7 +16,7 @@ import java.util.List;
 import java.util.Map;
 
 /**
- * Admin controller for managing OpenSearch operations.
+ * Admin controller for managing Solr operations.
  * Provides endpoints for reindexing and index management.
  */
 @RestController
@@ -35,7 +35,7 @@ public class AdminSearchController {
     private AuthorService authorService;
 
     @Autowired(required = false)
-    private OpenSearchService openSearchService;
+    private SolrService solrService;
 
     /**
      * Get current search status
@@ -48,7 +48,7 @@ public class AdminSearchController {
             return ResponseEntity.ok(Map.of(
                     "primaryEngine", status.getPrimaryEngine(),
                     "dualWrite", status.isDualWrite(),
-                    "openSearchAvailable", status.isOpenSearchAvailable()
+                    "solrAvailable", status.isSolrAvailable()
             ));
         } catch (Exception e) {
             logger.error("Error getting search status", e);
@@ -59,17 +59,17 @@ public class AdminSearchController {
     }
 
     /**
-     * Reindex all data in OpenSearch
+     * Reindex all data in Solr
      */
-    @PostMapping("/opensearch/reindex")
-    public ResponseEntity<Map<String, Object>> reindexOpenSearch() {
+    @PostMapping("/solr/reindex")
+    public ResponseEntity<Map<String, Object>> reindexSolr() {
         try {
-            logger.info("Starting OpenSearch full reindex");
+            logger.info("Starting Solr full reindex");
 
             if (!searchServiceAdapter.isSearchServiceAvailable()) {
                 return ResponseEntity.badRequest().body(Map.of(
                         "success", false,
-                        "error", "OpenSearch is not available or healthy"
+                        "error", "Solr is not available or healthy"
                 ));
             }
 
@@ -77,14 +77,14 @@ public class AdminSearchController {
             List<Story> allStories = storyService.findAllWithAssociations();
             List<Author> allAuthors = authorService.findAllWithStories();
 
-            // Bulk index directly in OpenSearch
-            if (openSearchService != null) {
-                openSearchService.bulkIndexStories(allStories);
-                openSearchService.bulkIndexAuthors(allAuthors);
+            // Bulk index directly in Solr
+            if (solrService != null) {
+                solrService.bulkIndexStories(allStories);
+                solrService.bulkIndexAuthors(allAuthors);
             } else {
                 return ResponseEntity.badRequest().body(Map.of(
                         "success", false,
-                        "error", "OpenSearch service not available"
+                        "error", "Solr service not available"
                 ));
             }
 
@@ -92,7 +92,7 @@ public class AdminSearchController {
 
             return ResponseEntity.ok(Map.of(
                     "success", true,
-                    "message", String.format("Reindexed %d stories and %d authors in OpenSearch",
+                    "message", String.format("Reindexed %d stories and %d authors in Solr",
                             allStories.size(), allAuthors.size()),
                     "storiesCount", allStories.size(),
                     "authorsCount", allAuthors.size(),
@@ -100,36 +100,36 @@ public class AdminSearchController {
             ));
 
         } catch (Exception e) {
-            logger.error("Error during OpenSearch reindex", e);
+            logger.error("Error during Solr reindex", e);
             return ResponseEntity.internalServerError().body(Map.of(
                     "success", false,
-                    "error", "OpenSearch reindex failed: " + e.getMessage()
+                    "error", "Solr reindex failed: " + e.getMessage()
             ));
         }
     }
 
     /**
-     * Recreate OpenSearch indices
+     * Recreate Solr indices
      */
-    @PostMapping("/opensearch/recreate")
-    public ResponseEntity<Map<String, Object>> recreateOpenSearchIndices() {
+    @PostMapping("/solr/recreate")
+    public ResponseEntity<Map<String, Object>> recreateSolrIndices() {
         try {
-            logger.info("Starting OpenSearch indices recreation");
+            logger.info("Starting Solr indices recreation");
 
             if (!searchServiceAdapter.isSearchServiceAvailable()) {
                 return ResponseEntity.badRequest().body(Map.of(
                         "success", false,
-                        "error", "OpenSearch is not available or healthy"
+                        "error", "Solr is not available or healthy"
                 ));
             }
 
             // Recreate indices
-            if (openSearchService != null) {
-                openSearchService.recreateIndices();
+            if (solrService != null) {
+                solrService.recreateIndices();
             } else {
                 return ResponseEntity.badRequest().body(Map.of(
                         "success", false,
-                        "error", "OpenSearch service not available"
+                        "error", "Solr service not available"
                 ));
             }
 
@@ -138,14 +138,14 @@ public class AdminSearchController {
             List<Author> allAuthors = authorService.findAllWithStories();
 
             // Bulk index after recreation
-            openSearchService.bulkIndexStories(allStories);
-            openSearchService.bulkIndexAuthors(allAuthors);
+            solrService.bulkIndexStories(allStories);
+            solrService.bulkIndexAuthors(allAuthors);
 
             int totalIndexed = allStories.size() + allAuthors.size();
 
             return ResponseEntity.ok(Map.of(
                     "success", true,
-                    "message", String.format("Recreated OpenSearch indices and indexed %d stories and %d authors",
+                    "message", String.format("Recreated Solr indices and indexed %d stories and %d authors",
                             allStories.size(), allAuthors.size()),
                     "storiesCount", allStories.size(),
                     "authorsCount", allAuthors.size(),
@@ -153,10 +153,156 @@ public class AdminSearchController {
             ));
 
         } catch (Exception e) {
-            logger.error("Error during OpenSearch indices recreation", e);
+            logger.error("Error during Solr indices recreation", e);
             return ResponseEntity.internalServerError().body(Map.of(
                     "success", false,
-                    "error", "OpenSearch indices recreation failed: " + e.getMessage()
+                    "error", "Solr indices recreation failed: " + e.getMessage()
+            ));
+        }
+    }
+
+    /**
+     * Add libraryId field to Solr schema via Schema API.
+     * This is a prerequisite for library-aware indexing.
+     */
+    @PostMapping("/solr/add-library-field")
+    public ResponseEntity<Map<String, Object>> addLibraryField() {
+        try {
+            logger.info("Starting Solr libraryId field addition");
+
+            if (!searchServiceAdapter.isSearchServiceAvailable()) {
+                return ResponseEntity.badRequest().body(Map.of(
+                        "success", false,
+                        "error", "Solr is not available or healthy"
+                ));
+            }
+
+            if (solrService == null) {
+                return ResponseEntity.badRequest().body(Map.of(
+                        "success", false,
+                        "error", "Solr service not available"
+                ));
+            }
+
+            // Add the libraryId field to the schema
+            try {
+                solrService.addLibraryIdField();
+                logger.info("libraryId field added successfully to schema");
+
+                return ResponseEntity.ok(Map.of(
+                        "success", true,
+                        "message", "libraryId field added successfully to both stories and authors cores",
+                        "note", "You can now run the library schema migration"
+                ));
+
+            } catch (Exception e) {
+                logger.error("Failed to add libraryId field to schema", e);
+                return ResponseEntity.internalServerError().body(Map.of(
+                        "success", false,
+                        "error", "Failed to add libraryId field to schema: " + e.getMessage(),
+                        "details", "Check that Solr is accessible and schema is modifiable"
+                ));
+            }
+
+        } catch (Exception e) {
+            logger.error("Error during libraryId field addition", e);
+            return ResponseEntity.internalServerError().body(Map.of(
+                    "success", false,
+                    "error", "libraryId field addition failed: " + e.getMessage()
+            ));
+        }
+    }
+
+    /**
+     * Migrate to library-aware Solr schema.
+     * This endpoint handles the migration from non-library-aware to library-aware indexing.
+     * It clears existing data and reindexes with library context.
+     */
+    @PostMapping("/solr/migrate-library-schema")
+    public ResponseEntity<Map<String, Object>> migrateLibrarySchema() {
+        try {
+            logger.info("Starting Solr library schema migration");
+
+            if (!searchServiceAdapter.isSearchServiceAvailable()) {
+                return ResponseEntity.badRequest().body(Map.of(
+                        "success", false,
+                        "error", "Solr is not available or healthy"
+                ));
+            }
+
+            if (solrService == null) {
+                return ResponseEntity.badRequest().body(Map.of(
+                        "success", false,
+                        "error", "Solr service not available"
+                ));
+            }
+
+            logger.info("Adding libraryId field to Solr schema");
+
+            // First, add the libraryId field to the schema via Schema API
+            try {
+                solrService.addLibraryIdField();
+                logger.info("libraryId field added successfully to schema");
+            } catch (Exception e) {
+                logger.error("Failed to add libraryId field to schema", e);
+                return ResponseEntity.badRequest().body(Map.of(
+                        "success", false,
+                        "error", "Failed to add libraryId field to schema: " + e.getMessage(),
+                        "details", "The schema must support the libraryId field before migration"
+                ));
+            }
+
+            logger.info("Clearing existing Solr data for library schema migration");
+
+            // Clear existing data that doesn't have libraryId
+            try {
+                solrService.recreateIndices();
+            } catch (Exception e) {
+                logger.warn("Could not recreate indices (expected in production): {}", e.getMessage());
+                // In production, just clear the data instead
+                try {
+                    solrService.clearAllDocuments();
+                    logger.info("Cleared all documents from Solr cores");
+                } catch (Exception clearError) {
+                    logger.error("Failed to clear documents", clearError);
+                    return ResponseEntity.badRequest().body(Map.of(
+                            "success", false,
+                            "error", "Failed to clear existing data: " + clearError.getMessage()
+                    ));
+                }
+            }
+
+            // Get all data and reindex with library context
+            List<Story> allStories = storyService.findAllWithAssociations();
+            List<Author> allAuthors = authorService.findAllWithStories();
+
+            logger.info("Reindexing {} stories and {} authors with library context",
+                    allStories.size(), allAuthors.size());
+
+            // Bulk index everything (will now include libraryId from current library context)
+            solrService.bulkIndexStories(allStories);
+            solrService.bulkIndexAuthors(allAuthors);
+
+            int totalIndexed = allStories.size() + allAuthors.size();
+
+            logger.info("Solr library schema migration completed successfully");
+
+            return ResponseEntity.ok(Map.of(
+                    "success", true,
+                    "message", String.format("Library schema migration completed. Reindexed %d stories and %d authors with library context.",
+                            allStories.size(), allAuthors.size()),
+                    "storiesCount", allStories.size(),
+                    "authorsCount", allAuthors.size(),
+                    "totalCount", totalIndexed,
+                    "note", "Ensure libraryId field exists in Solr schema before running this migration"
+            ));
+
+        } catch (Exception e) {
+            logger.error("Error during Solr library schema migration", e);
+            return ResponseEntity.internalServerError().body(Map.of(
+                    "success", false,
+                    "error", "Library schema migration failed: " + e.getMessage(),
+                    "details", "Make sure the libraryId field has been added to both stories and authors Solr cores"
             ));
         }
     }
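The controller above delegates to `solrService.addLibraryIdField()`, which is not shown in this change set. A hedged sketch of how such a method might use SolrJ's Schema API (the helper class and the field attributes below are assumptions, not code from this repository):

```java
import java.util.LinkedHashMap;
import java.util.Map;

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.request.schema.SchemaRequest;

public class LibraryFieldSchemaSketch {

    // Adds a single-valued, indexed, stored string field named "libraryId" to one core.
    public static void addLibraryIdField(SolrClient client, String core) throws Exception {
        Map<String, Object> field = new LinkedHashMap<>();
        field.put("name", "libraryId");
        field.put("type", "string");
        field.put("indexed", true);
        field.put("stored", true);
        new SchemaRequest.AddField(field).process(client, core);
    }
}
```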
@@ -291,7 +291,7 @@ public class CollectionController {
             // Collections are not indexed in search engine yet
             return ResponseEntity.ok(Map.of(
                     "success", true,
-                    "message", "Collections indexing not yet implemented in OpenSearch",
+                    "message", "Collections indexing not yet implemented in Solr",
                     "count", allCollections.size()
             ));
         } catch (Exception e) {
@@ -3,27 +3,43 @@ package com.storycove.controller;
 import com.storycove.dto.HtmlSanitizationConfigDto;
 import com.storycove.service.HtmlSanitizationService;
 import com.storycove.service.ImageService;
+import com.storycove.service.StoryService;
+import com.storycove.entity.Story;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.beans.factory.annotation.Value;
 import org.springframework.http.ResponseEntity;
 import org.springframework.web.bind.annotation.*;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 
 import java.util.Map;
+import java.util.List;
+import java.util.HashMap;
+import java.util.Optional;
+import java.util.UUID;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.nio.file.Files;
+import java.io.IOException;
 
 @RestController
 @RequestMapping("/api/config")
 public class ConfigController {
 
+    private static final Logger logger = LoggerFactory.getLogger(ConfigController.class);
+
     private final HtmlSanitizationService htmlSanitizationService;
     private final ImageService imageService;
+    private final StoryService storyService;
 
     @Value("${app.reading.speed.default:200}")
     private int defaultReadingSpeed;
 
     @Autowired
-    public ConfigController(HtmlSanitizationService htmlSanitizationService, ImageService imageService) {
+    public ConfigController(HtmlSanitizationService htmlSanitizationService, ImageService imageService, StoryService storyService) {
         this.htmlSanitizationService = htmlSanitizationService;
         this.imageService = imageService;
+        this.storyService = storyService;
     }
 
     /**
@@ -61,27 +77,55 @@ public class ConfigController {
     @PostMapping("/cleanup/images/preview")
     public ResponseEntity<Map<String, Object>> previewImageCleanup() {
         try {
+            logger.info("Starting image cleanup preview");
             ImageService.ContentImageCleanupResult result = imageService.cleanupOrphanedContentImages(true);
 
-            Map<String, Object> response = Map.of(
-                "success", true,
-                "orphanedCount", result.getOrphanedImages().size(),
-                "totalSizeBytes", result.getTotalSizeBytes(),
-                "formattedSize", result.getFormattedSize(),
-                "foldersToDelete", result.getFoldersToDelete(),
-                "referencedImagesCount", result.getTotalReferencedImages(),
-                "errors", result.getErrors(),
-                "hasErrors", result.hasErrors(),
-                "dryRun", true
-            );
+            // Create detailed file information with story relationships
+            logger.info("Processing {} orphaned files for detailed information", result.getOrphanedImages().size());
+            List<Map<String, Object>> orphanedFiles = result.getOrphanedImages().stream()
+                    .map(filePath -> {
+                        try {
+                            return createFileInfo(filePath);
+                        } catch (Exception e) {
+                            logger.error("Error processing file {}: {}", filePath, e.getMessage());
+                            // Return a basic error entry instead of failing completely
+                            Map<String, Object> errorEntry = new HashMap<>();
+                            errorEntry.put("filePath", filePath);
+                            errorEntry.put("fileName", Paths.get(filePath).getFileName().toString());
+                            errorEntry.put("fileSize", 0L);
+                            errorEntry.put("formattedSize", "0 B");
+                            errorEntry.put("storyId", "error");
+                            errorEntry.put("storyTitle", null);
+                            errorEntry.put("storyExists", false);
+                            errorEntry.put("canAccessStory", false);
+                            errorEntry.put("error", e.getMessage());
+                            return errorEntry;
+                        }
+                    })
+                    .toList();
+
+            // Use HashMap to avoid Map.of() null value issues
+            Map<String, Object> response = new HashMap<>();
+            response.put("success", true);
+            response.put("orphanedCount", result.getOrphanedImages().size());
+            response.put("totalSizeBytes", result.getTotalSizeBytes());
+            response.put("formattedSize", result.getFormattedSize());
+            response.put("foldersToDelete", result.getFoldersToDelete());
+            response.put("referencedImagesCount", result.getTotalReferencedImages());
+            response.put("errors", result.getErrors());
+            response.put("hasErrors", result.hasErrors());
+            response.put("dryRun", true);
+            response.put("orphanedFiles", orphanedFiles);
+
+            logger.info("Image cleanup preview completed successfully");
             return ResponseEntity.ok(response);
 
         } catch (Exception e) {
-            return ResponseEntity.status(500).body(Map.of(
-                "success", false,
-                "error", "Failed to preview image cleanup: " + e.getMessage()
-            ));
+            logger.error("Failed to preview image cleanup", e);
+            Map<String, Object> errorResponse = new HashMap<>();
+            errorResponse.put("success", false);
+            errorResponse.put("error", "Failed to preview image cleanup: " + (e.getMessage() != null ? e.getMessage() : e.getClass().getSimpleName()));
+            return ResponseEntity.status(500).body(errorResponse);
         }
     }
 
@@ -114,4 +158,89 @@ public class ConfigController {
             ));
         }
     }
+
+    /**
+     * Create detailed file information for orphaned image including story relationship
+     */
+    private Map<String, Object> createFileInfo(String filePath) {
+        try {
+            Path path = Paths.get(filePath);
+            String fileName = path.getFileName().toString();
+            long fileSize = Files.exists(path) ? Files.size(path) : 0;
+
+            // Extract story UUID from the path (content images are stored in /content/{storyId}/)
+            String storyId = extractStoryIdFromPath(filePath);
+
+            // Look up the story if we have a valid UUID
+            Story relatedStory = null;
+            if (storyId != null) {
+                try {
+                    UUID storyUuid = UUID.fromString(storyId);
+                    relatedStory = storyService.findById(storyUuid);
+                } catch (Exception e) {
+                    logger.debug("Could not find story with ID {}: {}", storyId, e.getMessage());
+                }
+            }
+
+            Map<String, Object> fileInfo = new HashMap<>();
+            fileInfo.put("filePath", filePath);
+            fileInfo.put("fileName", fileName);
+            fileInfo.put("fileSize", fileSize);
+            fileInfo.put("formattedSize", formatBytes(fileSize));
+            fileInfo.put("storyId", storyId != null ? storyId : "unknown");
+            fileInfo.put("storyTitle", relatedStory != null ? relatedStory.getTitle() : null);
+            fileInfo.put("storyExists", relatedStory != null);
+            fileInfo.put("canAccessStory", relatedStory != null);
+
+            return fileInfo;
+        } catch (Exception e) {
+            logger.error("Error creating file info for {}: {}", filePath, e.getMessage());
+            Map<String, Object> errorInfo = new HashMap<>();
+            errorInfo.put("filePath", filePath);
+            errorInfo.put("fileName", Paths.get(filePath).getFileName().toString());
+            errorInfo.put("fileSize", 0L);
+            errorInfo.put("formattedSize", "0 B");
+            errorInfo.put("storyId", "error");
+            errorInfo.put("storyTitle", null);
+            errorInfo.put("storyExists", false);
+            errorInfo.put("canAccessStory", false);
+            errorInfo.put("error", e.getMessage() != null ? e.getMessage() : e.getClass().getSimpleName());
+            return errorInfo;
+        }
+    }
+
+    /**
+     * Extract story ID from content image file path
+     */
+    private String extractStoryIdFromPath(String filePath) {
+        try {
+            // Content images are stored in: /path/to/uploads/content/{storyId}/filename.ext
+            Path path = Paths.get(filePath);
+            Path parent = path.getParent();
+            if (parent != null) {
+                String potentialUuid = parent.getFileName().toString();
+                // Basic UUID validation (36 characters with dashes in right places)
+                if (potentialUuid.length() == 36 &&
+                    potentialUuid.charAt(8) == '-' &&
+                    potentialUuid.charAt(13) == '-' &&
+                    potentialUuid.charAt(18) == '-' &&
+                    potentialUuid.charAt(23) == '-') {
+                    return potentialUuid;
+                }
+            }
+        } catch (Exception e) {
+            // Invalid path or other error
+        }
+        return null;
+    }
+
+    /**
+     * Format file size in human readable format
+     */
+    private String formatBytes(long bytes) {
+        if (bytes < 1024) return bytes + " B";
+        if (bytes < 1024 * 1024) return String.format("%.1f KB", bytes / 1024.0);
+        if (bytes < 1024 * 1024 * 1024) return String.format("%.1f MB", bytes / (1024.0 * 1024.0));
+        return String.format("%.1f GB", bytes / (1024.0 * 1024.0 * 1024.0));
+    }
 }
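The dash-position check in `extractStoryIdFromPath` works, but the same validation can also be expressed with `UUID.fromString`. A small alternative sketch, not part of the change set (the length check is kept because `UUID.fromString` accepts some non-canonical forms):

```java
import java.util.UUID;

public final class UuidCheck {

    private UuidCheck() {}

    // Returns the canonical UUID string if the folder name parses as a UUID, otherwise null.
    public static String parseUuidOrNull(String candidate) {
        if (candidate == null || candidate.length() != 36) {
            return null;
        }
        try {
            return UUID.fromString(candidate).toString();
        } catch (IllegalArgumentException e) {
            return null;
        }
    }
}
```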
@@ -2,6 +2,8 @@ package com.storycove.controller;
 
 import com.storycove.service.ImageService;
 import com.storycove.service.LibraryService;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 import org.springframework.core.io.FileSystemResource;
 import org.springframework.core.io.Resource;
 import org.springframework.http.HttpHeaders;
@@ -21,6 +23,7 @@ import java.util.Map;
 @RestController
 @RequestMapping("/api/files")
 public class FileController {
+    private static final Logger log = LoggerFactory.getLogger(FileController.class);
 
     private final ImageService imageService;
     private final LibraryService libraryService;
@@ -32,7 +35,7 @@ public class FileController {
 
     private String getCurrentLibraryId() {
         String libraryId = libraryService.getCurrentLibraryId();
-        System.out.println("FileController - Current Library ID: " + libraryId);
+        log.debug("FileController - Current Library ID: {}", libraryId);
         return libraryId != null ? libraryId : "default";
     }
 
@@ -48,7 +51,7 @@ public class FileController {
             String imageUrl = "/api/files/images/" + currentLibraryId + "/" + imagePath;
             response.put("url", imageUrl);
 
-            System.out.println("Upload response - path: " + imagePath + ", url: " + imageUrl);
+            log.debug("Upload response - path: {}, url: {}", imagePath, imageUrl);
 
             return ResponseEntity.ok(response);
         } catch (IllegalArgumentException e) {
@@ -12,9 +12,7 @@ import com.storycove.service.*;
 import jakarta.validation.Valid;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
-import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.data.domain.Page;
-import org.springframework.data.domain.PageImpl;
 import org.springframework.data.domain.PageRequest;
 import org.springframework.data.domain.Pageable;
 import org.springframework.data.domain.Sort;
@@ -46,6 +44,8 @@ public class StoryController {
     private final ReadingTimeService readingTimeService;
     private final EPUBImportService epubImportService;
     private final EPUBExportService epubExportService;
+    private final AsyncImageProcessingService asyncImageProcessingService;
+    private final ImageProcessingProgressService progressService;
 
     public StoryController(StoryService storyService,
                            AuthorService authorService,
@@ -56,7 +56,9 @@ public class StoryController {
                            SearchServiceAdapter searchServiceAdapter,
                            ReadingTimeService readingTimeService,
                            EPUBImportService epubImportService,
-                           EPUBExportService epubExportService) {
+                           EPUBExportService epubExportService,
+                           AsyncImageProcessingService asyncImageProcessingService,
+                           ImageProcessingProgressService progressService) {
         this.storyService = storyService;
         this.authorService = authorService;
         this.seriesService = seriesService;
@@ -67,6 +69,8 @@ public class StoryController {
         this.readingTimeService = readingTimeService;
         this.epubImportService = epubImportService;
         this.epubExportService = epubExportService;
+        this.asyncImageProcessingService = asyncImageProcessingService;
+        this.progressService = progressService;
     }
 
     @GetMapping
@@ -144,25 +148,33 @@ public class StoryController {
         logger.info("Creating new story: {}", request.getTitle());
         Story story = new Story();
         updateStoryFromRequest(story, request);
 
         Story savedStory = storyService.createWithTagNames(story, request.getTagNames());
+
+        // Process external images in content after saving
+        savedStory = processExternalImagesIfNeeded(savedStory);
+
         logger.info("Successfully created story: {} (ID: {})", savedStory.getTitle(), savedStory.getId());
         return ResponseEntity.status(HttpStatus.CREATED).body(convertToDto(savedStory));
     }
 
     @PutMapping("/{id}")
     public ResponseEntity<StoryDto> updateStory(@PathVariable UUID id,
                                                 @Valid @RequestBody UpdateStoryRequest request) {
         logger.info("Updating story: {} (ID: {})", request.getTitle(), id);
 
         // Handle author creation/lookup at controller level before calling service
         if (request.getAuthorName() != null && !request.getAuthorName().trim().isEmpty() && request.getAuthorId() == null) {
             Author author = findOrCreateAuthor(request.getAuthorName().trim());
             request.setAuthorId(author.getId());
             request.setAuthorName(null); // Clear author name since we now have the ID
         }
 
         Story updatedStory = storyService.updateWithTagNames(id, request);
+
+        // Process external images in content after saving
+        updatedStory = processExternalImagesIfNeeded(updatedStory);
+
         logger.info("Successfully updated story: {}", updatedStory.getTitle());
         return ResponseEntity.ok(convertToDto(updatedStory));
     }
@@ -474,7 +486,9 @@ public class StoryController {
         story.setTitle(createReq.getTitle());
         story.setSummary(createReq.getSummary());
         story.setDescription(createReq.getDescription());
+
         story.setContentHtml(sanitizationService.sanitize(createReq.getContentHtml()));
+
         story.setSourceUrl(createReq.getSourceUrl());
         story.setVolume(createReq.getVolume());
 
@@ -706,7 +720,51 @@ public class StoryController {
 
         return dto;
     }
 
+    private Story processExternalImagesIfNeeded(Story story) {
+        try {
+            if (story.getContentHtml() != null && !story.getContentHtml().trim().isEmpty()) {
+                logger.debug("Starting async image processing for story: {}", story.getId());
+
+                // Start async processing - this returns immediately
+                asyncImageProcessingService.processStoryImagesAsync(story.getId(), story.getContentHtml());
+
+                logger.info("Async image processing started for story: {}", story.getId());
+            }
+        } catch (Exception e) {
+            logger.error("Failed to start async image processing for story {}: {}",
+                    story.getId(), e.getMessage(), e);
+            // Don't fail the entire operation if image processing fails
+        }
+
+        return story;
+    }
+
+    @GetMapping("/{id}/image-processing-progress")
+    public ResponseEntity<Map<String, Object>> getImageProcessingProgress(@PathVariable UUID id) {
+        ImageProcessingProgressService.ImageProcessingProgress progress = progressService.getProgress(id);
+
+        if (progress == null) {
+            return ResponseEntity.ok(Map.of(
+                    "isProcessing", false,
+                    "message", "No active image processing"
+            ));
+        }
+
+        Map<String, Object> response = Map.of(
+                "isProcessing", !progress.isCompleted(),
+                "totalImages", progress.getTotalImages(),
+                "processedImages", progress.getProcessedImages(),
+                "currentImageUrl", progress.getCurrentImageUrl() != null ? progress.getCurrentImageUrl() : "",
+                "status", progress.getStatus(),
+                "progressPercentage", progress.getProgressPercentage(),
+                "completed", progress.isCompleted(),
+                "error", progress.getErrorMessage() != null ? progress.getErrorMessage() : ""
+        );
+
+        return ResponseEntity.ok(response);
+    }
+
     @GetMapping("/check-duplicate")
     public ResponseEntity<Map<String, Object>> checkDuplicate(
             @RequestParam String title,
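The progress endpoint added above returns plain JSON, so any client can poll it while a story is being saved. As a rough illustration only (the base URL, story ID, and string-based completion check are placeholders, and authentication is omitted), a JDK `HttpClient` polling loop could look like this:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ProgressPollingSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical values; adjust to your deployment and add auth headers as needed.
        String baseUrl = "http://localhost:8080";
        String storyId = "00000000-0000-0000-0000-000000000000";

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/api/stories/" + storyId + "/image-processing-progress"))
                .GET()
                .build();

        // Poll until the body no longer reports isProcessing=true (crude string check for brevity).
        while (true) {
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.body());
            if (!response.body().contains("\"isProcessing\":true")) {
                break;
            }
            Thread.sleep(1000); // back off between polls
        }
    }
}
```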
@@ -33,6 +33,18 @@ public class SearchResultDto<T> {
         this.searchTimeMs = searchTimeMs;
         this.facets = facets;
     }
+
+    // Simple constructor for basic search results with facet list
+    public SearchResultDto(List<T> results, long totalHits, int resultCount, List<FacetCountDto> facetsList) {
+        this.results = results;
+        this.totalHits = totalHits;
+        this.page = 0;
+        this.perPage = resultCount;
+        this.query = "";
+        this.searchTimeMs = 0;
+        // Convert list to map if needed - for now just set empty map
+        this.facets = java.util.Collections.emptyMap();
+    }
 
     // Getters and Setters
     public List<T> getResults() {
@@ -0,0 +1,34 @@
+package com.storycove.event;
+
+import org.springframework.context.ApplicationEvent;
+
+import java.util.UUID;
+
+/**
+ * Event published when a story's content is created or updated
+ */
+public class StoryContentUpdatedEvent extends ApplicationEvent {
+
+    private final UUID storyId;
+    private final String contentHtml;
+    private final boolean isNewStory;
+
+    public StoryContentUpdatedEvent(Object source, UUID storyId, String contentHtml, boolean isNewStory) {
+        super(source);
+        this.storyId = storyId;
+        this.contentHtml = contentHtml;
+        this.isNewStory = isNewStory;
+    }
+
+    public UUID getStoryId() {
+        return storyId;
+    }
+
+    public String getContentHtml() {
+        return contentHtml;
+    }
+
+    public boolean isNewStory() {
+        return isNewStory;
+    }
+}
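`StoryContentUpdatedEvent` is a standard Spring `ApplicationEvent`, and the `EventListener` import added to `ImageService` further below suggests publish/subscribe wiring is intended. The sketch below shows how such an event could be published and consumed with stock Spring APIs; the component name and method bodies are illustrative, not taken from this changeset:

```java
import com.storycove.event.StoryContentUpdatedEvent;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.context.event.EventListener;
import org.springframework.stereotype.Component;

import java.util.UUID;

// Illustrative only: shows how the event could be published and handled with standard Spring APIs.
@Component
class StoryContentEventSketch {

    private final ApplicationEventPublisher publisher;

    StoryContentEventSketch(ApplicationEventPublisher publisher) {
        this.publisher = publisher;
    }

    // A service that saves story content could publish the event after persisting it.
    void publishAfterSave(UUID storyId, String contentHtml, boolean isNewStory) {
        publisher.publishEvent(new StoryContentUpdatedEvent(this, storyId, contentHtml, isNewStory));
    }

    // Any bean can react to it; an image-processing listener would go here.
    @EventListener
    public void onStoryContentUpdated(StoryContentUpdatedEvent event) {
        System.out.println("Story " + event.getStoryId() + " updated, new=" + event.isNewStory());
    }
}
```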
@@ -0,0 +1,122 @@
+package com.storycove.service;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import org.springframework.beans.factory.annotation.Autowired;
+import org.springframework.scheduling.annotation.Async;
+import org.springframework.stereotype.Service;
+
+import java.util.UUID;
+import java.util.concurrent.CompletableFuture;
+import java.util.regex.Matcher;
+import java.util.regex.Pattern;
+
+@Service
+public class AsyncImageProcessingService {
+
+    private static final Logger logger = LoggerFactory.getLogger(AsyncImageProcessingService.class);
+
+    private final ImageService imageService;
+    private final StoryService storyService;
+    private final ImageProcessingProgressService progressService;
+
+    @Autowired
+    public AsyncImageProcessingService(ImageService imageService,
+                                       StoryService storyService,
+                                       ImageProcessingProgressService progressService) {
+        this.imageService = imageService;
+        this.storyService = storyService;
+        this.progressService = progressService;
+    }
+
+    @Async
+    public CompletableFuture<Void> processStoryImagesAsync(UUID storyId, String contentHtml) {
+        logger.info("Starting async image processing for story: {}", storyId);
+
+        try {
+            // Count external images first
+            int externalImageCount = countExternalImages(contentHtml);
+
+            if (externalImageCount == 0) {
+                logger.debug("No external images found for story {}", storyId);
+                return CompletableFuture.completedFuture(null);
+            }
+
+            // Start progress tracking
+            ImageProcessingProgressService.ImageProcessingProgress progress =
+                    progressService.startProgress(storyId, externalImageCount);
+
+            // Process images with progress updates
+            ImageService.ContentImageProcessingResult result =
+                    processImagesWithProgress(contentHtml, storyId, progress);
+
+            // Update story with processed content if changed
+            if (!result.getProcessedContent().equals(contentHtml)) {
+                progressService.updateProgress(storyId, progress.getTotalImages(),
+                        "Saving processed content", "Updating story content");
+
+                storyService.updateContentOnly(storyId, result.getProcessedContent());
+
+                progressService.completeProgress(storyId,
+                        String.format("Completed: %d images processed", result.getDownloadedImages().size()));
+
+                logger.info("Async image processing completed for story {}: {} images processed",
+                        storyId, result.getDownloadedImages().size());
+            } else {
+                progressService.completeProgress(storyId, "Completed: No images needed processing");
+            }
+
+            // Clean up progress after a delay to allow frontend to see completion
+            CompletableFuture.runAsync(() -> {
+                try {
+                    Thread.sleep(5000); // 5 seconds delay
+                    progressService.removeProgress(storyId);
+                } catch (InterruptedException e) {
+                    Thread.currentThread().interrupt();
+                }
+            });
+
+        } catch (Exception e) {
+            logger.error("Async image processing failed for story {}: {}", storyId, e.getMessage(), e);
+            progressService.setError(storyId, e.getMessage());
+        }
+
+        return CompletableFuture.completedFuture(null);
+    }
+
+    private int countExternalImages(String contentHtml) {
+        if (contentHtml == null || contentHtml.trim().isEmpty()) {
+            return 0;
+        }
+
+        Pattern imgPattern = Pattern.compile("<img[^>]+src=[\"']([^\"']+)[\"'][^>]*>", Pattern.CASE_INSENSITIVE);
+        Matcher matcher = imgPattern.matcher(contentHtml);
+
+        int count = 0;
+        while (matcher.find()) {
+            String src = matcher.group(1);
+            if (isExternalUrl(src)) {
+                count++;
+            }
+        }
+
+        return count;
+    }
+
+    private boolean isExternalUrl(String url) {
+        return url != null &&
+                (url.startsWith("http://") || url.startsWith("https://")) &&
+                !url.contains("/api/files/images/");
+    }
+
+    private ImageService.ContentImageProcessingResult processImagesWithProgress(
+            String contentHtml, UUID storyId, ImageProcessingProgressService.ImageProcessingProgress progress) {
+
+        // Use a custom version of processContentImages that provides progress callbacks
+        return imageService.processContentImagesWithProgress(contentHtml, storyId,
+                (currentUrl, processedCount, totalCount) -> {
+                    progressService.updateProgress(storyId, processedCount, currentUrl,
+                            String.format("Processing image %d of %d", processedCount + 1, totalCount));
+                });
+    }
+}
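Note that `@Async` on `processStoryImagesAsync` only moves work off the request thread if async execution is enabled somewhere in the application. If the project does not already have that configuration, a minimal sketch would look like the following (class name and pool sizes are assumptions, not part of this diff):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

import java.util.concurrent.Executor;

// Hypothetical configuration: enables @Async and bounds the pool used for image downloads.
@Configuration
@EnableAsync
public class AsyncConfigSketch {

    @Bean
    public Executor taskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(2);       // a couple of workers is usually enough for downloads
        executor.setMaxPoolSize(4);
        executor.setQueueCapacity(50);     // queue bursts instead of rejecting them
        executor.setThreadNamePrefix("image-processing-");
        executor.initialize();
        return executor;
    }
}
```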
@@ -132,7 +132,7 @@ public class AuthorService {
         validateAuthorForCreate(author);
         Author savedAuthor = authorRepository.save(author);
 
-        // Index in OpenSearch
+        // Index in Solr
         searchServiceAdapter.indexAuthor(savedAuthor);
 
         return savedAuthor;
@@ -150,7 +150,7 @@ public class AuthorService {
         updateAuthorFields(existingAuthor, authorUpdates);
         Author savedAuthor = authorRepository.save(existingAuthor);
 
-        // Update in OpenSearch
+        // Update in Solr
         searchServiceAdapter.updateAuthor(savedAuthor);
 
         return savedAuthor;
@@ -166,7 +166,7 @@ public class AuthorService {
 
         authorRepository.delete(author);
 
-        // Remove from OpenSearch
+        // Remove from Solr
         searchServiceAdapter.deleteAuthor(id);
     }
 
@@ -175,7 +175,7 @@ public class AuthorService {
         author.addUrl(url);
         Author savedAuthor = authorRepository.save(author);
 
-        // Update in OpenSearch
+        // Update in Solr
         searchServiceAdapter.updateAuthor(savedAuthor);
 
         return savedAuthor;
@@ -186,7 +186,7 @@ public class AuthorService {
         author.removeUrl(url);
         Author savedAuthor = authorRepository.save(author);
 
-        // Update in OpenSearch
+        // Update in Solr
         searchServiceAdapter.updateAuthor(savedAuthor);
 
         return savedAuthor;
@@ -221,7 +221,7 @@ public class AuthorService {
         logger.debug("Saved author rating: {} for author: {}",
                 refreshedAuthor.getAuthorRating(), refreshedAuthor.getName());
 
-        // Update in OpenSearch
+        // Update in Solr
         searchServiceAdapter.updateAuthor(refreshedAuthor);
 
         return refreshedAuthor;
@@ -265,7 +265,7 @@ public class AuthorService {
         author.setAvatarImagePath(avatarPath);
         Author savedAuthor = authorRepository.save(author);
 
-        // Update in OpenSearch
+        // Update in Solr
         searchServiceAdapter.updateAuthor(savedAuthor);
 
         return savedAuthor;
@@ -276,7 +276,7 @@ public class AuthorService {
         author.setAvatarImagePath(null);
         Author savedAuthor = authorRepository.save(author);
 
-        // Update in OpenSearch
+        // Update in Solr
         searchServiceAdapter.updateAuthor(savedAuthor);
 
         return savedAuthor;
@@ -55,8 +55,8 @@ public class CollectionService {
      */
     public SearchResultDto<Collection> searchCollections(String query, List<String> tags, boolean includeArchived, int page, int limit) {
         // Collections are currently handled at database level, not indexed in search engine
-        // Return empty result for now as collections search is not implemented in OpenSearch
-        logger.warn("Collections search not yet implemented in OpenSearch, returning empty results");
+        // Return empty result for now as collections search is not implemented in Solr
+        logger.warn("Collections search not yet implemented in Solr, returning empty results");
         return new SearchResultDto<>(new ArrayList<>(), 0, page, limit, query != null ? query : "", 0);
     }
 
@@ -70,6 +70,75 @@ public class DatabaseManagementService implements ApplicationContextAware {
         this.applicationContext = applicationContext;
     }
 
+    // Helper methods to extract database connection details
+    private String extractDatabaseUrl() {
+        try (Connection connection = getDataSource().getConnection()) {
+            return connection.getMetaData().getURL();
+        } catch (SQLException e) {
+            throw new RuntimeException("Failed to extract database URL", e);
+        }
+    }
+
+    private String extractDatabaseHost() {
+        String url = extractDatabaseUrl();
+        // Extract host from jdbc:postgresql://host:port/database
+        if (url.startsWith("jdbc:postgresql://")) {
+            String hostPort = url.substring("jdbc:postgresql://".length());
+            if (hostPort.contains("/")) {
+                hostPort = hostPort.substring(0, hostPort.indexOf("/"));
+            }
+            if (hostPort.contains(":")) {
+                return hostPort.substring(0, hostPort.indexOf(":"));
+            }
+            return hostPort;
+        }
+        return "localhost"; // fallback
+    }
+
+    private String extractDatabasePort() {
+        String url = extractDatabaseUrl();
+        // Extract port from jdbc:postgresql://host:port/database
+        if (url.startsWith("jdbc:postgresql://")) {
+            String hostPort = url.substring("jdbc:postgresql://".length());
+            if (hostPort.contains("/")) {
+                hostPort = hostPort.substring(0, hostPort.indexOf("/"));
+            }
+            if (hostPort.contains(":")) {
+                return hostPort.substring(hostPort.indexOf(":") + 1);
+            }
+        }
+        return "5432"; // default PostgreSQL port
+    }
+
+    private String extractDatabaseName() {
+        String url = extractDatabaseUrl();
+        // Extract database name from jdbc:postgresql://host:port/database
+        if (url.startsWith("jdbc:postgresql://")) {
+            String remaining = url.substring("jdbc:postgresql://".length());
+            if (remaining.contains("/")) {
+                String dbPart = remaining.substring(remaining.indexOf("/") + 1);
+                // Remove any query parameters
+                if (dbPart.contains("?")) {
+                    dbPart = dbPart.substring(0, dbPart.indexOf("?"));
+                }
+                return dbPart;
+            }
+        }
+        return "storycove"; // fallback
+    }
+
+    private String extractDatabaseUsername() {
+        // Get from environment variable or default
+        return System.getenv("SPRING_DATASOURCE_USERNAME") != null ?
+                System.getenv("SPRING_DATASOURCE_USERNAME") : "storycove";
+    }
+
+    private String extractDatabasePassword() {
+        // Get from environment variable or default
+        return System.getenv("SPRING_DATASOURCE_PASSWORD") != null ?
+                System.getenv("SPRING_DATASOURCE_PASSWORD") : "password";
+    }
+
     /**
      * Create a comprehensive backup including database and files in ZIP format
      */
@@ -97,6 +166,7 @@ public class DatabaseManagementService implements ApplicationContextAware {
     /**
      * Restore from complete backup (ZIP format)
      */
+    @Transactional(timeout = 1800) // 30 minutes timeout for large backup restores
     public void restoreFromCompleteBackup(InputStream backupStream) throws IOException, SQLException {
         String currentLibraryId = libraryService.getCurrentLibraryId();
         System.err.println("Starting complete backup restore for library: " + currentLibraryId);
@@ -171,157 +241,177 @@ public class DatabaseManagementService implements ApplicationContextAware {
     }
 
     public Resource createBackup() throws SQLException, IOException {
-        StringBuilder sqlDump = new StringBuilder();
-
-        try (Connection connection = getDataSource().getConnection()) {
-            // Add header
-            sqlDump.append("-- StoryCove Database Backup\n");
-            sqlDump.append("-- Generated at: ").append(new java.util.Date()).append("\n\n");
-
-            // Disable foreign key checks during restore (PostgreSQL syntax)
-            sqlDump.append("SET session_replication_role = replica;\n\n");
-
-            // List of tables in dependency order (parents first for insertion)
-            List<String> insertTables = Arrays.asList(
-                    "authors", "series", "tags", "collections",
-                    "stories", "story_tags", "author_urls", "collection_stories"
-            );
-
-            // TRUNCATE in reverse order (children first)
-            List<String> truncateTables = Arrays.asList(
-                    "collection_stories", "author_urls", "story_tags",
-                    "stories", "collections", "tags", "series", "authors"
-            );
-
-            // Generate TRUNCATE statements for each table (assuming tables already exist)
-            for (String tableName : truncateTables) {
-                sqlDump.append("-- Truncate Table: ").append(tableName).append("\n");
-                sqlDump.append("TRUNCATE TABLE \"").append(tableName).append("\" CASCADE;\n");
-            }
-            sqlDump.append("\n");
-
-            // Generate INSERT statements in dependency order
-            for (String tableName : insertTables) {
-                sqlDump.append("-- Data for Table: ").append(tableName).append("\n");
-
-                // Get table data
-                try (PreparedStatement stmt = connection.prepareStatement("SELECT * FROM \"" + tableName + "\"");
-                     ResultSet rs = stmt.executeQuery()) {
-
-                    ResultSetMetaData metaData = rs.getMetaData();
-                    int columnCount = metaData.getColumnCount();
-
-                    // Build column names for INSERT statement
-                    StringBuilder columnNames = new StringBuilder();
-                    for (int i = 1; i <= columnCount; i++) {
-                        if (i > 1) columnNames.append(", ");
-                        columnNames.append("\"").append(metaData.getColumnName(i)).append("\"");
-                    }
-
-                    while (rs.next()) {
-                        sqlDump.append("INSERT INTO \"").append(tableName).append("\" (")
-                                .append(columnNames).append(") VALUES (");
-
-                        for (int i = 1; i <= columnCount; i++) {
-                            if (i > 1) sqlDump.append(", ");
-
-                            Object value = rs.getObject(i);
-                            sqlDump.append(formatSqlValue(value));
-                        }
-
-                        sqlDump.append(");\n");
-                    }
-                }
-
-                sqlDump.append("\n");
-            }
-
-            // Re-enable foreign key checks (PostgreSQL syntax)
-            sqlDump.append("SET session_replication_role = DEFAULT;\n");
-        }
-
-        byte[] backupData = sqlDump.toString().getBytes(StandardCharsets.UTF_8);
-        return new ByteArrayResource(backupData);
+        // Use PostgreSQL's native pg_dump for reliable backup
+        String dbHost = extractDatabaseHost();
+        String dbPort = extractDatabasePort();
+        String dbName = extractDatabaseName();
+        String dbUser = extractDatabaseUsername();
+        String dbPassword = extractDatabasePassword();
+
+        // Create temporary file for backup
+        Path tempBackupFile = Files.createTempFile("storycove_backup_", ".sql");
+
+        try {
+            // Build pg_dump command
+            ProcessBuilder pb = new ProcessBuilder(
+                    "pg_dump",
+                    "--host=" + dbHost,
+                    "--port=" + dbPort,
+                    "--username=" + dbUser,
+                    "--dbname=" + dbName,
+                    "--no-password",
+                    "--verbose",
+                    "--clean",
+                    "--if-exists",
+                    "--create",
+                    "--file=" + tempBackupFile.toString()
+            );
+
+            // Set PGPASSWORD environment variable
+            Map<String, String> env = pb.environment();
+            env.put("PGPASSWORD", dbPassword);
+
+            System.err.println("Starting PostgreSQL backup using pg_dump...");
+            Process process = pb.start();
+
+            // Capture output
+            try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getErrorStream()))) {
+                String line;
+                while ((line = reader.readLine()) != null) {
+                    System.err.println("pg_dump: " + line);
+                }
+            }
+
+            int exitCode = process.waitFor();
+            if (exitCode != 0) {
+                throw new RuntimeException("pg_dump failed with exit code: " + exitCode);
+            }
+
+            System.err.println("PostgreSQL backup completed successfully");
+
+            // Read the backup file into memory
+            byte[] backupData = Files.readAllBytes(tempBackupFile);
+            return new ByteArrayResource(backupData);
+
+        } catch (InterruptedException e) {
+            Thread.currentThread().interrupt();
+            throw new RuntimeException("Backup process was interrupted", e);
+        } finally {
+            // Clean up temporary file
+            try {
+                Files.deleteIfExists(tempBackupFile);
+            } catch (IOException e) {
+                System.err.println("Warning: Could not delete temporary backup file: " + e.getMessage());
+            }
+        }
     }
 
-    @Transactional
+    @Transactional(timeout = 1800) // 30 minutes timeout for large backup restores
     public void restoreFromBackup(InputStream backupStream) throws IOException, SQLException {
-        // Read the SQL file
-        StringBuilder sqlContent = new StringBuilder();
-        try (BufferedReader reader = new BufferedReader(new InputStreamReader(backupStream, StandardCharsets.UTF_8))) {
-            String line;
-            while ((line = reader.readLine()) != null) {
-                // Skip comments and empty lines
-                if (!line.trim().startsWith("--") && !line.trim().isEmpty()) {
-                    sqlContent.append(line).append("\n");
-                }
-            }
-        }
-
-        // Execute the SQL statements
-        try (Connection connection = getDataSource().getConnection()) {
-            connection.setAutoCommit(false);
-            try {
-                // Ensure database schema exists before restoring data
-                ensureDatabaseSchemaExists(connection);
-
-                // Parse SQL statements properly (handle semicolons inside string literals)
-                List<String> statements = parseStatements(sqlContent.toString());
-
-                int successCount = 0;
-                for (String statement : statements) {
-                    String trimmedStatement = statement.trim();
-                    if (!trimmedStatement.isEmpty()) {
-                        try (PreparedStatement stmt = connection.prepareStatement(trimmedStatement)) {
-                            stmt.executeUpdate();
-                            successCount++;
-                        } catch (SQLException e) {
-                            // Log detailed error information for failed statements
-                            System.err.println("ERROR: Failed to execute SQL statement #" + (successCount + 1));
-                            System.err.println("Error: " + e.getMessage());
-                            System.err.println("SQL State: " + e.getSQLState());
-                            System.err.println("Error Code: " + e.getErrorCode());
-
-                            // Show the problematic statement (first 500 chars)
-                            String statementPreview = trimmedStatement.length() > 500 ?
-                                    trimmedStatement.substring(0, 500) + "..." : trimmedStatement;
-                            System.err.println("Statement: " + statementPreview);
-
-                            throw e; // Re-throw to trigger rollback
-                        }
-                    }
-                }
-
-                connection.commit();
-                System.err.println("Restore completed successfully. Executed " + successCount + " SQL statements.");
-
-                // Reindex search after successful restore
-                try {
-                    String currentLibraryId = libraryService.getCurrentLibraryId();
-                    System.err.println("Starting search reindex after successful restore for library: " + currentLibraryId);
-                    if (currentLibraryId == null) {
-                        System.err.println("ERROR: No current library set during restore - cannot reindex search!");
-                        throw new IllegalStateException("No current library active during restore");
-                    }
-
-                    // Manually trigger reindexing using the correct database connection
-                    System.err.println("Triggering manual reindex from library-specific database for library: " + currentLibraryId);
-                    reindexStoriesAndAuthorsFromCurrentDatabase();
-
-                    // Note: Collections collection will be recreated when needed by the service
-                    System.err.println("Search reindex completed successfully for library: " + currentLibraryId);
-                } catch (Exception e) {
-                    // Log the error but don't fail the restore
-                    System.err.println("Warning: Failed to reindex search after restore: " + e.getMessage());
-                    e.printStackTrace();
-                }
-
-            } catch (SQLException e) {
-                connection.rollback();
-                throw e;
-            } finally {
-                connection.setAutoCommit(true);
-            }
-        }
+        // Use PostgreSQL's native psql for reliable restore
+        String dbHost = extractDatabaseHost();
+        String dbPort = extractDatabasePort();
+        String dbName = extractDatabaseName();
+        String dbUser = extractDatabaseUsername();
+        String dbPassword = extractDatabasePassword();
+
+        // Create temporary file for the backup
+        Path tempBackupFile = Files.createTempFile("storycove_restore_", ".sql");
+
+        try {
+            // Write backup stream to temporary file
+            System.err.println("Writing backup data to temporary file...");
+            try (InputStream input = backupStream;
+                 OutputStream output = Files.newOutputStream(tempBackupFile)) {
+                byte[] buffer = new byte[8192];
+                int bytesRead;
+                while ((bytesRead = input.read(buffer)) != -1) {
+                    output.write(buffer, 0, bytesRead);
+                }
+            }
+
+            System.err.println("Starting PostgreSQL restore using psql...");
+
+            // Build psql command to restore the backup
+            ProcessBuilder pb = new ProcessBuilder(
+                    "psql",
+                    "--host=" + dbHost,
+                    "--port=" + dbPort,
+                    "--username=" + dbUser,
+                    "--dbname=" + dbName,
+                    "--no-password",
+                    "--echo-errors",
+                    "--file=" + tempBackupFile.toString()
+            );
+
+            // Set PGPASSWORD environment variable
+            Map<String, String> env = pb.environment();
+            env.put("PGPASSWORD", dbPassword);
+
+            Process process = pb.start();
+
+            // Capture output
+            try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getErrorStream()));
+                 BufferedReader outputReader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
+
+                // Read stderr in a separate thread
+                Thread errorThread = new Thread(() -> {
+                    try {
+                        String line;
+                        while ((line = reader.readLine()) != null) {
+                            System.err.println("psql stderr: " + line);
+                        }
+                    } catch (IOException e) {
+                        System.err.println("Error reading psql stderr: " + e.getMessage());
+                    }
+                });
+                errorThread.start();
+
+                // Read stdout
+                String line;
+                while ((line = outputReader.readLine()) != null) {
+                    System.err.println("psql stdout: " + line);
+                }
+
+                errorThread.join();
+            }
+
+            int exitCode = process.waitFor();
+            if (exitCode != 0) {
+                throw new RuntimeException("psql restore failed with exit code: " + exitCode);
+            }
+
+            System.err.println("PostgreSQL restore completed successfully");
+
+            // Reindex search after successful restore
+            try {
+                String currentLibraryId = libraryService.getCurrentLibraryId();
+                System.err.println("Starting search reindex after successful restore for library: " + currentLibraryId);
+                if (currentLibraryId == null) {
+                    System.err.println("ERROR: No current library set during restore - cannot reindex search!");
+                    throw new IllegalStateException("No current library active during restore");
+                }
+
+                // Manually trigger reindexing using the correct database connection
+                System.err.println("Triggering manual reindex from library-specific database for library: " + currentLibraryId);
+                reindexStoriesAndAuthorsFromCurrentDatabase();
+
+                // Note: Collections collection will be recreated when needed by the service
+                System.err.println("Search reindex completed successfully for library: " + currentLibraryId);
+            } catch (Exception e) {
+                // Log the error but don't fail the restore
+                System.err.println("Warning: Failed to reindex search after restore: " + e.getMessage());
+                e.printStackTrace();
+            }
+        } catch (InterruptedException e) {
+            Thread.currentThread().interrupt();
+            throw new RuntimeException("Restore process was interrupted", e);
+        } finally {
+            // Clean up temporary file
+            try {
+                Files.deleteIfExists(tempBackupFile);
+            } catch (IOException e) {
+                System.err.println("Warning: Could not delete temporary restore file: " + e.getMessage());
+            }
+        }
     }
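Because backup and restore now shell out to `pg_dump` and `psql`, the backend container must have the PostgreSQL client tools on its PATH. A small self-check like the sketch below (not part of this changeset) can surface a missing binary at startup rather than at backup time:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class PgClientToolsCheck {

    /** Returns true if the given PostgreSQL client tool can be executed from the PATH. */
    static boolean isAvailable(String tool) {
        try {
            Process process = new ProcessBuilder(tool, "--version").start();
            try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
                System.out.println(tool + ": " + reader.readLine());
            }
            return process.waitFor() == 0;
        } catch (IOException e) {
            System.err.println(tool + " not found on PATH: " + e.getMessage());
            return false;
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }

    public static void main(String[] args) {
        // createBackup() needs pg_dump, restoreFromBackup() needs psql.
        isAvailable("pg_dump");
        isAvailable("psql");
    }
}
```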
@@ -449,7 +539,7 @@ public class DatabaseManagementService implements ApplicationContextAware {
     /**
      * Clear all data AND files (for complete restore)
      */
-    @Transactional
+    @Transactional(timeout = 600) // 10 minutes timeout for clearing large datasets
     public int clearAllDataAndFiles() {
         // First clear the database
         int totalDeleted = clearAllData();
@@ -16,6 +16,8 @@ import nl.siegmann.epublib.epub.EpubReader;
 
 import org.jsoup.Jsoup;
 import org.jsoup.nodes.Document;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.stereotype.Service;
 import org.springframework.transaction.annotation.Transactional;
@@ -30,6 +32,7 @@ import java.util.Optional;
 @Service
 @Transactional
 public class EPUBImportService {
+    private static final Logger log = LoggerFactory.getLogger(EPUBImportService.class);
 
     private final StoryService storyService;
     private final AuthorService authorService;
@@ -87,12 +90,12 @@ public class EPUBImportService {
                 savedStory = storyService.update(savedStory.getId(), savedStory);
 
                 // Log the image processing results
-                System.out.println("EPUB Import - Image processing completed for story " + savedStory.getId() +
-                        ". Downloaded " + imageResult.getDownloadedImages().size() + " images.");
+                log.debug("EPUB Import - Image processing completed for story {}. Downloaded {} images.",
+                        savedStory.getId(), imageResult.getDownloadedImages().size());
 
                 if (imageResult.hasWarnings()) {
-                    System.out.println("EPUB Import - Image processing warnings: " +
+                    log.debug("EPUB Import - Image processing warnings: {}",
                             String.join(", ", imageResult.getWarnings()));
                 }
             }
         } catch (Exception e) {
@@ -282,7 +285,7 @@ public class EPUBImportService {
         if (language != null && !language.trim().isEmpty()) {
             // Store as metadata in story description if needed
             // For now, we'll just log it for potential future use
-            System.out.println("EPUB Language: " + language);
+            log.debug("EPUB Language: {}", language);
         }
 
         // Extract publisher information
@@ -290,14 +293,14 @@ public class EPUBImportService {
         if (publishers != null && !publishers.isEmpty()) {
             String publisher = publishers.get(0);
             // Could append to description or store separately in future
-            System.out.println("EPUB Publisher: " + publisher);
+            log.debug("EPUB Publisher: {}", publisher);
         }
 
         // Extract publication date
         List<nl.siegmann.epublib.domain.Date> dates = metadata.getDates();
         if (dates != null && !dates.isEmpty()) {
             for (nl.siegmann.epublib.domain.Date date : dates) {
-                System.out.println("EPUB Date (" + date.getEvent() + "): " + date.getValue());
+                log.debug("EPUB Date ({}): {}", date.getEvent(), date.getValue());
             }
         }
 
@@ -305,7 +308,7 @@ public class EPUBImportService {
         List<nl.siegmann.epublib.domain.Identifier> identifiers = metadata.getIdentifiers();
         if (identifiers != null && !identifiers.isEmpty()) {
             for (nl.siegmann.epublib.domain.Identifier identifier : identifiers) {
-                System.out.println("EPUB Identifier (" + identifier.getScheme() + "): " + identifier.getValue());
+                log.debug("EPUB Identifier ({}): {}", identifier.getScheme(), identifier.getValue());
             }
         }
     }
@@ -137,12 +137,63 @@ public class HtmlSanitizationService {
         return config;
     }
 
+    /**
+     * Preprocess HTML to extract images from figure tags before sanitization
+     */
+    private String preprocessFigureTags(String html) {
+        if (html == null || html.trim().isEmpty()) {
+            return html;
+        }
+
+        try {
+            org.jsoup.nodes.Document doc = Jsoup.parse(html);
+            org.jsoup.select.Elements figures = doc.select("figure");
+
+            for (org.jsoup.nodes.Element figure : figures) {
+                // Find img tags within the figure
+                org.jsoup.select.Elements images = figure.select("img");
+
+                if (!images.isEmpty()) {
+                    // Extract the first image and replace the figure with it
+                    org.jsoup.nodes.Element img = images.first();
+
+                    // Check if there's a figcaption to preserve as alt text
+                    org.jsoup.select.Elements figcaptions = figure.select("figcaption");
+                    if (!figcaptions.isEmpty() && !img.hasAttr("alt")) {
+                        String captionText = figcaptions.first().text();
+                        if (captionText != null && !captionText.trim().isEmpty()) {
+                            img.attr("alt", captionText);
+                        }
+                    }
+
+                    // Replace the figure element with just the img
+                    figure.replaceWith(img.clone());
+                    logger.debug("Extracted image from figure tag: {}", img.attr("src"));
+                } else {
+                    // No images in figure, remove it entirely
+                    figure.remove();
+                    logger.debug("Removed figure tag without images");
+                }
+            }
+
+            return doc.body().html();
+        } catch (Exception e) {
+            logger.warn("Failed to preprocess figure tags, returning original HTML: {}", e.getMessage());
+            return html;
+        }
+    }
+
     public String sanitize(String html) {
         if (html == null || html.trim().isEmpty()) {
             return "";
         }
 
         logger.info("Content before sanitization: "+html);
-        String saniztedHtml = Jsoup.clean(html, allowlist.preserveRelativeLinks(true));
+
+        // Preprocess to extract images from figure tags
+        String preprocessed = preprocessFigureTags(html);
+
+        String saniztedHtml = Jsoup.clean(preprocessed, allowlist.preserveRelativeLinks(true));
         logger.info("Content after sanitization: "+saniztedHtml);
         return saniztedHtml;
     }
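The effect of the new figure preprocessing is easiest to see on a concrete fragment. The standalone jsoup sketch below mirrors the same transformation outside the service (illustrative only; the sample HTML and class name are made up):

```java
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

public class FigurePreprocessSketch {
    public static void main(String[] args) {
        String html = "<figure><img src=\"https://example.com/cover.jpg\">"
                + "<figcaption>Cover art</figcaption></figure><p>Chapter 1</p>";

        Document doc = Jsoup.parse(html);
        for (Element figure : doc.select("figure")) {
            Element img = figure.selectFirst("img");
            if (img != null) {
                // Preserve the caption as alt text, mirroring the service's behaviour.
                Element caption = figure.selectFirst("figcaption");
                if (caption != null && !img.hasAttr("alt")) {
                    img.attr("alt", caption.text());
                }
                figure.replaceWith(img.clone());
            } else {
                figure.remove();
            }
        }

        // Prints roughly: <img src="https://example.com/cover.jpg" alt="Cover art"> <p>Chapter 1</p>
        System.out.println(doc.body().html());
    }
}
```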
@@ -0,0 +1,108 @@
+package com.storycove.service;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import org.springframework.stereotype.Service;
+
+import java.util.Map;
+import java.util.UUID;
+import java.util.concurrent.ConcurrentHashMap;
+
+@Service
+public class ImageProcessingProgressService {
+
+    private static final Logger logger = LoggerFactory.getLogger(ImageProcessingProgressService.class);
+
+    private final Map<UUID, ImageProcessingProgress> progressMap = new ConcurrentHashMap<>();
+
+    public static class ImageProcessingProgress {
+        private final UUID storyId;
+        private final int totalImages;
+        private volatile int processedImages;
+        private volatile String currentImageUrl;
+        private volatile String status;
+        private volatile boolean completed;
+        private volatile String errorMessage;
+
+        public ImageProcessingProgress(UUID storyId, int totalImages) {
+            this.storyId = storyId;
+            this.totalImages = totalImages;
+            this.processedImages = 0;
+            this.status = "Starting";
+            this.completed = false;
+        }
+
+        // Getters
+        public UUID getStoryId() { return storyId; }
+        public int getTotalImages() { return totalImages; }
+        public int getProcessedImages() { return processedImages; }
+        public String getCurrentImageUrl() { return currentImageUrl; }
+        public String getStatus() { return status; }
+        public boolean isCompleted() { return completed; }
+        public String getErrorMessage() { return errorMessage; }
+        public double getProgressPercentage() {
+            return totalImages > 0 ? (double) processedImages / totalImages * 100 : 100;
+        }
+
+        // Setters
+        public void setProcessedImages(int processedImages) { this.processedImages = processedImages; }
+        public void setCurrentImageUrl(String currentImageUrl) { this.currentImageUrl = currentImageUrl; }
+        public void setStatus(String status) { this.status = status; }
+        public void setCompleted(boolean completed) { this.completed = completed; }
+        public void setErrorMessage(String errorMessage) { this.errorMessage = errorMessage; }
+
+        public void incrementProcessed() {
+            this.processedImages++;
+        }
+    }
+
+    public ImageProcessingProgress startProgress(UUID storyId, int totalImages) {
+        ImageProcessingProgress progress = new ImageProcessingProgress(storyId, totalImages);
+        progressMap.put(storyId, progress);
+        logger.info("Started image processing progress tracking for story {} with {} images", storyId, totalImages);
+        return progress;
+    }
+
+    public ImageProcessingProgress getProgress(UUID storyId) {
+        return progressMap.get(storyId);
+    }
+
+    public void updateProgress(UUID storyId, int processedImages, String currentImageUrl, String status) {
+        ImageProcessingProgress progress = progressMap.get(storyId);
+        if (progress != null) {
+            progress.setProcessedImages(processedImages);
+            progress.setCurrentImageUrl(currentImageUrl);
+            progress.setStatus(status);
+            logger.debug("Updated progress for story {}: {}/{} - {}", storyId, processedImages, progress.getTotalImages(), status);
+        }
+    }
+
+    public void completeProgress(UUID storyId, String finalStatus) {
+        ImageProcessingProgress progress = progressMap.get(storyId);
+        if (progress != null) {
+            progress.setCompleted(true);
+            progress.setStatus(finalStatus);
+            logger.info("Completed image processing for story {}: {}", storyId, finalStatus);
+        }
+    }
+
+    public void setError(UUID storyId, String errorMessage) {
+        ImageProcessingProgress progress = progressMap.get(storyId);
+        if (progress != null) {
+            progress.setErrorMessage(errorMessage);
+            progress.setStatus("Error: " + errorMessage);
+            progress.setCompleted(true);
+            logger.error("Image processing error for story {}: {}", storyId, errorMessage);
+        }
+    }
+
+    public void removeProgress(UUID storyId) {
+        progressMap.remove(storyId);
+        logger.debug("Removed progress tracking for story {}", storyId);
+    }
+
+    public boolean isProcessing(UUID storyId) {
+        ImageProcessingProgress progress = progressMap.get(storyId);
+        return progress != null && !progress.isCompleted();
+    }
+}
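The progress service is an in-memory `ConcurrentHashMap`, so its lifecycle can be exercised directly. A sketch of the expected call sequence, matching how `AsyncImageProcessingService` and the controller use it (the UUID, URLs, and counts here are arbitrary):

```java
import com.storycove.service.ImageProcessingProgressService;

import java.util.UUID;

public class ProgressLifecycleSketch {
    public static void main(String[] args) {
        ImageProcessingProgressService service = new ImageProcessingProgressService();
        UUID storyId = UUID.randomUUID();

        // 1. The async processor registers the total number of external images up front.
        service.startProgress(storyId, 3);

        // 2. Each downloaded image advances the counter and records what is being fetched.
        service.updateProgress(storyId, 1, "https://example.com/a.jpg", "Processing image 2 of 3");

        // 3. The controller's progress endpoint reads the same object concurrently.
        ImageProcessingProgressService.ImageProcessingProgress progress = service.getProgress(storyId);
        System.out.printf("%d/%d (%.0f%%)%n",
                progress.getProcessedImages(), progress.getTotalImages(), progress.getProgressPercentage());

        // 4. Completion is flagged, left visible briefly for the frontend, then removed.
        service.completeProgress(storyId, "Completed: 3 images processed");
        service.removeProgress(storyId);
    }
}
```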
@@ -4,6 +4,8 @@ import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.beans.factory.annotation.Value;
+import org.springframework.context.event.EventListener;
+import org.springframework.scheduling.annotation.Async;
 import org.springframework.stereotype.Service;
 import org.springframework.web.multipart.MultipartFile;
 
@@ -21,6 +23,8 @@ import java.util.List;
 import java.util.regex.Matcher;
 import java.util.regex.Pattern;
 
+import com.storycove.event.StoryContentUpdatedEvent;
+
 @Service
 public class ImageService {
 
@@ -42,6 +46,12 @@ public class ImageService {
 
     @Autowired
     private StoryService storyService;
 
+    @Autowired
+    private AuthorService authorService;
+
+    @Autowired
+    private CollectionService collectionService;
+
     private String getUploadDir() {
         String libraryPath = libraryService.getCurrentImagePath();
@@ -248,14 +258,14 @@ public class ImageService {
      * Process HTML content and download all referenced images, replacing URLs with local paths
      */
     public ContentImageProcessingResult processContentImages(String htmlContent, UUID storyId) {
-        logger.info("Processing content images for story: {}, content length: {}", storyId,
+        logger.debug("Processing content images for story: {}, content length: {}", storyId,
                 htmlContent != null ? htmlContent.length() : 0);
 
         List<String> warnings = new ArrayList<>();
         List<String> downloadedImages = new ArrayList<>();
 
         if (htmlContent == null || htmlContent.trim().isEmpty()) {
-            logger.info("No content to process for story: {}", storyId);
+            logger.debug("No content to process for story: {}", storyId);
             return new ContentImageProcessingResult(htmlContent, warnings, downloadedImages);
         }
 
@@ -273,18 +283,18 @@ public class ImageService {
             String imageUrl = matcher.group(1);
             imageCount++;
 
-            logger.info("Found image #{}: {} in tag: {}", imageCount, imageUrl, fullImgTag);
+            logger.debug("Found image #{}: {} in tag: {}", imageCount, imageUrl, fullImgTag);
 
             try {
                 // Skip if it's already a local path or data URL
                 if (imageUrl.startsWith("/") || imageUrl.startsWith("data:")) {
-                    logger.info("Skipping local/data URL: {}", imageUrl);
+                    logger.debug("Skipping local/data URL: {}", imageUrl);
                    matcher.appendReplacement(processedContent, Matcher.quoteReplacement(fullImgTag));
                     continue;
                 }
 
                 externalImageCount++;
-                logger.info("Processing external image #{}: {}", externalImageCount, imageUrl);
+                logger.debug("Processing external image #{}: {}", externalImageCount, imageUrl);
 
                 // Download and store the image
                 String localPath = downloadImageFromUrl(imageUrl, storyId);
@@ -292,7 +302,7 @@ public class ImageService {
 
                 // Generate local URL
                 String localUrl = getLocalImageUrl(storyId, localPath);
-                logger.info("Downloaded image: {} -> {}", imageUrl, localUrl);
+                logger.debug("Downloaded image: {} -> {}", imageUrl, localUrl);
 
                 // Replace the src attribute with the local path - handle both single and double quotes
                 String newImgTag = fullImgTag
@@ -305,7 +315,7 @@ public class ImageService {
                     newImgTag = fullImgTag.replaceAll("src\\s*=\\s*[\"']?" + Pattern.quote(imageUrl) + "[\"']?", "src=\"" + localUrl + "\"");
                 }
 
-                logger.info("Replaced img tag: {} -> {}", fullImgTag, newImgTag);
+                logger.debug("Replaced img tag: {} -> {}", fullImgTag, newImgTag);
                 matcher.appendReplacement(processedContent, Matcher.quoteReplacement(newImgTag));
 
             } catch (Exception e) {
@@ -324,6 +334,101 @@ public class ImageService {
         return new ContentImageProcessingResult(processedContent.toString(), warnings, downloadedImages);
     }
 
+    /**
+     * Functional interface for progress callbacks during image processing
+     */
+    @FunctionalInterface
+    public interface ImageProcessingProgressCallback {
+        void onProgress(String currentImageUrl, int processedCount, int totalCount);
+    }
+
+    /**
+     * Process content images with progress callbacks for async processing
+     */
+    public ContentImageProcessingResult processContentImagesWithProgress(String htmlContent, UUID storyId, ImageProcessingProgressCallback progressCallback) {
+        logger.debug("Processing content images with progress for story: {}, content length: {}", storyId,
+                htmlContent != null ? htmlContent.length() : 0);
+
+        List<String> warnings = new ArrayList<>();
+        List<String> downloadedImages = new ArrayList<>();
+
+        if (htmlContent == null || htmlContent.trim().isEmpty()) {
+            logger.debug("No content to process for story: {}", storyId);
+            return new ContentImageProcessingResult(htmlContent, warnings, downloadedImages);
+        }
+
+        // Find all img tags with src attributes
+        Pattern imgPattern = Pattern.compile("<img[^>]+src=[\"']([^\"']+)[\"'][^>]*>", Pattern.CASE_INSENSITIVE);
+        Matcher matcher = imgPattern.matcher(htmlContent);
+
+        // First pass: count external images
+        List<String> externalImages = new ArrayList<>();
+        Matcher countMatcher = imgPattern.matcher(htmlContent);
+        while (countMatcher.find()) {
+            String imageUrl = countMatcher.group(1);
+            if (!imageUrl.startsWith("/") && !imageUrl.startsWith("data:")) {
+                externalImages.add(imageUrl);
+            }
+        }
+
+        int totalExternalImages = externalImages.size();
+        int processedCount = 0;
+
+        StringBuffer processedContent = new StringBuffer();
+        matcher.reset(); // Reset the matcher for processing
+
+        while (matcher.find()) {
+            String fullImgTag = matcher.group(0);
+            String imageUrl = matcher.group(1);
+
+            logger.debug("Found image: {} in tag: {}", imageUrl, fullImgTag);
+
+            try {
+                // Skip if it's already a local path or data URL
+                if (imageUrl.startsWith("/") || imageUrl.startsWith("data:")) {
+                    logger.debug("Skipping local/data URL: {}", imageUrl);
+                    matcher.appendReplacement(processedContent, Matcher.quoteReplacement(fullImgTag));
+                    continue;
+                }
+
+                // Call progress callback
+                if (progressCallback != null) {
+                    progressCallback.onProgress(imageUrl, processedCount, totalExternalImages);
+                }
+
+                logger.debug("Processing external image #{}: {}", processedCount + 1, imageUrl);
+
+                // Download and store the image
+                String localPath = downloadImageFromUrl(imageUrl, storyId);
|
||||||
|
downloadedImages.add(localPath);
|
||||||
|
|
||||||
|
// Generate local URL
|
||||||
|
String localUrl = getLocalImageUrl(storyId, localPath);
|
||||||
|
logger.debug("Downloaded image: {} -> {}", imageUrl, localUrl);
|
||||||
|
|
||||||
|
// Replace the src attribute with the local path
|
||||||
|
String newImgTag = fullImgTag
|
||||||
|
.replaceFirst("src=\"" + Pattern.quote(imageUrl) + "\"", "src=\"" + localUrl + "\"")
|
||||||
|
.replaceFirst("src='" + Pattern.quote(imageUrl) + "'", "src='" + localUrl + "'");
|
||||||
|
|
||||||
|
matcher.appendReplacement(processedContent, Matcher.quoteReplacement(newImgTag));
|
||||||
|
processedCount++;
|
||||||
|
|
||||||
|
} catch (Exception e) {
|
||||||
|
logger.warn("Failed to download image: {} - Error: {}", imageUrl, e.getMessage());
|
||||||
|
warnings.add("Failed to download image: " + imageUrl + " - " + e.getMessage());
|
||||||
|
matcher.appendReplacement(processedContent, Matcher.quoteReplacement(fullImgTag));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
matcher.appendTail(processedContent);
|
||||||
|
|
||||||
|
logger.info("Processed {} external images for story: {} (Total: {}, Downloaded: {}, Warnings: {})",
|
||||||
|
processedCount, storyId, processedCount, downloadedImages.size(), warnings.size());
|
||||||
|
|
||||||
|
return new ContentImageProcessingResult(processedContent.toString(), warnings, downloadedImages);
|
||||||
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Download an image from a URL and store it locally
|
* Download an image from a URL and store it locally
|
||||||
*/
|
*/
|
||||||
@@ -388,7 +493,7 @@ public class ImageService {
|
|||||||
return "/api/files/images/default/" + imagePath;
|
return "/api/files/images/default/" + imagePath;
|
||||||
}
|
}
|
||||||
String localUrl = "/api/files/images/" + currentLibraryId + "/" + imagePath;
|
String localUrl = "/api/files/images/" + currentLibraryId + "/" + imagePath;
|
||||||
logger.info("Generated local image URL: {} for story: {}", localUrl, storyId);
|
logger.debug("Generated local image URL: {} for story: {}", localUrl, storyId);
|
||||||
return localUrl;
|
return localUrl;
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -437,25 +542,26 @@ public class ImageService {
|
|||||||
int foldersToDelete = 0;
|
int foldersToDelete = 0;
|
||||||
|
|
||||||
// Step 1: Collect all image references from all story content
|
// Step 1: Collect all image references from all story content
|
||||||
logger.info("Scanning all story content for image references...");
|
logger.debug("Scanning all story content for image references...");
|
||||||
referencedImages = collectAllImageReferences();
|
referencedImages = collectAllImageReferences();
|
||||||
logger.info("Found {} unique image references in story content", referencedImages.size());
|
logger.debug("Found {} unique image references in story content", referencedImages.size());
|
||||||
|
|
||||||
try {
|
try {
|
||||||
// Step 2: Scan the content images directory
|
// Step 2: Scan the content images directory
|
||||||
Path contentImagesDir = Paths.get(getUploadDir(), ImageType.CONTENT.getDirectory());
|
Path contentImagesDir = Paths.get(getUploadDir(), ImageType.CONTENT.getDirectory());
|
||||||
|
|
||||||
if (!Files.exists(contentImagesDir)) {
|
if (!Files.exists(contentImagesDir)) {
|
||||||
logger.info("Content images directory does not exist: {}", contentImagesDir);
|
logger.debug("Content images directory does not exist: {}", contentImagesDir);
|
||||||
return new ContentImageCleanupResult(orphanedImages, 0, 0, referencedImages.size(), errors, dryRun);
|
return new ContentImageCleanupResult(orphanedImages, 0, 0, referencedImages.size(), errors, dryRun);
|
||||||
}
|
}
|
||||||
|
|
||||||
logger.info("Scanning content images directory: {}", contentImagesDir);
|
logger.debug("Scanning content images directory: {}", contentImagesDir);
|
||||||
|
|
||||||
// Walk through all story directories
|
// Walk through all story directories
|
||||||
Files.walk(contentImagesDir, 2)
|
Files.walk(contentImagesDir, 2)
|
||||||
.filter(Files::isDirectory)
|
.filter(Files::isDirectory)
|
||||||
.filter(path -> !path.equals(contentImagesDir)) // Skip the root content directory
|
.filter(path -> !path.equals(contentImagesDir)) // Skip the root content directory
|
||||||
|
.filter(path -> !isSynologySystemPath(path)) // Skip Synology system directories
|
||||||
.forEach(storyDir -> {
|
.forEach(storyDir -> {
|
||||||
try {
|
try {
|
||||||
String storyId = storyDir.getFileName().toString();
|
String storyId = storyDir.getFileName().toString();
|
||||||
@@ -465,11 +571,13 @@ public class ImageService {
|
|||||||
boolean storyExists = storyService.findByIdOptional(UUID.fromString(storyId)).isPresent();
|
boolean storyExists = storyService.findByIdOptional(UUID.fromString(storyId)).isPresent();
|
||||||
|
|
||||||
if (!storyExists) {
|
if (!storyExists) {
|
||||||
logger.info("Found orphaned story directory (story deleted): {}", storyId);
|
logger.debug("Found orphaned story directory (story deleted): {}", storyId);
|
||||||
// Mark entire directory for deletion
|
// Mark entire directory for deletion
|
||||||
try {
|
try {
|
||||||
Files.walk(storyDir)
|
Files.walk(storyDir)
|
||||||
.filter(Files::isRegularFile)
|
.filter(Files::isRegularFile)
|
||||||
|
.filter(path -> !isSynologySystemPath(path)) // Skip Synology system files
|
||||||
|
.filter(path -> isValidImageFile(path)) // Only process actual image files
|
||||||
.forEach(file -> {
|
.forEach(file -> {
|
||||||
try {
|
try {
|
||||||
long size = Files.size(file);
|
long size = Files.size(file);
|
||||||
@@ -489,13 +597,18 @@ public class ImageService {
|
|||||||
try {
|
try {
|
||||||
Files.walk(storyDir)
|
Files.walk(storyDir)
|
||||||
.filter(Files::isRegularFile)
|
.filter(Files::isRegularFile)
|
||||||
|
.filter(path -> !isSynologySystemPath(path)) // Skip Synology system files
|
||||||
|
.filter(path -> isValidImageFile(path)) // Only process actual image files
|
||||||
.forEach(imageFile -> {
|
.forEach(imageFile -> {
|
||||||
try {
|
try {
|
||||||
String imagePath = getRelativeImagePath(imageFile);
|
String filename = imageFile.getFileName().toString();
|
||||||
|
|
||||||
if (!referencedImages.contains(imagePath)) {
|
// Only consider it orphaned if it's not in our referenced filenames
|
||||||
logger.debug("Found orphaned image: {}", imagePath);
|
if (!referencedImages.contains(filename)) {
|
||||||
|
logger.debug("Found orphaned image: {}", filename);
|
||||||
orphanedImages.add(imageFile.toString());
|
orphanedImages.add(imageFile.toString());
|
||||||
|
} else {
|
||||||
|
logger.debug("Image file is referenced, keeping: {}", filename);
|
||||||
}
|
}
|
||||||
} catch (Exception e) {
|
} catch (Exception e) {
|
||||||
errors.add("Error checking image file " + imageFile + ": " + e.getMessage());
|
errors.add("Error checking image file " + imageFile + ": " + e.getMessage());
|
||||||
@@ -535,7 +648,7 @@ public class ImageService {
|
|||||||
|
|
||||||
// Step 3: Delete orphaned files if not dry run
|
// Step 3: Delete orphaned files if not dry run
|
||||||
if (!dryRun && !orphanedImages.isEmpty()) {
|
if (!dryRun && !orphanedImages.isEmpty()) {
|
||||||
logger.info("Deleting {} orphaned images...", orphanedImages.size());
|
logger.debug("Deleting {} orphaned images...", orphanedImages.size());
|
||||||
|
|
||||||
Set<Path> directoriesToCheck = new HashSet<>();
|
Set<Path> directoriesToCheck = new HashSet<>();
|
||||||
|
|
||||||
@@ -557,7 +670,7 @@ public class ImageService {
|
|||||||
try {
|
try {
|
||||||
if (Files.exists(dir) && isDirEmpty(dir)) {
|
if (Files.exists(dir) && isDirEmpty(dir)) {
|
||||||
Files.delete(dir);
|
Files.delete(dir);
|
||||||
logger.info("Deleted empty story directory: {}", dir);
|
logger.debug("Deleted empty story directory: {}", dir);
|
||||||
}
|
}
|
||||||
} catch (IOException e) {
|
} catch (IOException e) {
|
||||||
errors.add("Failed to delete empty directory " + dir + ": " + e.getMessage());
|
errors.add("Failed to delete empty directory " + dir + ": " + e.getMessage());
|
||||||
@@ -577,10 +690,10 @@ public class ImageService {
|
|||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Collect all image references from all story content
|
* Collect all image filenames referenced in content (UUID-based filenames only)
|
||||||
*/
|
*/
|
||||||
private Set<String> collectAllImageReferences() {
|
private Set<String> collectAllImageReferences() {
|
||||||
Set<String> referencedImages = new HashSet<>();
|
Set<String> referencedFilenames = new HashSet<>();
|
||||||
|
|
||||||
try {
|
try {
|
||||||
// Get all stories
|
// Get all stories
|
||||||
@@ -590,27 +703,70 @@ public class ImageService {
|
|||||||
Pattern imagePattern = Pattern.compile("src=[\"']([^\"']*(?:content/[^\"']*\\.(jpg|jpeg|png)))[\"']", Pattern.CASE_INSENSITIVE);
|
Pattern imagePattern = Pattern.compile("src=[\"']([^\"']*(?:content/[^\"']*\\.(jpg|jpeg|png)))[\"']", Pattern.CASE_INSENSITIVE);
|
||||||
|
|
||||||
for (com.storycove.entity.Story story : allStories) {
|
for (com.storycove.entity.Story story : allStories) {
|
||||||
|
// Add story cover image filename if present
|
||||||
|
if (story.getCoverPath() != null && !story.getCoverPath().trim().isEmpty()) {
|
||||||
|
String filename = extractFilename(story.getCoverPath());
|
||||||
|
if (filename != null) {
|
||||||
|
referencedFilenames.add(filename);
|
||||||
|
logger.debug("Found cover image filename in story {}: {}", story.getId(), filename);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Add author avatar image filename if present
|
||||||
|
if (story.getAuthor() != null && story.getAuthor().getAvatarImagePath() != null && !story.getAuthor().getAvatarImagePath().trim().isEmpty()) {
|
||||||
|
String filename = extractFilename(story.getAuthor().getAvatarImagePath());
|
||||||
|
if (filename != null) {
|
||||||
|
referencedFilenames.add(filename);
|
||||||
|
logger.debug("Found avatar image filename for author {}: {}", story.getAuthor().getId(), filename);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Add content images from HTML
|
||||||
if (story.getContentHtml() != null) {
|
if (story.getContentHtml() != null) {
|
||||||
Matcher matcher = imagePattern.matcher(story.getContentHtml());
|
Matcher matcher = imagePattern.matcher(story.getContentHtml());
|
||||||
|
|
||||||
while (matcher.find()) {
|
while (matcher.find()) {
|
||||||
String imageSrc = matcher.group(1);
|
String imageSrc = matcher.group(1);
|
||||||
|
|
||||||
// Convert to relative path format that matches our file system
|
// Extract just the filename from the URL
|
||||||
String relativePath = convertSrcToRelativePath(imageSrc);
|
String filename = extractFilename(imageSrc);
|
||||||
if (relativePath != null) {
|
if (filename != null && isUuidBasedFilename(filename)) {
|
||||||
referencedImages.add(relativePath);
|
referencedFilenames.add(filename);
|
||||||
logger.debug("Found image reference in story {}: {}", story.getId(), relativePath);
|
logger.debug("Found content image filename in story {}: {}", story.getId(), filename);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Also get all authors separately to catch avatars for authors without stories
|
||||||
|
List<com.storycove.entity.Author> allAuthors = authorService.findAll();
|
||||||
|
for (com.storycove.entity.Author author : allAuthors) {
|
||||||
|
if (author.getAvatarImagePath() != null && !author.getAvatarImagePath().trim().isEmpty()) {
|
||||||
|
String filename = extractFilename(author.getAvatarImagePath());
|
||||||
|
if (filename != null) {
|
||||||
|
referencedFilenames.add(filename);
|
||||||
|
logger.debug("Found standalone avatar image filename for author {}: {}", author.getId(), filename);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Also get all collections to catch cover images
|
||||||
|
List<com.storycove.entity.Collection> allCollections = collectionService.findAllWithTags();
|
||||||
|
for (com.storycove.entity.Collection collection : allCollections) {
|
||||||
|
if (collection.getCoverImagePath() != null && !collection.getCoverImagePath().trim().isEmpty()) {
|
||||||
|
String filename = extractFilename(collection.getCoverImagePath());
|
||||||
|
if (filename != null) {
|
||||||
|
referencedFilenames.add(filename);
|
||||||
|
logger.debug("Found collection cover image filename for collection {}: {}", collection.getId(), filename);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
} catch (Exception e) {
|
} catch (Exception e) {
|
||||||
logger.error("Error collecting image references from stories", e);
|
logger.error("Error collecting image references from stories", e);
|
||||||
}
|
}
|
||||||
|
|
||||||
return referencedImages;
|
return referencedFilenames;
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@@ -629,6 +785,64 @@ public class ImageService {
|
|||||||
return null;
|
return null;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Convert absolute file path to relative path from upload directory
|
||||||
|
*/
|
||||||
|
private String convertAbsolutePathToRelative(String absolutePath) {
|
||||||
|
try {
|
||||||
|
if (absolutePath == null || absolutePath.trim().isEmpty()) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
Path absPath = Paths.get(absolutePath);
|
||||||
|
Path uploadDirPath = Paths.get(getUploadDir());
|
||||||
|
|
||||||
|
// If the path is already relative to upload dir, return as-is
|
||||||
|
if (!absPath.isAbsolute()) {
|
||||||
|
return absolutePath.replace('\\', '/');
|
||||||
|
}
|
||||||
|
|
||||||
|
// Try to make it relative to the upload directory
|
||||||
|
if (absPath.startsWith(uploadDirPath)) {
|
||||||
|
Path relativePath = uploadDirPath.relativize(absPath);
|
||||||
|
return relativePath.toString().replace('\\', '/');
|
||||||
|
}
|
||||||
|
|
||||||
|
// If it's not under upload directory, check if it's library-specific path
|
||||||
|
String libraryPath = libraryService.getCurrentImagePath();
|
||||||
|
Path baseUploadPath = Paths.get(baseUploadDir);
|
||||||
|
|
||||||
|
if (absPath.startsWith(baseUploadPath)) {
|
||||||
|
Path relativePath = baseUploadPath.relativize(absPath);
|
||||||
|
String relativeStr = relativePath.toString().replace('\\', '/');
|
||||||
|
|
||||||
|
// Remove library prefix if present to make it library-agnostic for comparison
|
||||||
|
if (relativeStr.startsWith(libraryPath.substring(1))) { // Remove leading slash from library path
|
||||||
|
return relativeStr.substring(libraryPath.length() - 1); // Keep the leading slash
|
||||||
|
}
|
||||||
|
return relativeStr;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Fallback: just use the filename portion if it's in the right structure
|
||||||
|
String fileName = absPath.getFileName().toString();
|
||||||
|
if (fileName.matches(".*\\.(jpg|jpeg|png)$")) {
|
||||||
|
// Try to preserve directory structure if it looks like covers/ or avatars/
|
||||||
|
Path parent = absPath.getParent();
|
||||||
|
if (parent != null) {
|
||||||
|
String parentName = parent.getFileName().toString();
|
||||||
|
if (parentName.equals("covers") || parentName.equals("avatars")) {
|
||||||
|
return parentName + "/" + fileName;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return fileName;
|
||||||
|
}
|
||||||
|
|
||||||
|
} catch (Exception e) {
|
||||||
|
logger.debug("Failed to convert absolute path to relative: {}", absolutePath, e);
|
||||||
|
}
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Get relative image path from absolute file path
|
* Get relative image path from absolute file path
|
||||||
*/
|
*/
|
||||||
@@ -741,4 +955,115 @@ public class ImageService {
|
|||||||
return String.format("%.1f GB", totalSizeBytes / (1024.0 * 1024.0 * 1024.0));
|
return String.format("%.1f GB", totalSizeBytes / (1024.0 * 1024.0 * 1024.0));
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Check if a path is a Synology system path that should be ignored
|
||||||
|
*/
|
||||||
|
private boolean isSynologySystemPath(Path path) {
|
||||||
|
String pathStr = path.toString();
|
||||||
|
String fileName = path.getFileName().toString();
|
||||||
|
|
||||||
|
// Skip Synology metadata directories and files
|
||||||
|
return pathStr.contains("@eaDir") ||
|
||||||
|
fileName.startsWith("@") ||
|
||||||
|
fileName.contains("@SynoEAStream") ||
|
||||||
|
fileName.startsWith(".") ||
|
||||||
|
fileName.equals("Thumbs.db") ||
|
||||||
|
fileName.equals(".DS_Store");
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Check if a file is a valid image file (not a system/metadata file)
|
||||||
|
*/
|
||||||
|
private boolean isValidImageFile(Path path) {
|
||||||
|
if (isSynologySystemPath(path)) {
|
||||||
|
return false;
|
||||||
|
}
|
||||||
|
|
||||||
|
String fileName = path.getFileName().toString().toLowerCase();
|
||||||
|
return fileName.endsWith(".jpg") ||
|
||||||
|
fileName.endsWith(".jpeg") ||
|
||||||
|
fileName.endsWith(".png") ||
|
||||||
|
fileName.endsWith(".gif") ||
|
||||||
|
fileName.endsWith(".webp");
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Extract filename from a path or URL
|
||||||
|
*/
|
||||||
|
private String extractFilename(String pathOrUrl) {
|
||||||
|
if (pathOrUrl == null || pathOrUrl.trim().isEmpty()) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
try {
|
||||||
|
// Remove query parameters if present
|
||||||
|
if (pathOrUrl.contains("?")) {
|
||||||
|
pathOrUrl = pathOrUrl.substring(0, pathOrUrl.indexOf("?"));
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get the last part after slash
|
||||||
|
String filename = pathOrUrl.substring(pathOrUrl.lastIndexOf("/") + 1);
|
||||||
|
|
||||||
|
// Remove any special Synology suffixes
|
||||||
|
filename = filename.replace("@SynoEAStream", "");
|
||||||
|
|
||||||
|
return filename.trim().isEmpty() ? null : filename;
|
||||||
|
} catch (Exception e) {
|
||||||
|
logger.debug("Failed to extract filename from: {}", pathOrUrl);
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Check if a filename follows UUID pattern (indicates it's our generated file)
|
||||||
|
*/
|
||||||
|
private boolean isUuidBasedFilename(String filename) {
|
||||||
|
if (filename == null || filename.trim().isEmpty()) {
|
||||||
|
return false;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Remove extension
|
||||||
|
String nameWithoutExt = filename;
|
||||||
|
int lastDot = filename.lastIndexOf(".");
|
||||||
|
if (lastDot > 0) {
|
||||||
|
nameWithoutExt = filename.substring(0, lastDot);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check if it matches UUID pattern (8-4-4-4-12 hex characters)
|
||||||
|
return nameWithoutExt.matches("[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}");
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Event listener for story content updates - processes external images asynchronously
|
||||||
|
*/
|
||||||
|
@EventListener
|
||||||
|
@Async
|
||||||
|
public void handleStoryContentUpdated(StoryContentUpdatedEvent event) {
|
||||||
|
logger.info("Processing images for {} story {} after content update",
|
||||||
|
event.isNewStory() ? "new" : "updated", event.getStoryId());
|
||||||
|
|
||||||
|
try {
|
||||||
|
ContentImageProcessingResult result = processContentImages(event.getContentHtml(), event.getStoryId());
|
||||||
|
|
||||||
|
// If content was changed, we need to update the story (but this could cause circular events)
|
||||||
|
// Instead, let's just log the results for now and let the controller handle updates if needed
|
||||||
|
if (result.hasWarnings()) {
|
||||||
|
logger.warn("Image processing warnings for story {}: {}", event.getStoryId(), result.getWarnings());
|
||||||
|
}
|
||||||
|
if (!result.getDownloadedImages().isEmpty()) {
|
||||||
|
logger.info("Downloaded {} external images for story {}: {}",
|
||||||
|
result.getDownloadedImages().size(), event.getStoryId(), result.getDownloadedImages());
|
||||||
|
}
|
||||||
|
|
||||||
|
// TODO: If content was changed, we might need a way to update the story without triggering another event
|
||||||
|
if (!result.getProcessedContent().equals(event.getContentHtml())) {
|
||||||
|
logger.info("Story {} content was processed and external images were replaced with local URLs", event.getStoryId());
|
||||||
|
// For now, just log that processing occurred - the original content processing already handles updates
|
||||||
|
}
|
||||||
|
|
||||||
|
} catch (Exception e) {
|
||||||
|
logger.error("Failed to process images for story {}: {}", event.getStoryId(), e.getMessage(), e);
|
||||||
|
}
|
||||||
|
}
|
||||||
}
|
}
|
||||||
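
To make the new progress-callback contract concrete, here is a minimal usage sketch (not part of the commit). Only `processContentImagesWithProgress`, the `ImageProcessingProgressCallback` interface, and the result accessors shown in the diff above are taken from the change; the surrounding caller class, the injected `ImageService` field, and the use of Java 10+ `var` are illustrative assumptions.

```java
import java.util.UUID;

public class ImageProcessingExample {

    private final ImageService imageService; // assumed to be injected by Spring

    public ImageProcessingExample(ImageService imageService) {
        this.imageService = imageService;
    }

    public String processWithLogging(String html, UUID storyId) {
        // The lambda implements ImageProcessingProgressCallback.onProgress and is
        // invoked once per external image before it is downloaded.
        var result = imageService.processContentImagesWithProgress(html, storyId,
                (currentUrl, processed, total) ->
                        System.out.printf("Story %s: image %d of %d (%s)%n",
                                storyId, processed + 1, total, currentUrl));

        // Processed HTML has external src attributes rewritten to local URLs;
        // warnings and downloaded paths are available on the result as well.
        return result.getProcessedContent();
    }
}
```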
@@ -115,7 +115,7 @@ public class LibraryService implements ApplicationContextAware {

 /**
 * Switch to library after authentication with forced reindexing
-* This ensures OpenSearch is always up-to-date after login
+* This ensures Solr is always up-to-date after login
 */
 public synchronized void switchToLibraryAfterAuthentication(String libraryId) throws Exception {
 logger.info("Switching to library after authentication: {} (forcing reindex)", libraryId);
@@ -144,9 +144,9 @@ public class LibraryService implements ApplicationContextAware {
 String previousLibraryId = currentLibraryId;

 if (libraryId.equals(currentLibraryId) && forceReindex) {
-logger.info("Forcing reindex for current library: {} ({})", library.getName(), libraryId);
+logger.debug("Forcing reindex for current library: {} ({})", library.getName(), libraryId);
 } else {
-logger.info("Switching to library: {} ({})", library.getName(), libraryId);
+logger.debug("Switching to library: {} ({})", library.getName(), libraryId);
 }

 // Close current resources
@@ -154,15 +154,15 @@ public class LibraryService implements ApplicationContextAware {

 // Set new active library (datasource routing handled by SmartRoutingDataSource)
 currentLibraryId = libraryId;
-// OpenSearch indexes are global - no per-library initialization needed
-logger.info("Library switched to OpenSearch mode for library: {}", libraryId);
+// Solr indexes are global - no per-library initialization needed
+logger.debug("Library switched to Solr mode for library: {}", libraryId);

 logger.info("Successfully switched to library: {}", library.getName());

 // Perform complete reindex AFTER library switch is fully complete
 // This ensures database routing is properly established
 if (forceReindex || !libraryId.equals(previousLibraryId)) {
-logger.info("Starting post-switch OpenSearch reindex for library: {}", libraryId);
+logger.debug("Starting post-switch Solr reindex for library: {}", libraryId);

 // Run reindex asynchronously to avoid blocking authentication response
 // and allow time for database routing to fully stabilize
@@ -171,7 +171,7 @@ public class LibraryService implements ApplicationContextAware {
 try {
 // Give routing time to stabilize
 Thread.sleep(500);
-logger.info("Starting async OpenSearch reindex for library: {}", finalLibraryId);
+logger.debug("Starting async Solr reindex for library: {}", finalLibraryId);

 SearchServiceAdapter searchService = applicationContext.getBean(SearchServiceAdapter.class);
 // Get all stories and authors for reindexing
@@ -184,12 +184,12 @@ public class LibraryService implements ApplicationContextAware {
 searchService.bulkIndexStories(allStories);
 searchService.bulkIndexAuthors(allAuthors);

-logger.info("Completed async OpenSearch reindexing for library: {} ({} stories, {} authors)",
+logger.info("Completed async Solr reindexing for library: {} ({} stories, {} authors)",
 finalLibraryId, allStories.size(), allAuthors.size());
 } catch (Exception e) {
-logger.warn("Failed to async reindex OpenSearch for library {}: {}", finalLibraryId, e.getMessage());
+logger.warn("Failed to async reindex Solr for library {}: {}", finalLibraryId, e.getMessage());
 }
-}, "OpenSearchReindex-" + libraryId).start();
+}, "SolrReindex-" + libraryId).start();
 }
 }

@@ -342,10 +342,10 @@ public class LibraryService implements ApplicationContextAware {
 library.setInitialized((Boolean) data.getOrDefault("initialized", false));

 libraries.put(id, library);
-logger.info("Loaded library: {} ({})", library.getName(), id);
+logger.debug("Loaded library: {} ({})", library.getName(), id);
 }
 } else {
-logger.info("No libraries configuration file found, will create default");
+logger.debug("No libraries configuration file found, will create default");
 }
 } catch (IOException e) {
 logger.error("Failed to load libraries configuration", e);
@@ -411,7 +411,7 @@ public class LibraryService implements ApplicationContextAware {
 String json = objectMapper.writerWithDefaultPrettyPrinter().writeValueAsString(config);
 Files.writeString(Paths.get(LIBRARIES_CONFIG_PATH), json);

-logger.info("Saved libraries configuration");
+logger.debug("Saved libraries configuration");
 } catch (IOException e) {
 logger.error("Failed to save libraries configuration", e);
 }
@@ -419,7 +419,7 @@ public class LibraryService implements ApplicationContextAware {

 private DataSource createDataSource(String dbName) {
 String url = baseDbUrl.replaceAll("/[^/]*$", "/" + dbName);
-logger.info("Creating DataSource for: {}", url);
+logger.debug("Creating DataSource for: {}", url);

 // First, ensure the database exists
 ensureDatabaseExists(dbName);
@@ -459,7 +459,7 @@ public class LibraryService implements ApplicationContextAware {
 preparedStatement.setString(1, dbName);
 try (var resultSet = preparedStatement.executeQuery()) {
 if (resultSet.next()) {
-logger.info("Database {} already exists", dbName);
+logger.debug("Database {} already exists", dbName);
 return; // Database exists, nothing to do
 }
 }
@@ -488,7 +488,7 @@ public class LibraryService implements ApplicationContextAware {
 }

 private void initializeNewDatabaseSchema(String dbName) {
-logger.info("Initializing schema for new database: {}", dbName);
+logger.debug("Initializing schema for new database: {}", dbName);

 // Create a temporary DataSource for the new database to initialize schema
 String newDbUrl = baseDbUrl.replaceAll("/[^/]*$", "/" + dbName);
@@ -505,7 +505,7 @@ public class LibraryService implements ApplicationContextAware {
 // Use Hibernate to create the schema
 // This mimics what Spring Boot does during startup
 createSchemaUsingHibernate(tempDataSource);
-logger.info("Schema initialized for database: {}", dbName);
+logger.debug("Schema initialized for database: {}", dbName);

 } catch (Exception e) {
 logger.error("Failed to initialize schema for database {}: {}", dbName, e.getMessage());
@@ -520,15 +520,15 @@ public class LibraryService implements ApplicationContextAware {
 }

 try {
-logger.info("Initializing resources for new library: {}", library.getName());
+logger.debug("Initializing resources for new library: {}", library.getName());

 // 1. Create image directory structure
 initializeImageDirectories(library);

-// 2. OpenSearch indexes are global and managed automatically
-// No per-library initialization needed for OpenSearch
+// 2. Solr indexes are global and managed automatically
+// No per-library initialization needed for Solr

-logger.info("Successfully initialized resources for library: {}", library.getName());
+logger.debug("Successfully initialized resources for library: {}", library.getName());

 } catch (Exception e) {
 logger.error("Failed to initialize resources for library {}: {}", libraryId, e.getMessage());
@@ -544,16 +544,16 @@ public class LibraryService implements ApplicationContextAware {

 if (!java.nio.file.Files.exists(libraryImagePath)) {
 java.nio.file.Files.createDirectories(libraryImagePath);
-logger.info("Created image directory: {}", imagePath);
+logger.debug("Created image directory: {}", imagePath);

 // Create subdirectories for different image types
 java.nio.file.Files.createDirectories(libraryImagePath.resolve("stories"));
 java.nio.file.Files.createDirectories(libraryImagePath.resolve("authors"));
 java.nio.file.Files.createDirectories(libraryImagePath.resolve("collections"));

-logger.info("Created image subdirectories for library: {}", library.getId());
+logger.debug("Created image subdirectories for library: {}", library.getId());
 } else {
-logger.info("Image directory already exists: {}", imagePath);
+logger.debug("Image directory already exists: {}", imagePath);
 }

 } catch (Exception e) {
@@ -749,7 +749,7 @@ public class LibraryService implements ApplicationContextAware {
 statement.executeUpdate(sql);
 }

-logger.info("Successfully created all database tables and constraints");
+logger.debug("Successfully created all database tables and constraints");

 } catch (SQLException e) {
 logger.error("Failed to create database schema", e);
@@ -760,7 +760,7 @@ public class LibraryService implements ApplicationContextAware {

 private void closeCurrentResources() {
 // No need to close datasource - SmartRoutingDataSource handles this
-// OpenSearch service is managed by Spring - no explicit cleanup needed
+// Solr service is managed by Spring - no explicit cleanup needed
 // Don't clear currentLibraryId here - only when explicitly switching
 }

@@ -1,133 +0,0 @@
-package com.storycove.service;
-
-import com.storycove.config.OpenSearchProperties;
-import org.opensearch.client.opensearch.OpenSearchClient;
-import org.opensearch.client.opensearch.cluster.HealthRequest;
-import org.opensearch.client.opensearch.cluster.HealthResponse;
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-import org.springframework.beans.factory.annotation.Autowired;
-import org.springframework.boot.actuate.health.Health;
-import org.springframework.boot.actuate.health.HealthIndicator;
-import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
-import org.springframework.scheduling.annotation.Scheduled;
-import org.springframework.stereotype.Service;
-
-import java.time.LocalDateTime;
-import java.util.concurrent.atomic.AtomicReference;
-
-@Service
-@ConditionalOnProperty(name = "storycove.search.engine", havingValue = "opensearch")
-public class OpenSearchHealthService implements HealthIndicator {
-
-private static final Logger logger = LoggerFactory.getLogger(OpenSearchHealthService.class);
-
-private final OpenSearchClient openSearchClient;
-private final OpenSearchProperties properties;
-
-private final AtomicReference<Health> lastKnownHealth = new AtomicReference<>(Health.unknown().build());
-private LocalDateTime lastCheckTime = LocalDateTime.now();
-
-@Autowired
-public OpenSearchHealthService(OpenSearchClient openSearchClient, OpenSearchProperties properties) {
-this.openSearchClient = openSearchClient;
-this.properties = properties;
-}
-
-@Override
-public Health health() {
-return lastKnownHealth.get();
-}
-
-@Scheduled(fixedDelayString = "#{@openSearchProperties.health.checkInterval}")
-public void performHealthCheck() {
-try {
-HealthResponse clusterHealth = openSearchClient.cluster().health(
-HealthRequest.of(h -> h.timeout(t -> t.time("10s")))
-);
-
-Health.Builder healthBuilder = Health.up()
-.withDetail("cluster_name", clusterHealth.clusterName())
-.withDetail("status", clusterHealth.status().jsonValue())
-.withDetail("number_of_nodes", clusterHealth.numberOfNodes())
-.withDetail("number_of_data_nodes", clusterHealth.numberOfDataNodes())
-.withDetail("active_primary_shards", clusterHealth.activePrimaryShards())
-.withDetail("active_shards", clusterHealth.activeShards())
-.withDetail("relocating_shards", clusterHealth.relocatingShards())
-.withDetail("initializing_shards", clusterHealth.initializingShards())
-.withDetail("unassigned_shards", clusterHealth.unassignedShards())
-.withDetail("last_check", LocalDateTime.now());
-
-// Check if cluster status is concerning
-switch (clusterHealth.status()) {
-case Red:
-healthBuilder = Health.down()
-.withDetail("reason", "Cluster status is RED - some primary shards are unassigned");
-break;
-case Yellow:
-if (isProduction()) {
-healthBuilder = Health.down()
-.withDetail("reason", "Cluster status is YELLOW - some replica shards are unassigned (critical in production)");
-} else {
-// Yellow is acceptable in development (single node clusters)
-healthBuilder.withDetail("warning", "Cluster status is YELLOW - acceptable for development");
-}
-break;
-case Green:
-// All good
-break;
-}
-
-lastKnownHealth.set(healthBuilder.build());
-lastCheckTime = LocalDateTime.now();
-
-if (properties.getHealth().isEnableMetrics()) {
-logMetrics(clusterHealth);
-}
-
-} catch (Exception e) {
-logger.error("OpenSearch health check failed", e);
-Health unhealthyStatus = Health.down()
-.withDetail("error", e.getMessage())
-.withDetail("last_successful_check", lastCheckTime)
-.withDetail("current_time", LocalDateTime.now())
-.build();
-lastKnownHealth.set(unhealthyStatus);
-}
-}
-
-private void logMetrics(HealthResponse clusterHealth) {
-logger.info("OpenSearch Cluster Metrics - Status: {}, Nodes: {}, Active Shards: {}, Unassigned: {}",
-clusterHealth.status().jsonValue(),
-clusterHealth.numberOfNodes(),
-clusterHealth.activeShards(),
-clusterHealth.unassignedShards());
-}
-
-private boolean isProduction() {
-return "production".equalsIgnoreCase(properties.getProfile());
-}
-
-/**
- * Manual health check for immediate status
- */
-public boolean isClusterHealthy() {
-Health currentHealth = lastKnownHealth.get();
-return currentHealth.getStatus() == org.springframework.boot.actuate.health.Status.UP;
-}
-
-/**
- * Get detailed cluster information
- */
-public String getClusterInfo() {
-try {
-var info = openSearchClient.info();
-return String.format("OpenSearch %s (Cluster: %s, Lucene: %s)",
-info.version().number(),
-info.clusterName(),
-info.version().luceneVersion());
-} catch (Exception e) {
-return "Unable to retrieve cluster information: " + e.getMessage();
-}
-}
-}

File diff suppressed because it is too large
@@ -16,7 +16,7 @@ import java.util.UUID;
 /**
 * Service adapter that provides a unified interface for search operations.
 *
-* This adapter directly delegates to OpenSearchService.
+* This adapter directly delegates to SolrService.
 */
 @Service
 public class SearchServiceAdapter {
@@ -24,7 +24,7 @@ public class SearchServiceAdapter {
 private static final Logger logger = LoggerFactory.getLogger(SearchServiceAdapter.class);

 @Autowired
-private OpenSearchService openSearchService;
+private SolrService solrService;

 // ===============================
 // SEARCH OPERATIONS
@@ -46,11 +46,20 @@ public class SearchServiceAdapter {
 String sourceDomain, String seriesFilter,
 Integer minTagCount, Boolean popularOnly,
 Boolean hiddenGemsOnly) {
-return openSearchService.searchStories(query, tags, author, series, minWordCount, maxWordCount,
-minRating, isRead, isFavorite, sortBy, sortOrder, page, size, facetBy,
-createdAfter, createdBefore, lastReadAfter, lastReadBefore, unratedOnly, readingStatus,
-hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter, minTagCount, popularOnly,
-hiddenGemsOnly);
+logger.info("SearchServiceAdapter: delegating search to SolrService");
+try {
+SearchResultDto<StorySearchDto> result = solrService.searchStories(query, tags, author, series, minWordCount, maxWordCount,
+minRating, isRead, isFavorite, sortBy, sortOrder, page, size, facetBy,
+createdAfter, createdBefore, lastReadAfter, lastReadBefore, unratedOnly, readingStatus,
+hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter, minTagCount, popularOnly,
+hiddenGemsOnly);
+logger.info("SearchServiceAdapter: received result with {} stories and {} facets",
+result.getResults().size(), result.getFacets().size());
+return result;
+} catch (Exception e) {
+logger.error("SearchServiceAdapter: error during search", e);
+throw e;
+}
 }

 /**
@@ -60,7 +69,7 @@ public class SearchServiceAdapter {
 String series, Integer minWordCount, Integer maxWordCount,
 Float minRating, Boolean isRead, Boolean isFavorite,
 Long seed) {
-return openSearchService.getRandomStories(count, tags, author, series, minWordCount, maxWordCount,
+return solrService.getRandomStories(count, tags, author, series, minWordCount, maxWordCount,
 minRating, isRead, isFavorite, seed);
 }

@@ -69,7 +78,7 @@ public class SearchServiceAdapter {
 */
 public void recreateIndices() {
 try {
-openSearchService.recreateIndices();
+solrService.recreateIndices();
 } catch (Exception e) {
 logger.error("Failed to recreate search indices", e);
 throw new RuntimeException("Failed to recreate search indices", e);
@@ -93,21 +102,21 @@ public class SearchServiceAdapter {
 * Get random story ID with unified interface
 */
 public String getRandomStoryId(Long seed) {
-return openSearchService.getRandomStoryId(seed);
+return solrService.getRandomStoryId(seed);
 }

 /**
 * Search authors with unified interface
 */
 public List<AuthorSearchDto> searchAuthors(String query, int limit) {
-return openSearchService.searchAuthors(query, limit);
+return solrService.searchAuthors(query, limit);
 }

 /**
 * Get tag suggestions with unified interface
 */
 public List<String> getTagSuggestions(String query, int limit) {
-return openSearchService.getTagSuggestions(query, limit);
+return solrService.getTagSuggestions(query, limit);
 }

 // ===============================
@@ -115,88 +124,88 @@ public class SearchServiceAdapter {
 // ===============================

 /**
-* Index a story in OpenSearch
+* Index a story in Solr
 */
 public void indexStory(Story story) {
 try {
-openSearchService.indexStory(story);
+solrService.indexStory(story);
 } catch (Exception e) {
 logger.error("Failed to index story {}", story.getId(), e);
 }
 }

 /**
-* Update a story in OpenSearch
+* Update a story in Solr
 */
 public void updateStory(Story story) {
 try {
-openSearchService.updateStory(story);
+solrService.updateStory(story);
 } catch (Exception e) {
 logger.error("Failed to update story {}", story.getId(), e);
 }
 }

 /**
-* Delete a story from OpenSearch
+* Delete a story from Solr
 */
 public void deleteStory(UUID storyId) {
 try {
-openSearchService.deleteStory(storyId);
+solrService.deleteStory(storyId);
 } catch (Exception e) {
 logger.error("Failed to delete story {}", storyId, e);
 }
 }

 /**
-* Index an author in OpenSearch
+* Index an author in Solr
 */
 public void indexAuthor(Author author) {
 try {
-openSearchService.indexAuthor(author);
+solrService.indexAuthor(author);
 } catch (Exception e) {
 logger.error("Failed to index author {}", author.getId(), e);
 }
 }

 /**
-* Update an author in OpenSearch
+* Update an author in Solr
 */
 public void updateAuthor(Author author) {
 try {
-openSearchService.updateAuthor(author);
+solrService.updateAuthor(author);
 } catch (Exception e) {
 logger.error("Failed to update author {}", author.getId(), e);
 }
 }

 /**
-* Delete an author from OpenSearch
+* Delete an author from Solr
 */
 public void deleteAuthor(UUID authorId) {
 try {
-openSearchService.deleteAuthor(authorId);
+solrService.deleteAuthor(authorId);
 } catch (Exception e) {
 logger.error("Failed to delete author {}", authorId, e);
 }
 }

 /**
-* Bulk index stories in OpenSearch
+* Bulk index stories in Solr
 */
 public void bulkIndexStories(List<Story> stories) {
 try {
-openSearchService.bulkIndexStories(stories);
+solrService.bulkIndexStories(stories);
 } catch (Exception e) {
 logger.error("Failed to bulk index {} stories", stories.size(), e);
 }
 }

 /**
-* Bulk index authors in OpenSearch
+* Bulk index authors in Solr
 */
 public void bulkIndexAuthors(List<Author> authors) {
 try {
-openSearchService.bulkIndexAuthors(authors);
+solrService.bulkIndexAuthors(authors);
 } catch (Exception e) {
 logger.error("Failed to bulk index {} authors", authors.size(), e);
 }
@@ -210,14 +219,14 @@ public class SearchServiceAdapter {
 * Check if search service is available and healthy
 */
 public boolean isSearchServiceAvailable() {
-return openSearchService.testConnection();
+return solrService.testConnection();
 }

 /**
 * Get current search engine name
 */
 public String getCurrentSearchEngine() {
-return "opensearch";
+return "solr";
 }

 /**
@@ -228,10 +237,10 @@ public class SearchServiceAdapter {
 }

 /**
-* Check if we can switch to OpenSearch
+* Check if we can switch to Solr
 */
-public boolean canSwitchToOpenSearch() {
-return true; // Already using OpenSearch
+public boolean canSwitchToSolr() {
+return true; // Already using Solr
 }

 /**
@@ -246,10 +255,10 @@ public class SearchServiceAdapter {
 */
 public SearchStatus getSearchStatus() {
 return new SearchStatus(
-"opensearch",
+"solr",
 false, // no dual-write
 false, // no typesense
-openSearchService.testConnection()
+solrService.testConnection()
 );
 }

@@ -260,19 +269,19 @@ public class SearchServiceAdapter {
 private final String primaryEngine;
 private final boolean dualWrite;
 private final boolean typesenseAvailable;
-private final boolean openSearchAvailable;
+private final boolean solrAvailable;

 public SearchStatus(String primaryEngine, boolean dualWrite,
-boolean typesenseAvailable, boolean openSearchAvailable) {
+boolean typesenseAvailable, boolean solrAvailable) {
 this.primaryEngine = primaryEngine;
 this.dualWrite = dualWrite;
 this.typesenseAvailable = typesenseAvailable;
-this.openSearchAvailable = openSearchAvailable;
+this.solrAvailable = solrAvailable;
 }

 public String getPrimaryEngine() { return primaryEngine; }
 public boolean isDualWrite() { return dualWrite; }
 public boolean isTypesenseAvailable() { return typesenseAvailable; }
-public boolean isOpenSearchAvailable() { return openSearchAvailable; }
+public boolean isSolrAvailable() { return solrAvailable; }
 }
 }

backend/src/main/java/com/storycove/service/SolrService.java (new file, 1115 lines)
File diff suppressed because it is too large
@@ -342,15 +342,15 @@ public class StoryService {
 }

 Story savedStory = storyRepository.save(story);

 // Handle tags
 if (story.getTags() != null && !story.getTags().isEmpty()) {
 updateStoryTags(savedStory, story.getTags());
 }

 // Index in search engine
 searchServiceAdapter.indexStory(savedStory);

 return savedStory;
 }

@@ -370,15 +370,15 @@ public class StoryService {
 }

 Story savedStory = storyRepository.save(story);

 // Handle tags by names
 if (tagNames != null && !tagNames.isEmpty()) {
 updateStoryTagsByNames(savedStory, tagNames);
 }

 // Index in search engine
 searchServiceAdapter.indexStory(savedStory);

 return savedStory;
 }

@@ -422,6 +422,18 @@ public class StoryService {
 return updatedStory;
 }

+public Story updateContentOnly(UUID id, String contentHtml) {
+Story existingStory = findById(id);
+existingStory.setContentHtml(contentHtml);
+
+Story updatedStory = storyRepository.save(existingStory);
+
+// Update in search engine since content changed
+searchServiceAdapter.updateStory(updatedStory);
+
+return updatedStory;
+}
+
 public void delete(UUID id) {
 Story story = findById(id);

@@ -4,6 +4,11 @@ spring:
    username: ${SPRING_DATASOURCE_USERNAME:storycove}
    password: ${SPRING_DATASOURCE_PASSWORD:password}
    driver-class-name: org.postgresql.Driver
+    hikari:
+      connection-timeout: 60000 # 60 seconds
+      idle-timeout: 300000 # 5 minutes
+      max-lifetime: 1800000 # 30 minutes
+      maximum-pool-size: 20

  jpa:
    hibernate:
@@ -16,8 +21,8 @@ spring:

  servlet:
    multipart:
-      max-file-size: 256MB # Increased for backup restore
-      max-request-size: 260MB # Slightly higher to account for form data
+      max-file-size: 600MB # Increased for large backup restore (425MB+)
+      max-request-size: 610MB # Slightly higher to account for form data

  jackson:
    serialization:
@@ -27,6 +32,8 @@ spring:

server:
  port: 8080
+  tomcat:
+    max-http-request-size: 650MB # Tomcat HTTP request size limit (separate from multipart)

storycove:
  app:
@@ -39,54 +46,46 @@ storycove:
  auth:
    password: ${APP_PASSWORD} # REQUIRED: No default password for security
  search:
-    engine: opensearch # OpenSearch is the only search engine
-  opensearch:
+    engine: solr # Apache Solr search engine
+  solr:
    # Connection settings
-    host: ${OPENSEARCH_HOST:localhost}
-    port: ${OPENSEARCH_PORT:9200}
-    scheme: ${OPENSEARCH_SCHEME:http}
-    username: ${OPENSEARCH_USERNAME:}
-    password: ${OPENSEARCH_PASSWORD:} # Empty when security is disabled
+    url: ${SOLR_URL:http://solr:8983/solr}
+    username: ${SOLR_USERNAME:}
+    password: ${SOLR_PASSWORD:}

-    # Environment-specific configuration
-    profile: ${SPRING_PROFILES_ACTIVE:development} # development, staging, production
+    # Core configuration
+    cores:
+      stories: ${SOLR_STORIES_CORE:storycove_stories}
+      authors: ${SOLR_AUTHORS_CORE:storycove_authors}

-    # Security settings
-    security:
-      ssl-verification: ${OPENSEARCH_SSL_VERIFICATION:false}
-      trust-all-certificates: ${OPENSEARCH_TRUST_ALL_CERTS:true}
-      keystore-path: ${OPENSEARCH_KEYSTORE_PATH:}
-      keystore-password: ${OPENSEARCH_KEYSTORE_PASSWORD:}
-      truststore-path: ${OPENSEARCH_TRUSTSTORE_PATH:}
-      truststore-password: ${OPENSEARCH_TRUSTSTORE_PASSWORD:}
+    # Connection settings

-    # Connection pool settings
    connection:
-      timeout: ${OPENSEARCH_CONNECTION_TIMEOUT:30000} # 30 seconds
-      socket-timeout: ${OPENSEARCH_SOCKET_TIMEOUT:60000} # 60 seconds
-      max-connections-per-route: ${OPENSEARCH_MAX_CONN_PER_ROUTE:10}
-      max-connections-total: ${OPENSEARCH_MAX_CONN_TOTAL:30}
-      retry-on-failure: ${OPENSEARCH_RETRY_ON_FAILURE:true}
-      max-retries: ${OPENSEARCH_MAX_RETRIES:3}
+      timeout: ${SOLR_CONNECTION_TIMEOUT:30000} # 30 seconds
+      socket-timeout: ${SOLR_SOCKET_TIMEOUT:60000} # 60 seconds
+      max-connections-per-route: ${SOLR_MAX_CONN_PER_ROUTE:10}
+      max-connections-total: ${SOLR_MAX_CONN_TOTAL:30}
+      retry-on-failure: ${SOLR_RETRY_ON_FAILURE:true}
+      max-retries: ${SOLR_MAX_RETRIES:3}

-    # Index settings
-    indices:
-      default-shards: ${OPENSEARCH_DEFAULT_SHARDS:1}
-      default-replicas: ${OPENSEARCH_DEFAULT_REPLICAS:0}
-      refresh-interval: ${OPENSEARCH_REFRESH_INTERVAL:1s}
+    # Query settings
+    query:
+      default-rows: ${SOLR_DEFAULT_ROWS:10}
+      max-rows: ${SOLR_MAX_ROWS:1000}
+      default-operator: ${SOLR_DEFAULT_OPERATOR:AND}
+      highlight: ${SOLR_ENABLE_HIGHLIGHT:true}
+      facets: ${SOLR_ENABLE_FACETS:true}

-    # Bulk operations
-    bulk:
-      actions: ${OPENSEARCH_BULK_ACTIONS:1000}
-      size: ${OPENSEARCH_BULK_SIZE:5242880} # 5MB
-      timeout: ${OPENSEARCH_BULK_TIMEOUT:10000} # 10 seconds
-      concurrent-requests: ${OPENSEARCH_BULK_CONCURRENT:1}
+    # Commit settings
+    commit:
+      soft-commit: ${SOLR_SOFT_COMMIT:true}
+      commit-within: ${SOLR_COMMIT_WITHIN:1000} # 1 second
+      wait-searcher: ${SOLR_WAIT_SEARCHER:false}

    # Health and monitoring
    health:
-      check-interval: ${OPENSEARCH_HEALTH_CHECK_INTERVAL:30000} # 30 seconds
-      slow-query-threshold: ${OPENSEARCH_SLOW_QUERY_THRESHOLD:5000} # 5 seconds
-      enable-metrics: ${OPENSEARCH_ENABLE_METRICS:true}
+      check-interval: ${SOLR_HEALTH_CHECK_INTERVAL:30000} # 30 seconds
+      slow-query-threshold: ${SOLR_SLOW_QUERY_THRESHOLD:5000} # 5 seconds
+      enable-metrics: ${SOLR_ENABLE_METRICS:true}
  images:
    storage-path: ${IMAGE_STORAGE_PATH:/app/images}

@@ -100,8 +99,8 @@ management:
      show-details: when-authorized
      show-components: always
  health:
-    opensearch:
-      enabled: ${OPENSEARCH_HEALTH_ENABLED:true}
+    solr:
+      enabled: ${SOLR_HEALTH_ENABLED:true}

logging:
  level:
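The `storycove.solr.*` keys introduced above are the kind of settings that are typically bound to a typed properties class in Spring Boot. This branch may bind them differently; the following is only an illustrative sketch of such a binding for the connection and core settings, with a hypothetical class name and package.

```java
// Hedged sketch: typed binding for storycove.solr.* (name/package are assumptions).
package com.storycove.config;

import org.springframework.boot.context.properties.ConfigurationProperties;

@ConfigurationProperties(prefix = "storycove.solr")
public class SolrProperties {
    private String url;       // storycove.solr.url
    private String username;  // storycove.solr.username
    private String password;  // storycove.solr.password
    private final Cores cores = new Cores();

    public String getUrl() { return url; }
    public void setUrl(String url) { this.url = url; }
    public String getUsername() { return username; }
    public void setUsername(String username) { this.username = username; }
    public String getPassword() { return password; }
    public void setPassword(String password) { this.password = password; }
    public Cores getCores() { return cores; }

    public static class Cores {
        private String stories = "storycove_stories"; // storycove.solr.cores.stories
        private String authors = "storycove_authors"; // storycove.solr.cores.authors

        public String getStories() { return stories; }
        public void setStories(String stories) { this.stories = stories; }
        public String getAuthors() { return authors; }
        public void setAuthors(String authors) { this.authors = authors; }
    }
}
```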
@@ -1,178 +0,0 @@
-# OpenSearch Configuration - Best Practices Implementation
-
-## Overview
-
-This directory contains a production-ready OpenSearch configuration following industry best practices for security, scalability, and maintainability.
-
-## Architecture
-
-### 📁 Directory Structure
-```
-opensearch/
-├── config/
-│   ├── opensearch-development.yml   # Development-specific settings
-│   └── opensearch-production.yml    # Production-specific settings
-├── mappings/
-│   ├── stories-mapping.json         # Story index mapping
-│   ├── authors-mapping.json         # Author index mapping
-│   └── collections-mapping.json     # Collection index mapping
-├── templates/
-│   ├── stories-template.json        # Index template for stories_*
-│   └── index-lifecycle-policy.json  # ILM policy for index management
-└── README.md                        # This file
-```
-
-## ✅ Best Practices Implemented
-
-### 🔒 **Security**
-- **Environment-Aware SSL Configuration**
-  - Production: Full certificate validation with custom truststore support
-  - Development: Optional certificate validation for local development
-- **Proper Authentication**: Basic auth with secure credential management
-- **Connection Security**: TLS 1.3 support with hostname verification
-
-### 🏗️ **Configuration Management**
-- **Externalized Configuration**: JSON/YAML files instead of hardcoded values
-- **Environment-Specific Settings**: Different configs for dev/staging/prod
-- **Type-Safe Properties**: Strongly-typed configuration classes
-- **Validation**: Configuration validation at startup
-
-### 📈 **Scalability & Performance**
-- **Connection Pooling**: Configurable connection pool with timeout management
-- **Environment-Aware Sharding**:
-  - Development: 1 shard, 0 replicas (single node)
-  - Production: 3 shards, 1 replica (high availability)
-- **Bulk Operations**: Optimized bulk indexing with configurable batch sizes
-- **Index Templates**: Automatic application of settings to new indexes
-
-### 🔄 **Index Lifecycle Management**
-- **Automated Index Rollover**: Based on size, document count, and age
-- **Hot-Warm-Cold Architecture**: Optimized storage costs
-- **Retention Policies**: Automatic cleanup of old data
-- **Force Merge**: Optimization in warm phase
-
-### 📊 **Monitoring & Observability**
-- **Health Checks**: Automatic cluster health monitoring
-- **Spring Boot Actuator**: Health endpoints for monitoring systems
-- **Metrics Collection**: Configurable performance metrics
-- **Slow Query Detection**: Configurable thresholds for query performance
-
-### 🛡️ **Error Handling & Resilience**
-- **Connection Retry Logic**: Automatic retry with backoff
-- **Circuit Breaker Pattern**: Fail-fast for unhealthy clusters
-- **Graceful Degradation**: Graceful handling when OpenSearch unavailable
-- **Detailed Error Logging**: Comprehensive error tracking
-
-## 🚀 Usage
-
-### Development Environment
-```yaml
-# application-development.yml
-storycove:
-  opensearch:
-    profile: development
-    security:
-      ssl-verification: false
-      trust-all-certificates: true
-    indices:
-      default-shards: 1
-      default-replicas: 0
-```
-
-### Production Environment
-```yaml
-# application-production.yml
-storycove:
-  opensearch:
-    profile: production
-    security:
-      ssl-verification: true
-      trust-all-certificates: false
-      truststore-path: /etc/ssl/opensearch-truststore.jks
-    indices:
-      default-shards: 3
-      default-replicas: 1
-```
-
-## 📋 Environment Variables
-
-### Required
-- `OPENSEARCH_PASSWORD`: Admin password for OpenSearch cluster
-
-### Optional (with sensible defaults)
-- `OPENSEARCH_HOST`: Cluster hostname (default: localhost)
-- `OPENSEARCH_PORT`: Cluster port (default: 9200)
-- `OPENSEARCH_USERNAME`: Admin username (default: admin)
-- `OPENSEARCH_SSL_VERIFICATION`: Enable SSL verification (default: false for dev)
-- `OPENSEARCH_MAX_CONN_TOTAL`: Max connections (default: 30 for dev, 200 for prod)
-
-## 🎯 Index Templates
-
-Index templates automatically apply configuration to new indexes:
-
-```json
-{
-  "index_patterns": ["stories_*"],
-  "template": {
-    "settings": {
-      "number_of_shards": "#{ENV_SPECIFIC}",
-      "analysis": {
-        "analyzer": {
-          "story_analyzer": {
-            "type": "standard",
-            "stopwords": "_english_"
-          }
-        }
-      }
-    }
-  }
-}
-```
-
-## 🔍 Health Monitoring
-
-Access health information:
-- **Application Health**: `/actuator/health`
-- **OpenSearch Specific**: `/actuator/health/opensearch`
-- **Detailed Metrics**: Available when `enable-metrics: true`
-
-## 🔄 Deployment Strategy
-
-Recommended deployment approach:
-
-1. **Development**: Test OpenSearch configuration locally
-2. **Staging**: Validate performance and accuracy in staging environment
-3. **Production**: Deploy with proper monitoring and backup procedures
-
-## 🛠️ Troubleshooting
-
-### Common Issues
-
-1. **SSL Certificate Errors**
-   - Development: Set `trust-all-certificates: true`
-   - Production: Provide valid truststore path
-
-2. **Connection Timeouts**
-   - Increase `connection.timeout` values
-   - Check network connectivity and firewall rules
-
-3. **Index Creation Failures**
-   - Verify cluster health with `/actuator/health/opensearch`
-   - Check OpenSearch logs for detailed error messages
-
-4. **Performance Issues**
-   - Monitor slow queries with configurable thresholds
-   - Adjust bulk operation settings
-   - Review shard allocation and replica settings
-
-## 🔮 Future Enhancements
-
-- **Multi-Cluster Support**: Connect to multiple OpenSearch clusters
-- **Advanced Security**: Integration with OpenSearch Security plugin
-- **Custom Analyzers**: Domain-specific text analysis
-- **Index Aliases**: Zero-downtime index updates
-- **Machine Learning**: Integration with OpenSearch ML features
-
----
-
-This configuration provides a solid foundation that scales from development to enterprise production environments while maintaining security, performance, and operational excellence.
@@ -1,32 +0,0 @@
-# OpenSearch Development Configuration
-opensearch:
-  cluster:
-    name: "storycove-dev"
-    initial_master_nodes: ["opensearch-node"]
-
-  # Development settings - single node, minimal resources
-  indices:
-    default_settings:
-      number_of_shards: 1
-      number_of_replicas: 0
-      refresh_interval: "1s"
-
-  # Security settings for development
-  security:
-    ssl_verification: false
-    trust_all_certificates: true
-
-  # Connection settings
-  connection:
-    timeout: "30s"
-    socket_timeout: "60s"
-    max_connections_per_route: 10
-    max_connections_total: 30
-
-  # Index management
-  index_management:
-    auto_create_templates: true
-    template_patterns:
-      stories: "stories_*"
-      authors: "authors_*"
-      collections: "collections_*"
@@ -1,60 +0,0 @@
-# OpenSearch Production Configuration
-opensearch:
-  cluster:
-    name: "storycove-prod"
-
-  # Production settings - multi-shard, with replicas
-  indices:
-    default_settings:
-      number_of_shards: 3
-      number_of_replicas: 1
-      refresh_interval: "30s"
-      max_result_window: 50000
-
-    # Index lifecycle policies
-    lifecycle:
-      hot_phase_duration: "7d"
-      warm_phase_duration: "30d"
-      cold_phase_duration: "90d"
-      delete_after: "1y"
-
-  # Security settings for production
-  security:
-    ssl_verification: true
-    trust_all_certificates: false
-    certificate_verification: true
-    tls_version: "TLSv1.3"
-
-  # Connection settings
-  connection:
-    timeout: "10s"
-    socket_timeout: "30s"
-    max_connections_per_route: 50
-    max_connections_total: 200
-    retry_on_failure: true
-    max_retries: 3
-    retry_delay: "1s"
-
-  # Performance tuning
-  performance:
-    bulk_actions: 1000
-    bulk_size: "5MB"
-    bulk_timeout: "10s"
-    concurrent_requests: 4
-
-  # Monitoring and observability
-  monitoring:
-    health_check_interval: "30s"
-    slow_query_threshold: "5s"
-    enable_metrics: true
-
-  # Index management
-  index_management:
-    auto_create_templates: true
-    template_patterns:
-      stories: "stories_*"
-      authors: "authors_*"
-      collections: "collections_*"
-    retention_policy:
-      enabled: true
-      default_retention: "1y"
@@ -1,79 +0,0 @@
-{
-  "settings": {
-    "number_of_shards": 1,
-    "number_of_replicas": 0,
-    "analysis": {
-      "analyzer": {
-        "name_analyzer": {
-          "type": "standard",
-          "stopwords": "_english_"
-        },
-        "autocomplete_analyzer": {
-          "type": "custom",
-          "tokenizer": "standard",
-          "filter": ["lowercase", "edge_ngram"]
-        }
-      },
-      "filter": {
-        "edge_ngram": {
-          "type": "edge_ngram",
-          "min_gram": 2,
-          "max_gram": 20
-        }
-      }
-    }
-  },
-  "mappings": {
-    "properties": {
-      "id": {
-        "type": "keyword"
-      },
-      "name": {
-        "type": "text",
-        "analyzer": "name_analyzer",
-        "fields": {
-          "autocomplete": {
-            "type": "text",
-            "analyzer": "autocomplete_analyzer"
-          },
-          "keyword": {
-            "type": "keyword"
-          }
-        }
-      },
-      "bio": {
-        "type": "text",
-        "analyzer": "name_analyzer"
-      },
-      "urls": {
-        "type": "keyword"
-      },
-      "imageUrl": {
-        "type": "keyword"
-      },
-      "storyCount": {
-        "type": "integer"
-      },
-      "averageRating": {
-        "type": "float"
-      },
-      "totalWordCount": {
-        "type": "long"
-      },
-      "totalReadingTime": {
-        "type": "integer"
-      },
-      "createdAt": {
-        "type": "date",
-        "format": "strict_date_optional_time||epoch_millis"
-      },
-      "updatedAt": {
-        "type": "date",
-        "format": "strict_date_optional_time||epoch_millis"
-      },
-      "libraryId": {
-        "type": "keyword"
-      }
-    }
-  }
-}
@@ -1,73 +0,0 @@
-{
-  "settings": {
-    "number_of_shards": 1,
-    "number_of_replicas": 0,
-    "analysis": {
-      "analyzer": {
-        "collection_analyzer": {
-          "type": "standard",
-          "stopwords": "_english_"
-        },
-        "autocomplete_analyzer": {
-          "type": "custom",
-          "tokenizer": "standard",
-          "filter": ["lowercase", "edge_ngram"]
-        }
-      },
-      "filter": {
-        "edge_ngram": {
-          "type": "edge_ngram",
-          "min_gram": 2,
-          "max_gram": 20
-        }
-      }
-    }
-  },
-  "mappings": {
-    "properties": {
-      "id": {
-        "type": "keyword"
-      },
-      "name": {
-        "type": "text",
-        "analyzer": "collection_analyzer",
-        "fields": {
-          "autocomplete": {
-            "type": "text",
-            "analyzer": "autocomplete_analyzer"
-          },
-          "keyword": {
-            "type": "keyword"
-          }
-        }
-      },
-      "description": {
-        "type": "text",
-        "analyzer": "collection_analyzer"
-      },
-      "storyCount": {
-        "type": "integer"
-      },
-      "totalWordCount": {
-        "type": "long"
-      },
-      "averageRating": {
-        "type": "float"
-      },
-      "isPublic": {
-        "type": "boolean"
-      },
-      "createdAt": {
-        "type": "date",
-        "format": "strict_date_optional_time||epoch_millis"
-      },
-      "updatedAt": {
-        "type": "date",
-        "format": "strict_date_optional_time||epoch_millis"
-      },
-      "libraryId": {
-        "type": "keyword"
-      }
-    }
-  }
-}
@@ -1,120 +0,0 @@
-{
-  "settings": {
-    "number_of_shards": 1,
-    "number_of_replicas": 0,
-    "analysis": {
-      "analyzer": {
-        "story_analyzer": {
-          "type": "standard",
-          "stopwords": "_english_"
-        },
-        "autocomplete_analyzer": {
-          "type": "custom",
-          "tokenizer": "standard",
-          "filter": ["lowercase", "edge_ngram"]
-        }
-      },
-      "filter": {
-        "edge_ngram": {
-          "type": "edge_ngram",
-          "min_gram": 2,
-          "max_gram": 20
-        }
-      }
-    }
-  },
-  "mappings": {
-    "properties": {
-      "id": {
-        "type": "keyword"
-      },
-      "title": {
-        "type": "text",
-        "analyzer": "story_analyzer",
-        "fields": {
-          "autocomplete": {
-            "type": "text",
-            "analyzer": "autocomplete_analyzer"
-          },
-          "keyword": {
-            "type": "keyword"
-          }
-        }
-      },
-      "content": {
-        "type": "text",
-        "analyzer": "story_analyzer"
-      },
-      "summary": {
-        "type": "text",
-        "analyzer": "story_analyzer"
-      },
-      "authorNames": {
-        "type": "text",
-        "analyzer": "story_analyzer",
-        "fields": {
-          "keyword": {
-            "type": "keyword"
-          }
-        }
-      },
-      "authorIds": {
-        "type": "keyword"
-      },
-      "tagNames": {
-        "type": "keyword"
-      },
-      "seriesTitle": {
-        "type": "text",
-        "analyzer": "story_analyzer",
-        "fields": {
-          "keyword": {
-            "type": "keyword"
-          }
-        }
-      },
-      "seriesId": {
-        "type": "keyword"
-      },
-      "wordCount": {
-        "type": "integer"
-      },
-      "rating": {
-        "type": "float"
-      },
-      "readingTime": {
-        "type": "integer"
-      },
-      "language": {
-        "type": "keyword"
-      },
-      "status": {
-        "type": "keyword"
-      },
-      "createdAt": {
-        "type": "date",
-        "format": "strict_date_optional_time||epoch_millis"
-      },
-      "updatedAt": {
-        "type": "date",
-        "format": "strict_date_optional_time||epoch_millis"
-      },
-      "publishedAt": {
-        "type": "date",
-        "format": "strict_date_optional_time||epoch_millis"
-      },
-      "isRead": {
-        "type": "boolean"
-      },
-      "isFavorite": {
-        "type": "boolean"
-      },
-      "readingProgress": {
-        "type": "float"
-      },
-      "libraryId": {
-        "type": "keyword"
-      }
-    }
-  }
-}
@@ -1,77 +0,0 @@
-{
-  "policy": {
-    "description": "StoryCove index lifecycle policy",
-    "default_state": "hot",
-    "states": [
-      {
-        "name": "hot",
-        "actions": [
-          {
-            "rollover": {
-              "min_size": "50gb",
-              "min_doc_count": 1000000,
-              "min_age": "7d"
-            }
-          }
-        ],
-        "transitions": [
-          {
-            "state_name": "warm",
-            "conditions": {
-              "min_age": "7d"
-            }
-          }
-        ]
-      },
-      {
-        "name": "warm",
-        "actions": [
-          {
-            "replica_count": {
-              "number_of_replicas": 0
-            }
-          },
-          {
-            "force_merge": {
-              "max_num_segments": 1
-            }
-          }
-        ],
-        "transitions": [
-          {
-            "state_name": "cold",
-            "conditions": {
-              "min_age": "30d"
-            }
-          }
-        ]
-      },
-      {
-        "name": "cold",
-        "actions": [],
-        "transitions": [
-          {
-            "state_name": "delete",
-            "conditions": {
-              "min_age": "365d"
-            }
-          }
-        ]
-      },
-      {
-        "name": "delete",
-        "actions": [
-          {
-            "delete": {}
-          }
-        ]
-      }
-    ],
-    "ism_template": [
-      {
-        "index_patterns": ["stories_*", "authors_*", "collections_*"],
-        "priority": 100
-      }
-    ]
-  }
-}
@@ -1,124 +0,0 @@
-{
-  "index_patterns": ["stories_*"],
-  "priority": 1,
-  "template": {
-    "settings": {
-      "number_of_shards": 1,
-      "number_of_replicas": 0,
-      "analysis": {
-        "analyzer": {
-          "story_analyzer": {
-            "type": "standard",
-            "stopwords": "_english_"
-          },
-          "autocomplete_analyzer": {
-            "type": "custom",
-            "tokenizer": "standard",
-            "filter": ["lowercase", "edge_ngram"]
-          }
-        },
-        "filter": {
-          "edge_ngram": {
-            "type": "edge_ngram",
-            "min_gram": 2,
-            "max_gram": 20
-          }
-        }
-      }
-    },
-    "mappings": {
-      "properties": {
-        "id": {
-          "type": "keyword"
-        },
-        "title": {
-          "type": "text",
-          "analyzer": "story_analyzer",
-          "fields": {
-            "autocomplete": {
-              "type": "text",
-              "analyzer": "autocomplete_analyzer"
-            },
-            "keyword": {
-              "type": "keyword"
-            }
-          }
-        },
-        "content": {
-          "type": "text",
-          "analyzer": "story_analyzer"
-        },
-        "summary": {
-          "type": "text",
-          "analyzer": "story_analyzer"
-        },
-        "authorNames": {
-          "type": "text",
-          "analyzer": "story_analyzer",
-          "fields": {
-            "keyword": {
-              "type": "keyword"
-            }
-          }
-        },
-        "authorIds": {
-          "type": "keyword"
-        },
-        "tagNames": {
-          "type": "keyword"
-        },
-        "seriesTitle": {
-          "type": "text",
-          "analyzer": "story_analyzer",
-          "fields": {
-            "keyword": {
-              "type": "keyword"
-            }
-          }
-        },
-        "seriesId": {
-          "type": "keyword"
-        },
-        "wordCount": {
-          "type": "integer"
-        },
-        "rating": {
-          "type": "float"
-        },
-        "readingTime": {
-          "type": "integer"
-        },
-        "language": {
-          "type": "keyword"
-        },
-        "status": {
-          "type": "keyword"
-        },
-        "createdAt": {
-          "type": "date",
-          "format": "strict_date_optional_time||epoch_millis"
-        },
-        "updatedAt": {
-          "type": "date",
-          "format": "strict_date_optional_time||epoch_millis"
-        },
-        "publishedAt": {
-          "type": "date",
-          "format": "strict_date_optional_time||epoch_millis"
-        },
-        "isRead": {
-          "type": "boolean"
-        },
-        "isFavorite": {
-          "type": "boolean"
-        },
-        "readingProgress": {
-          "type": "float"
-        },
-        "libraryId": {
-          "type": "keyword"
-        }
-      }
-    }
-  }
-}
@@ -19,11 +19,14 @@ storycove:
  auth:
    password: test-password
  search:
-    engine: opensearch
-  opensearch:
+    engine: solr
+  solr:
    host: localhost
-    port: 9200
+    port: 8983
    scheme: http
+    cores:
+      stories: storycove_stories
+      authors: storycove_authors
  images:
    storage-path: /tmp/test-images

@@ -34,10 +34,10 @@ services:
      - SPRING_DATASOURCE_USERNAME=storycove
      - SPRING_DATASOURCE_PASSWORD=${DB_PASSWORD}
      - JWT_SECRET=${JWT_SECRET}
-      - OPENSEARCH_HOST=opensearch
-      - OPENSEARCH_PORT=9200
-      - OPENSEARCH_SCHEME=http
-      - SEARCH_ENGINE=${SEARCH_ENGINE:-opensearch}
+      - SOLR_HOST=solr
+      - SOLR_PORT=8983
+      - SOLR_SCHEME=http
+      - SEARCH_ENGINE=${SEARCH_ENGINE:-solr}
      - IMAGE_STORAGE_PATH=/app/images
      - APP_PASSWORD=${APP_PASSWORD}
      - STORYCOVE_CORS_ALLOWED_ORIGINS=${STORYCOVE_CORS_ALLOWED_ORIGINS:-http://localhost:3000,http://localhost:6925}
@@ -45,8 +45,10 @@ services:
      - images_data:/app/images
      - library_config:/app/config
    depends_on:
-      - postgres
-      - opensearch
+      postgres:
+        condition: service_started
+      solr:
+        condition: service_started
    networks:
      - storycove-network

@@ -65,45 +67,38 @@ services:
      - storycove-network


-  opensearch:
-    image: opensearchproject/opensearch:3.2.0
-    # No port mapping - only accessible within the Docker network
+  solr:
+    build:
+      context: .
+      dockerfile: solr.Dockerfile
+    ports:
+      - "8983:8983" # Expose Solr Admin UI for development
    environment:
-      - cluster.name=storycove-opensearch
-      - node.name=opensearch-node
-      - discovery.type=single-node
-      - bootstrap.memory_lock=false
-      - "OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx512m"
-      - "DISABLE_INSTALL_DEMO_CONFIG=true"
-      - "DISABLE_SECURITY_PLUGIN=true"
-    ulimits:
-      memlock:
-        soft: -1
-        hard: -1
-      nofile:
-        soft: 65536
-        hard: 65536
+      - SOLR_HEAP=512m
+      - SOLR_JAVA_MEM=-Xms256m -Xmx512m
    volumes:
-      - opensearch_data:/usr/share/opensearch/data
+      - solr_data:/var/solr
+    deploy:
+      resources:
+        limits:
+          memory: 1G
+        reservations:
+          memory: 512M
+    stop_grace_period: 30s
+    healthcheck:
+      test: ["CMD-SHELL", "curl -f http://localhost:8983/solr/admin/ping || exit 1"]
+      interval: 30s
+      timeout: 10s
+      retries: 5
+      start_period: 60s
    networks:
      - storycove-network
    restart: unless-stopped

-  opensearch-dashboards:
-    image: opensearchproject/opensearch-dashboards:3.2.0
-    ports:
-      - "5601:5601" # Expose OpenSearch Dashboard
-    environment:
-      - OPENSEARCH_HOSTS=http://opensearch:9200
-      - "DISABLE_SECURITY_DASHBOARDS_PLUGIN=true"
-    depends_on:
-      - opensearch
-    networks:
-      - storycove-network
-
volumes:
  postgres_data:
-  opensearch_data:
+  solr_data:
  images_data:
  library_config:

@@ -122,7 +117,7 @@ configs:
        }
        server {
            listen 80;
-            client_max_body_size 256M;
+            client_max_body_size 600M;
            location / {
                proxy_pass http://frontend;
                proxy_http_version 1.1;
@@ -140,9 +135,13 @@ configs:
                proxy_set_header X-Real-IP $$remote_addr;
                proxy_set_header X-Forwarded-For $$proxy_add_x_forwarded_for;
                proxy_set_header X-Forwarded-Proto $$scheme;
-                proxy_connect_timeout 60s;
-                proxy_send_timeout 60s;
-                proxy_read_timeout 60s;
+                proxy_connect_timeout 900s;
+                proxy_send_timeout 900s;
+                proxy_read_timeout 900s;
+                # Large upload settings
+                client_max_body_size 600M;
+                proxy_request_buffering off;
+                proxy_max_temp_file_size 0;
            }
            location /images/ {
                alias /app/images/;
@@ -20,12 +20,23 @@ COPY --from=deps /app/node_modules ./node_modules
 COPY . .

 # Set Node.js memory limit for build
-ENV NODE_OPTIONS="--max-old-space-size=1024"
+ENV NODE_OPTIONS="--max-old-space-size=2048"
 ENV NEXT_TELEMETRY_DISABLED=1

-# Build the application
+# List files to ensure everything is copied correctly
+RUN ls -la
+
+# Force clean build - remove any cached build artifacts
+RUN rm -rf .next || true
+
+# Build the application with verbose logging
 RUN npm run build
+
+# Verify the build output exists
+RUN ls -la .next/ || (echo ".next directory not found!" && exit 1)
+RUN ls -la .next/standalone/ || (echo ".next/standalone directory not found!" && cat build.log && exit 1)
+RUN ls -la .next/static/ || (echo ".next/static directory not found!" && exit 1)

 # Production stage
 FROM node:18-alpine AS runner
 WORKDIR /app
@@ -2,6 +2,7 @@
 const nextConfig = {
   // Enable standalone output for optimized Docker builds
   output: 'standalone',
+  // Note: Body size limits are handled by nginx and backend, not Next.js frontend
   // Removed Next.js rewrites since nginx handles all API routing
   webpack: (config, { isServer }) => {
     // Exclude cheerio and its dependencies from client-side bundling
frontend/package-lock.json (generated, new file, 6542 lines)
File diff suppressed because it is too large.
@@ -10,22 +10,23 @@
     "type-check": "tsc --noEmit"
   },
   "dependencies": {
     "@heroicons/react": "^2.2.0",
-    "@portabletext/editor": "2.12.0",
-    "@portabletext/keyboard-shortcuts": "^1.1.1",
-    "@portabletext/react": "4.0.3",
-    "@portabletext/types": "2.0.14",
-    "autoprefixer": "^10.4.16",
-    "axios": "^1.11.0",
-    "cheerio": "^1.0.0-rc.12",
-    "dompurify": "^3.2.6",
-    "next": "^14.2.32",
-    "postcss": "^8.4.31",
-    "react": "^18",
-    "react-dom": "^18",
-    "react-dropzone": "^14.2.3",
-    "server-only": "^0.0.1",
-    "tailwindcss": "^3.3.0"
+    "autoprefixer": "^10.4.16",
+    "axios": "^1.7.7",
+    "cheerio": "^1.0.0-rc.12",
+    "dompurify": "^3.2.6",
+    "next": "^14.2.32",
+    "postcss": "^8.4.31",
+    "react": "^18",
+    "react-dom": "^18",
+    "react-dropzone": "^14.2.3",
+    "rxjs": "^7.8.1",
+    "server-only": "^0.0.1",
+    "slate": "^0.118.1",
+    "slate-react": "^0.117.4",
+    "slate-history": "^0.113.1",
+    "slate-dom": "^0.117.0",
+    "tailwindcss": "^3.3.0"
   },
   "devDependencies": {
     "@types/dompurify": "^3.0.5",
@@ -36,4 +37,4 @@
     "eslint-config-next": "14.0.0",
     "typescript": "^5"
   }
 }
@@ -6,7 +6,7 @@ import { useAuth } from '../../contexts/AuthContext';
 import { Input, Textarea } from '../../components/ui/Input';
 import Button from '../../components/ui/Button';
 import TagInput from '../../components/stories/TagInput';
-import PortableTextEditor from '../../components/stories/PortableTextEditorNew';
+import SlateEditor from '../../components/stories/SlateEditor';
 import ImageUpload from '../../components/ui/ImageUpload';
 import AuthorSelector from '../../components/stories/AuthorSelector';
 import SeriesSelector from '../../components/stories/SeriesSelector';
@@ -451,7 +451,7 @@ export default function AddStoryContent() {
           <label className="block text-sm font-medium theme-header mb-2">
             Story Content *
           </label>
-          <PortableTextEditor
+          <SlateEditor
             value={formData.contentHtml}
             onChange={handleContentChange}
             placeholder="Write or paste your story content here..."
@@ -35,30 +35,31 @@ export default function AuthorsPage() {
      } else {
        setSearchLoading(true);
      }
-      const searchResults = await authorApi.getAuthors({
+
+      // Use Solr search for all queries (including empty search)
+      const searchResults = await authorApi.searchAuthors({
+        query: searchQuery || '*', // Use '*' for all authors when no search query
        page: currentPage,
        size: ITEMS_PER_PAGE,
        sortBy: sortBy,
        sortDir: sortOrder
      });

      if (currentPage === 0) {
        // First page - replace all results
-        setAuthors(searchResults.content || []);
-        setFilteredAuthors(searchResults.content || []);
+        setAuthors(searchResults.results || []);
+        setFilteredAuthors(searchResults.results || []);
      } else {
        // Subsequent pages - append results
-        setAuthors(prev => [...prev, ...(searchResults.content || [])]);
-        setFilteredAuthors(prev => [...prev, ...(searchResults.content || [])]);
+        setAuthors(prev => [...prev, ...(searchResults.results || [])]);
+        setFilteredAuthors(prev => [...prev, ...(searchResults.results || [])]);
      }

-      setTotalHits(searchResults.totalElements || 0);
-      setHasMore(searchResults.content.length === ITEMS_PER_PAGE && (currentPage + 1) * ITEMS_PER_PAGE < (searchResults.totalElements || 0));
+      setTotalHits(searchResults.totalHits || 0);
+      setHasMore((searchResults.results || []).length === ITEMS_PER_PAGE && (currentPage + 1) * ITEMS_PER_PAGE < (searchResults.totalHits || 0));

    } catch (error) {
-      console.error('Failed to load authors:', error);
-      // Error handling for API failures
-      console.error('Failed to load authors:', error);
+      console.error('Failed to search authors:', error);
    } finally {
      setLoading(false);
      setSearchLoading(false);
@@ -84,17 +85,7 @@ export default function AuthorsPage() {
    }
  };

-  // Client-side filtering for search query when using regular API
-  useEffect(() => {
-    if (searchQuery) {
-      const filtered = authors.filter(author =>
-        author.name.toLowerCase().includes(searchQuery.toLowerCase())
-      );
-      setFilteredAuthors(filtered);
-    } else {
-      setFilteredAuthors(authors);
-    }
-  }, [authors, searchQuery]);
+  // No longer needed - Solr search handles filtering directly

  // Note: We no longer have individual story ratings in the author list
  // Average rating would need to be calculated on backend if needed
@@ -117,9 +108,8 @@ export default function AuthorsPage() {
        <div>
          <h1 className="text-3xl font-bold theme-header">Authors</h1>
          <p className="theme-text mt-1">
-            {searchQuery ? `${filteredAuthors.length} of ${authors.length}` : filteredAuthors.length} {(searchQuery ? authors.length : filteredAuthors.length) === 1 ? 'author' : 'authors'}
-            {searchQuery ? ` found` : ` in your library`}
-            {!searchQuery && hasMore && ` (showing first ${filteredAuthors.length})`}
+            {searchQuery ? `${totalHits} authors found` : `${totalHits} authors in your library`}
+            {hasMore && ` (showing first ${filteredAuthors.length})`}
          </p>
        </div>

@@ -226,7 +216,7 @@ export default function AuthorsPage() {
            className="px-8 py-3"
            loading={loading}
          >
-            {loading ? 'Loading...' : `Load More Authors (${totalHits - authors.length} remaining)`}
+            {loading ? 'Loading...' : `Load More Authors (${totalHits - filteredAuthors.length} remaining)`}
          </Button>
        </div>
      )}
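The authors page now reads `results` and `totalHits` from the search response instead of the previous `content`/`totalElements` page fields. The backend DTO behind that response is not part of this compare view, so the sketch below only illustrates a shape that would satisfy those two fields; the class name, package, and generics are assumptions, not the project's actual class.

```java
// Hedged sketch: a response wrapper exposing the fields the frontend reads.
package com.storycove.dto;

import java.util.List;

public class AuthorSearchResponse<T> {
    private List<T> results;  // the page of author hits returned by the search
    private long totalHits;   // total number of matches reported by the engine

    public AuthorSearchResponse() { }

    public AuthorSearchResponse(List<T> results, long totalHits) {
        this.results = results;
        this.totalHits = totalHits;
    }

    public List<T> getResults() { return results; }
    public void setResults(List<T> results) { this.results = results; }
    public long getTotalHits() { return totalHits; }
    public void setTotalHits(long totalHits) { this.totalHits = totalHits; }
}
```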
@@ -139,6 +139,15 @@
   @apply max-w-full h-auto mx-auto my-6 rounded-lg shadow-sm;
   max-height: 80vh; /* Prevent images from being too tall */
   display: block;
+  /* Optimize for performance and prevent reloading */
+  will-change: auto;
+  transform: translateZ(0); /* Force hardware acceleration */
+  backface-visibility: hidden;
+  image-rendering: optimizeQuality;
+  /* Prevent layout shifts that might trigger reloads */
+  box-sizing: border-box;
+  /* Ensure stable dimensions */
+  min-height: 1px;
 }

 .reading-content img[align="left"] {
@@ -15,7 +15,7 @@ import MinimalLayout from '../../components/library/MinimalLayout';
 import { useLibraryLayout } from '../../hooks/useLibraryLayout';

 type ViewMode = 'grid' | 'list';
-type SortOption = 'createdAt' | 'title' | 'authorName' | 'rating' | 'wordCount' | 'lastRead';
+type SortOption = 'createdAt' | 'title' | 'authorName' | 'rating' | 'wordCount' | 'lastReadAt';

 export default function LibraryContent() {
   const router = useRouter();
@@ -29,7 +29,7 @@ export default function LibraryContent() {
   const [searchQuery, setSearchQuery] = useState('');
   const [selectedTags, setSelectedTags] = useState<string[]>([]);
   const [viewMode, setViewMode] = useState<ViewMode>('list');
-  const [sortOption, setSortOption] = useState<SortOption>('lastRead');
+  const [sortOption, setSortOption] = useState<SortOption>('lastReadAt');
   const [sortDirection, setSortDirection] = useState<'asc' | 'desc'>('desc');
   const [page, setPage] = useState(0);
   const [totalPages, setTotalPages] = useState(1);
@@ -69,11 +69,11 @@ export default function LibraryContent() {
   }, []);

   const convertFacetsToTags = (facets?: Record<string, FacetCount[]>): Tag[] => {
-    if (!facets || !facets.tagNames) {
+    if (!facets || !facets.tagNames_facet) {
       return [];
     }

-    return facets.tagNames.map(facet => {
+    return facets.tagNames_facet.map(facet => {
       // Find the full tag data by name
       const fullTag = fullTags.find(tag => tag.name.toLowerCase() === facet.value.toLowerCase());

@@ -493,11 +493,11 @@ async function processIndividualMode(

    console.log(`Bulk import completed: ${importedCount} imported, ${skippedCount} skipped, ${errorCount} errors`);

-    // Trigger OpenSearch reindex if any stories were imported
+    // Trigger Solr reindex if any stories were imported
    if (importedCount > 0) {
      try {
-        console.log('Triggering OpenSearch reindex after bulk import...');
-        const reindexUrl = `http://backend:8080/api/admin/search/opensearch/reindex`;
+        console.log('Triggering Solr reindex after bulk import...');
+        const reindexUrl = `http://backend:8080/api/admin/search/solr/reindex`;
        const reindexResponse = await fetch(reindexUrl, {
          method: 'POST',
          headers: {
@@ -508,12 +508,12 @@ async function processIndividualMode(

        if (reindexResponse.ok) {
          const reindexResult = await reindexResponse.json();
-          console.log('OpenSearch reindex completed:', reindexResult);
+          console.log('Solr reindex completed:', reindexResult);
        } else {
-          console.warn('OpenSearch reindex failed:', reindexResponse.status);
+          console.warn('Solr reindex failed:', reindexResponse.status);
        }
      } catch (error) {
-        console.warn('Failed to trigger OpenSearch reindex:', error);
+        console.warn('Failed to trigger Solr reindex:', error);
        // Don't fail the whole request if reindex fails
      }
    }
@@ -7,7 +7,7 @@ import { Input, Textarea } from '../../../../components/ui/Input';
 import Button from '../../../../components/ui/Button';
 import TagInput from '../../../../components/stories/TagInput';
 import TagSuggestions from '../../../../components/tags/TagSuggestions';
-import PortableTextEditor from '../../../../components/stories/PortableTextEditorNew';
+import SlateEditor from '../../../../components/stories/SlateEditor';
 import ImageUpload from '../../../../components/ui/ImageUpload';
 import AuthorSelector from '../../../../components/stories/AuthorSelector';
 import SeriesSelector from '../../../../components/stories/SeriesSelector';
@@ -337,7 +337,7 @@ export default function EditStoryPage() {
           <label className="block text-sm font-medium theme-header mb-2">
             Story Content *
           </label>
-          <PortableTextEditor
+          <SlateEditor
             value={formData.contentHtml}
             onChange={handleContentChange}
             placeholder="Edit your story content here..."
@@ -11,6 +11,7 @@ import StoryRating from '../../../components/stories/StoryRating';
|
|||||||
import TagDisplay from '../../../components/tags/TagDisplay';
|
import TagDisplay from '../../../components/tags/TagDisplay';
|
||||||
import TableOfContents from '../../../components/stories/TableOfContents';
|
import TableOfContents from '../../../components/stories/TableOfContents';
|
||||||
import { sanitizeHtml, preloadSanitizationConfig } from '../../../lib/sanitization';
|
import { sanitizeHtml, preloadSanitizationConfig } from '../../../lib/sanitization';
|
||||||
|
import { debug } from '../../../lib/debug';
|
||||||
|
|
||||||
// Memoized content component that only re-renders when content changes
|
// Memoized content component that only re-renders when content changes
|
||||||
const StoryContent = memo(({
|
const StoryContent = memo(({
|
||||||
@@ -20,13 +21,50 @@ const StoryContent = memo(({
|
|||||||
content: string;
|
content: string;
|
||||||
contentRef: React.RefObject<HTMLDivElement>;
|
contentRef: React.RefObject<HTMLDivElement>;
|
||||||
}) => {
|
}) => {
|
||||||
console.log('🔄 StoryContent component rendering with content length:', content.length);
|
const renderTime = Date.now();
|
||||||
|
debug.log('🔄 StoryContent component rendering at', renderTime, 'with content length:', content.length, 'hash:', content.slice(0, 50) + '...');
|
||||||
|
|
||||||
|
// Add observer to track image loading events
|
||||||
|
useEffect(() => {
|
||||||
|
if (!contentRef.current) return;
|
||||||
|
|
||||||
|
const images = contentRef.current.querySelectorAll('img');
|
||||||
|
debug.log('📸 Found', images.length, 'images in content');
|
||||||
|
|
||||||
|
const handleImageLoad = (e: Event) => {
|
||||||
|
const img = e.target as HTMLImageElement;
|
||||||
|
debug.log('🖼️ Image loaded:', img.src);
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleImageError = (e: Event) => {
|
||||||
|
const img = e.target as HTMLImageElement;
|
||||||
|
debug.log('❌ Image error:', img.src);
|
||||||
|
};
|
||||||
|
|
||||||
|
images.forEach(img => {
|
||||||
|
img.addEventListener('load', handleImageLoad);
|
||||||
|
img.addEventListener('error', handleImageError);
|
||||||
|
debug.log('👀 Monitoring image:', img.src);
|
||||||
|
});
|
||||||
|
|
||||||
|
return () => {
|
||||||
|
images.forEach(img => {
|
||||||
|
img.removeEventListener('load', handleImageLoad);
|
||||||
|
img.removeEventListener('error', handleImageError);
|
||||||
|
});
|
||||||
|
};
|
||||||
|
}, [content]);
|
||||||
|
|
||||||
return (
|
return (
|
||||||
<div
|
<div
|
||||||
ref={contentRef}
|
ref={contentRef}
|
||||||
className="reading-content"
|
className="reading-content"
|
||||||
dangerouslySetInnerHTML={{ __html: content }}
|
dangerouslySetInnerHTML={{ __html: content }}
|
||||||
|
style={{
|
||||||
|
// Prevent layout shifts that might cause image reloads
|
||||||
|
minHeight: '100vh',
|
||||||
|
contain: 'layout style'
|
||||||
|
}}
|
||||||
/>
|
/>
|
||||||
);
|
);
|
||||||
});
|
});
|
||||||
@@ -112,14 +150,14 @@ export default function StoryReadingPage() {
   // Debounced function to save reading position
   const saveReadingPosition = useCallback(async (position: number) => {
     if (!story || position === story.readingPosition) {
-      console.log('Skipping save - no story or position unchanged:', { story: !!story, position, current: story?.readingPosition });
+      debug.log('Skipping save - no story or position unchanged:', { story: !!story, position, current: story?.readingPosition });
       return;
     }
 
-    console.log('Saving reading position:', position, 'for story:', story.id);
+    debug.log('Saving reading position:', position, 'for story:', story.id);
     try {
       const updatedStory = await storyApi.updateReadingProgress(story.id, position);
-      console.log('Reading position saved successfully, updated story:', updatedStory.readingPosition);
+      debug.log('Reading position saved successfully, updated story:', updatedStory.readingPosition);
       setStory(prev => prev ? { ...prev, readingPosition: position, lastReadAt: updatedStory.lastReadAt } : null);
     } catch (error) {
       console.error('Failed to save reading position:', error);

@@ -200,12 +238,12 @@ export default function StoryReadingPage() {
     if (story && sanitizedContent && !hasScrolledToPosition) {
       // Use a small delay to ensure content is rendered
       const timeout = setTimeout(() => {
-        console.log('Initializing reading position tracking, saved position:', story.readingPosition);
+        debug.log('Initializing reading position tracking, saved position:', story.readingPosition);
 
         // Check if there's a hash in the URL (for TOC navigation)
         const hash = window.location.hash.substring(1);
         if (hash && hash.startsWith('heading-')) {
-          console.log('Auto-scrolling to heading from URL hash:', hash);
+          debug.log('Auto-scrolling to heading from URL hash:', hash);
           const element = document.getElementById(hash);
           if (element) {
             element.scrollIntoView({

@@ -219,13 +257,13 @@ export default function StoryReadingPage() {
 
         // Otherwise, use saved reading position
         if (story.readingPosition && story.readingPosition > 0) {
-          console.log('Auto-scrolling to saved position:', story.readingPosition);
+          debug.log('Auto-scrolling to saved position:', story.readingPosition);
           const initialPercentage = calculateReadingPercentage(story.readingPosition);
           setReadingPercentage(initialPercentage);
           scrollToCharacterPosition(story.readingPosition);
         } else {
           // Even if there's no saved position, mark as ready for tracking
-          console.log('No saved position, starting fresh tracking');
+          debug.log('No saved position, starting fresh tracking');
           setReadingPercentage(0);
           setHasScrolledToPosition(true);
         }

@@ -238,8 +276,14 @@ export default function StoryReadingPage() {
   // Track reading progress and save position
   useEffect(() => {
     let ticking = false;
+    let scrollEventCount = 0;
 
     const handleScroll = () => {
+      scrollEventCount++;
+      if (scrollEventCount % 10 === 0) {
+        debug.log('📜 Scroll event #', scrollEventCount, 'at', Date.now());
+      }
+
       if (!ticking) {
         requestAnimationFrame(() => {
           const article = document.querySelector('[data-reading-content]') as HTMLElement;

@@ -278,7 +322,7 @@ export default function StoryReadingPage() {
 
           // Trigger end detection if user is near bottom AND (has high progress OR story content is fully visible)
           if (nearBottom && (highProgress || storyContentFullyVisible) && !hasReachedEnd && hasScrolledToPosition) {
-            console.log('End of story detected:', { nearBottom, highProgress, storyContentFullyVisible, distanceFromBottom, progress });
+            debug.log('End of story detected:', { nearBottom, highProgress, storyContentFullyVisible, distanceFromBottom, progress });
             setHasReachedEnd(true);
             setShowEndOfStoryPopup(true);
           }

@@ -287,11 +331,11 @@ export default function StoryReadingPage() {
           if (hasScrolledToPosition) { // Only save after initial auto-scroll
             const characterPosition = getCharacterPositionFromScroll();
             const percentage = calculateReadingPercentage(characterPosition);
-            console.log('Scroll detected, character position:', characterPosition, 'percentage:', percentage);
+            debug.log('Scroll detected, character position:', characterPosition, 'percentage:', percentage);
             setReadingPercentage(percentage);
             debouncedSavePosition(characterPosition);
           } else {
-            console.log('Scroll detected but not ready for tracking yet');
+            debug.log('Scroll detected but not ready for tracking yet');
           }
         }
         ticking = false;
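The scroll handler above hands positions to `debouncedSavePosition`, whose definition sits outside these hunks. A minimal sketch of the kind of debounce helper that could back it is shown below; the 1500 ms delay and the `useMemo` wiring are assumptions for illustration, not taken from this changeset:

```typescript
// Sketch of a generic debounce helper that could back debouncedSavePosition.
// Rapid scroll events collapse into a single trailing call.
export function debounce<Args extends unknown[]>(
  fn: (...args: Args) => void,
  delayMs: number
): (...args: Args) => void {
  let timer: ReturnType<typeof setTimeout> | null = null;
  return (...args: Args) => {
    if (timer) clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delayMs);
  };
}

// Hypothetical wiring inside the component, using names from the hunks above:
// const debouncedSavePosition = useMemo(
//   () => debounce(saveReadingPosition, 1500), // delay is an assumption
//   [saveReadingPosition]
// );
```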
259	frontend/src/components/ImageProcessingProgress.tsx	Normal file
@@ -0,0 +1,259 @@
import React, { useState, useEffect } from 'react';
|
||||||
|
import { ImageProcessingProgressTracker, ImageProcessingProgress } from '../utils/imageProcessingProgress';
|
||||||
|
|
||||||
|
interface ImageProcessingProgressProps {
|
||||||
|
storyId: string;
|
||||||
|
autoStart?: boolean;
|
||||||
|
onComplete?: () => void;
|
||||||
|
onError?: (error: string) => void;
|
||||||
|
}
|
||||||
|
|
||||||
|
export const ImageProcessingProgressComponent: React.FC<ImageProcessingProgressProps> = ({
|
||||||
|
storyId,
|
||||||
|
autoStart = false,
|
||||||
|
onComplete,
|
||||||
|
onError
|
||||||
|
}) => {
|
||||||
|
const [progress, setProgress] = useState<ImageProcessingProgress | null>(null);
|
||||||
|
const [isTracking, setIsTracking] = useState(false);
|
||||||
|
const [tracker, setTracker] = useState<ImageProcessingProgressTracker | null>(null);
|
||||||
|
|
||||||
|
const startTracking = () => {
|
||||||
|
if (tracker) {
|
||||||
|
tracker.stop();
|
||||||
|
}
|
||||||
|
|
||||||
|
const newTracker = new ImageProcessingProgressTracker(storyId);
|
||||||
|
|
||||||
|
newTracker.onProgress((progress) => {
|
||||||
|
setProgress(progress);
|
||||||
|
});
|
||||||
|
|
||||||
|
newTracker.onComplete((finalProgress) => {
|
||||||
|
setProgress(finalProgress);
|
||||||
|
setIsTracking(false);
|
||||||
|
onComplete?.();
|
||||||
|
});
|
||||||
|
|
||||||
|
newTracker.onError((error) => {
|
||||||
|
console.error('Image processing error:', error);
|
||||||
|
setIsTracking(false);
|
||||||
|
onError?.(error);
|
||||||
|
});
|
||||||
|
|
||||||
|
setTracker(newTracker);
|
||||||
|
setIsTracking(true);
|
||||||
|
newTracker.start();
|
||||||
|
};
|
||||||
|
|
||||||
|
const stopTracking = () => {
|
||||||
|
if (tracker) {
|
||||||
|
tracker.stop();
|
||||||
|
setIsTracking(false);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
useEffect(() => {
|
||||||
|
if (autoStart) {
|
||||||
|
startTracking();
|
||||||
|
}
|
||||||
|
|
||||||
|
return () => {
|
||||||
|
if (tracker) {
|
||||||
|
tracker.stop();
|
||||||
|
}
|
||||||
|
};
|
||||||
|
}, [storyId, autoStart]);
|
||||||
|
|
||||||
|
if (!progress && !isTracking) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!progress?.isProcessing && !progress?.completed) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className="image-processing-progress">
|
||||||
|
<div className="progress-header">
|
||||||
|
<h4>Processing Images</h4>
|
||||||
|
{isTracking && (
|
||||||
|
<button onClick={stopTracking} className="btn btn-sm btn-secondary">
|
||||||
|
Cancel
|
||||||
|
</button>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{progress && (
|
||||||
|
<div className="progress-content">
|
||||||
|
{progress.error ? (
|
||||||
|
<div className="alert alert-danger">
|
||||||
|
<strong>Error:</strong> {progress.error}
|
||||||
|
</div>
|
||||||
|
) : progress.completed ? (
|
||||||
|
<div className="alert alert-success">
|
||||||
|
<strong>Completed:</strong> {progress.status}
|
||||||
|
</div>
|
||||||
|
) : (
|
||||||
|
<div className="progress-info">
|
||||||
|
<div className="status-text">
|
||||||
|
<strong>Status:</strong> {progress.status}
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="progress-stats">
|
||||||
|
Processing {progress.processedImages} of {progress.totalImages} images
|
||||||
|
({progress.progressPercentage.toFixed(1)}%)
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{progress.currentImageUrl && (
|
||||||
|
<div className="current-image">
|
||||||
|
<strong>Current:</strong>
|
||||||
|
<span className="image-url" title={progress.currentImageUrl}>
|
||||||
|
{progress.currentImageUrl.length > 60
|
||||||
|
? `...${progress.currentImageUrl.slice(-60)}`
|
||||||
|
: progress.currentImageUrl
|
||||||
|
}
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{/* Progress bar */}
|
||||||
|
<div className="progress-bar-container">
|
||||||
|
<div className="progress-bar">
|
||||||
|
<div
|
||||||
|
className="progress-bar-fill"
|
||||||
|
style={{ width: `${progress.progressPercentage}%` }}
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
<span className="progress-percentage">
|
||||||
|
{progress.progressPercentage.toFixed(1)}%
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
<style jsx>{`
|
||||||
|
.image-processing-progress {
|
||||||
|
background: #f8f9fa;
|
||||||
|
border: 1px solid #dee2e6;
|
||||||
|
border-radius: 4px;
|
||||||
|
padding: 1rem;
|
||||||
|
margin: 1rem 0;
|
||||||
|
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', sans-serif;
|
||||||
|
}
|
||||||
|
|
||||||
|
.progress-header {
|
||||||
|
display: flex;
|
||||||
|
justify-content: space-between;
|
||||||
|
align-items: center;
|
||||||
|
margin-bottom: 0.75rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.progress-header h4 {
|
||||||
|
margin: 0;
|
||||||
|
font-size: 1.1rem;
|
||||||
|
color: #495057;
|
||||||
|
}
|
||||||
|
|
||||||
|
.progress-content {
|
||||||
|
space-y: 0.5rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.status-text {
|
||||||
|
font-size: 0.9rem;
|
||||||
|
color: #6c757d;
|
||||||
|
margin-bottom: 0.5rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.progress-stats {
|
||||||
|
font-weight: 500;
|
||||||
|
color: #495057;
|
||||||
|
margin-bottom: 0.5rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.current-image {
|
||||||
|
font-size: 0.85rem;
|
||||||
|
color: #6c757d;
|
||||||
|
margin-bottom: 0.75rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.image-url {
|
||||||
|
font-family: 'Monaco', 'Menlo', 'Ubuntu Mono', monospace;
|
||||||
|
background: #e9ecef;
|
||||||
|
padding: 0.1rem 0.3rem;
|
||||||
|
border-radius: 2px;
|
||||||
|
margin-left: 0.5rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.progress-bar-container {
|
||||||
|
display: flex;
|
||||||
|
align-items: center;
|
||||||
|
gap: 0.75rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.progress-bar {
|
||||||
|
flex: 1;
|
||||||
|
height: 8px;
|
||||||
|
background: #e9ecef;
|
||||||
|
border-radius: 4px;
|
||||||
|
overflow: hidden;
|
||||||
|
}
|
||||||
|
|
||||||
|
.progress-bar-fill {
|
||||||
|
height: 100%;
|
||||||
|
background: linear-gradient(90deg, #007bff, #0056b3);
|
||||||
|
transition: width 0.3s ease;
|
||||||
|
}
|
||||||
|
|
||||||
|
.progress-percentage {
|
||||||
|
font-size: 0.85rem;
|
||||||
|
font-weight: 500;
|
||||||
|
color: #495057;
|
||||||
|
min-width: 3rem;
|
||||||
|
text-align: right;
|
||||||
|
}
|
||||||
|
|
||||||
|
.btn {
|
||||||
|
padding: 0.25rem 0.5rem;
|
||||||
|
border-radius: 3px;
|
||||||
|
border: 1px solid transparent;
|
||||||
|
cursor: pointer;
|
||||||
|
font-size: 0.8rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.btn-secondary {
|
||||||
|
background: #6c757d;
|
||||||
|
color: white;
|
||||||
|
border-color: #6c757d;
|
||||||
|
}
|
||||||
|
|
||||||
|
.btn-secondary:hover {
|
||||||
|
background: #5a6268;
|
||||||
|
border-color: #545b62;
|
||||||
|
}
|
||||||
|
|
||||||
|
.alert {
|
||||||
|
padding: 0.75rem;
|
||||||
|
border-radius: 4px;
|
||||||
|
margin-bottom: 0.5rem;
|
||||||
|
}
|
||||||
|
|
||||||
|
.alert-danger {
|
||||||
|
background: #f8d7da;
|
||||||
|
color: #721c24;
|
||||||
|
border: 1px solid #f5c6cb;
|
||||||
|
}
|
||||||
|
|
||||||
|
.alert-success {
|
||||||
|
background: #d4edda;
|
||||||
|
color: #155724;
|
||||||
|
border: 1px solid #c3e6cb;
|
||||||
|
}
|
||||||
|
`}</style>
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
};
|
||||||
|
|
||||||
|
export default ImageProcessingProgressComponent;
|
||||||
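The `ImageProcessingProgressComponent` above is driven entirely by an `ImageProcessingProgressTracker` imported from `../utils/imageProcessingProgress`, which this changeset does not show. A poll-based sketch that would satisfy the callback surface the component uses (`onProgress`, `onComplete`, `onError`, `start`, `stop`) is below; the endpoint path and the 1 s polling interval are assumptions, not confirmed by this diff:

```typescript
// Sketch only: a polling tracker matching the callbacks used by the component above.
// Field names mirror what the component reads from the progress object.
export interface ImageProcessingProgress {
  isProcessing: boolean;
  completed: boolean;
  error?: string;
  status: string;
  processedImages: number;
  totalImages: number;
  progressPercentage: number;
  currentImageUrl?: string;
}

export class ImageProcessingProgressTracker {
  private timer: ReturnType<typeof setInterval> | null = null;
  private progressCb?: (p: ImageProcessingProgress) => void;
  private completeCb?: (p: ImageProcessingProgress) => void;
  private errorCb?: (message: string) => void;

  constructor(private storyId: string) {}

  onProgress(cb: (p: ImageProcessingProgress) => void) { this.progressCb = cb; }
  onComplete(cb: (p: ImageProcessingProgress) => void) { this.completeCb = cb; }
  onError(cb: (message: string) => void) { this.errorCb = cb; }

  start() {
    // Poll a progress endpoint until the backend reports completion or an error.
    // The URL below is hypothetical.
    this.timer = setInterval(async () => {
      try {
        const res = await fetch(`/api/stories/${this.storyId}/image-processing-progress`);
        const progress: ImageProcessingProgress = await res.json();
        this.progressCb?.(progress);
        if (progress.error) {
          this.stop();
          this.errorCb?.(progress.error);
        } else if (progress.completed) {
          this.stop();
          this.completeCb?.(progress);
        }
      } catch (e) {
        this.stop();
        this.errorCb?.(e instanceof Error ? e.message : String(e));
      }
    }, 1000);
  }

  stop() {
    if (this.timer) {
      clearInterval(this.timer);
      this.timer = null;
    }
  }
}
```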
@@ -66,7 +66,7 @@ export default function MinimalLayout({
 
   const getSortDisplayText = () => {
     const sortLabels: Record<string, string> = {
-      lastRead: 'Last Read',
+      lastReadAt: 'Last Read',
       createdAt: 'Date Added',
       title: 'Title',
       authorName: 'Author',

@@ -122,8 +122,8 @@ export default function SidebarLayout({
             }}
             className="px-2 py-1 border rounded-lg theme-card border-gray-300 dark:border-gray-600 text-xs"
           >
-            <option value="lastRead_desc">Last Read ↓</option>
-            <option value="lastRead_asc">Last Read ↑</option>
+            <option value="lastReadAt_desc">Last Read ↓</option>
+            <option value="lastReadAt_asc">Last Read ↑</option>
             <option value="createdAt_desc">Date Added ↓</option>
             <option value="createdAt_asc">Date Added ↑</option>
             <option value="title_asc">Title ↑</option>

@@ -226,7 +226,7 @@ export default function SidebarLayout({
             onChange={(e) => onSortChange(e.target.value)}
             className="flex-1 px-3 py-2 border rounded-lg theme-card border-gray-300 dark:border-gray-600"
           >
-            <option value="lastRead">Last Read</option>
+            <option value="lastReadAt">Last Read</option>
             <option value="createdAt">Date Added</option>
             <option value="title">Title</option>
             <option value="authorName">Author</option>

@@ -110,8 +110,8 @@ export default function ToolbarLayout({
           }}
           className="w-full px-3 py-2 border rounded-lg theme-card border-gray-300 dark:border-gray-600 max-md:text-sm"
         >
-          <option value="lastRead_desc">Sort: Last Read ↓</option>
-          <option value="lastRead_asc">Sort: Last Read ↑</option>
+          <option value="lastReadAt_desc">Sort: Last Read ↓</option>
+          <option value="lastReadAt_asc">Sort: Last Read ↑</option>
           <option value="createdAt_desc">Sort: Date Added ↓</option>
           <option value="createdAt_asc">Sort: Date Added ↑</option>
           <option value="title_asc">Sort: Title ↑</option>
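The `lastRead` to `lastReadAt` rename in the three layouts keeps the option values aligned with the story field name (`lastReadAt`) used in the select values and sort handlers above. If the combined values are split into a field and direction before reaching the API, a parse along these lines keeps both halves consistent; this is a sketch, the actual handler is not shown in this diff:

```typescript
// Sketch: split a combined sort value such as "lastReadAt_desc" into field and direction.
type SortDirection = 'asc' | 'desc';

export function parseSortValue(value: string): { field: string; direction: SortDirection } {
  const idx = value.lastIndexOf('_');
  if (idx === -1) {
    // Bare field (e.g. "lastReadAt") — assume descending as a default.
    return { field: value, direction: 'desc' };
  }
  return {
    field: value.slice(0, idx),
    direction: value.slice(idx + 1) === 'asc' ? 'asc' : 'desc',
  };
}
```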
@@ -11,23 +11,25 @@ interface SystemSettingsProps {
 export default function SystemSettings({}: SystemSettingsProps) {
   const [searchEngineStatus, setSearchEngineStatus] = useState<{
     currentEngine: string;
-    openSearchAvailable: boolean;
+    solrAvailable: boolean;
     loading: boolean;
     message: string;
     success?: boolean;
   }>({
-    currentEngine: 'opensearch',
-    openSearchAvailable: false,
+    currentEngine: 'solr',
+    solrAvailable: false,
     loading: false,
     message: ''
   });
 
-  const [openSearchStatus, setOpenSearchStatus] = useState<{
+  const [solrStatus, setSolrStatus] = useState<{
     reindex: { loading: boolean; message: string; success?: boolean };
     recreate: { loading: boolean; message: string; success?: boolean };
+    migrate: { loading: boolean; message: string; success?: boolean };
   }>({
     reindex: { loading: false, message: '' },
-    recreate: { loading: false, message: '' }
+    recreate: { loading: false, message: '' },
+    migrate: { loading: false, message: '' }
   });
 
   const [databaseStatus, setDatabaseStatus] = useState<{
@@ -47,6 +49,25 @@ export default function SystemSettings({}: SystemSettingsProps) {
     execute: { loading: false, message: '' }
   });
 
+  const [hoveredImage, setHoveredImage] = useState<{ src: string; alt: string } | null>(null);
+  const [mousePosition, setMousePosition] = useState<{ x: number; y: number }>({ x: 0, y: 0 });
+
+  const handleImageHover = (filePath: string, fileName: string, event: React.MouseEvent) => {
+    // Convert backend file path to frontend image URL
+    const imageUrl = filePath.replace(/^.*\/images\//, '/images/');
+    setHoveredImage({ src: imageUrl, alt: fileName });
+    setMousePosition({ x: event.clientX, y: event.clientY });
+  };
+
+  const handleImageLeave = () => {
+    setHoveredImage(null);
+  };
+
+  const isImageFile = (fileName: string): boolean => {
+    const imageExtensions = ['.jpg', '.jpeg', '.png', '.gif', '.webp', '.bmp', '.svg'];
+    return imageExtensions.some(ext => fileName.toLowerCase().endsWith(ext));
+  };
+
   const handleCompleteBackup = async () => {
@@ -229,13 +250,7 @@ export default function SystemSettings({}: SystemSettingsProps) {
       }));
     }
 
-    // Clear message after 10 seconds
-    setTimeout(() => {
-      setCleanupStatus(prev => ({
-        ...prev,
-        preview: { loading: false, message: '', success: undefined }
-      }));
-    }, 10000);
+    // Note: Preview message no longer auto-clears to allow users to review file details
   };
 
   const handleImageCleanupExecute = async () => {
@@ -312,7 +327,7 @@ export default function SystemSettings({}: SystemSettingsProps) {
       setSearchEngineStatus(prev => ({
         ...prev,
         currentEngine: status.primaryEngine,
-        openSearchAvailable: status.openSearchAvailable,
+        solrAvailable: status.solrAvailable,
       }));
     } catch (error: any) {
       console.error('Failed to load search engine status:', error);
@@ -321,16 +336,16 @@ export default function SystemSettings({}: SystemSettingsProps) {
|
|||||||
|
|
||||||
|
|
||||||
|
|
||||||
const handleOpenSearchReindex = async () => {
|
const handleSolrReindex = async () => {
|
||||||
setOpenSearchStatus(prev => ({
|
setSolrStatus(prev => ({
|
||||||
...prev,
|
...prev,
|
||||||
reindex: { loading: true, message: 'Reindexing OpenSearch...', success: undefined }
|
reindex: { loading: true, message: 'Reindexing Solr...', success: undefined }
|
||||||
}));
|
}));
|
||||||
|
|
||||||
try {
|
try {
|
||||||
const result = await searchAdminApi.reindexOpenSearch();
|
const result = await searchAdminApi.reindexSolr();
|
||||||
|
|
||||||
setOpenSearchStatus(prev => ({
|
setSolrStatus(prev => ({
|
||||||
...prev,
|
...prev,
|
||||||
reindex: {
|
reindex: {
|
||||||
loading: false,
|
loading: false,
|
||||||
@@ -340,13 +355,13 @@ export default function SystemSettings({}: SystemSettingsProps) {
|
|||||||
}));
|
}));
|
||||||
|
|
||||||
setTimeout(() => {
|
setTimeout(() => {
|
||||||
setOpenSearchStatus(prev => ({
|
setSolrStatus(prev => ({
|
||||||
...prev,
|
...prev,
|
||||||
reindex: { loading: false, message: '', success: undefined }
|
reindex: { loading: false, message: '', success: undefined }
|
||||||
}));
|
}));
|
||||||
}, 8000);
|
}, 8000);
|
||||||
} catch (error: any) {
|
} catch (error: any) {
|
||||||
setOpenSearchStatus(prev => ({
|
setSolrStatus(prev => ({
|
||||||
...prev,
|
...prev,
|
||||||
reindex: {
|
reindex: {
|
||||||
loading: false,
|
loading: false,
|
||||||
@@ -356,7 +371,7 @@ export default function SystemSettings({}: SystemSettingsProps) {
|
|||||||
}));
|
}));
|
||||||
|
|
||||||
setTimeout(() => {
|
setTimeout(() => {
|
||||||
setOpenSearchStatus(prev => ({
|
setSolrStatus(prev => ({
|
||||||
...prev,
|
...prev,
|
||||||
reindex: { loading: false, message: '', success: undefined }
|
reindex: { loading: false, message: '', success: undefined }
|
||||||
}));
|
}));
|
||||||
@@ -364,16 +379,16 @@ export default function SystemSettings({}: SystemSettingsProps) {
|
|||||||
}
|
}
|
||||||
};
|
};
|
||||||
|
|
||||||
const handleOpenSearchRecreate = async () => {
|
const handleSolrRecreate = async () => {
|
||||||
setOpenSearchStatus(prev => ({
|
setSolrStatus(prev => ({
|
||||||
...prev,
|
...prev,
|
||||||
recreate: { loading: true, message: 'Recreating OpenSearch indices...', success: undefined }
|
recreate: { loading: true, message: 'Recreating Solr indices...', success: undefined }
|
||||||
}));
|
}));
|
||||||
|
|
||||||
try {
|
try {
|
||||||
const result = await searchAdminApi.recreateOpenSearchIndices();
|
const result = await searchAdminApi.recreateSolrIndices();
|
||||||
|
|
||||||
setOpenSearchStatus(prev => ({
|
setSolrStatus(prev => ({
|
||||||
...prev,
|
...prev,
|
||||||
recreate: {
|
recreate: {
|
||||||
loading: false,
|
loading: false,
|
||||||
@@ -383,13 +398,13 @@ export default function SystemSettings({}: SystemSettingsProps) {
|
|||||||
}));
|
}));
|
||||||
|
|
||||||
setTimeout(() => {
|
setTimeout(() => {
|
||||||
setOpenSearchStatus(prev => ({
|
setSolrStatus(prev => ({
|
||||||
...prev,
|
...prev,
|
||||||
recreate: { loading: false, message: '', success: undefined }
|
recreate: { loading: false, message: '', success: undefined }
|
||||||
}));
|
}));
|
||||||
}, 8000);
|
}, 8000);
|
||||||
} catch (error: any) {
|
} catch (error: any) {
|
||||||
setOpenSearchStatus(prev => ({
|
setSolrStatus(prev => ({
|
||||||
...prev,
|
...prev,
|
||||||
recreate: {
|
recreate: {
|
||||||
loading: false,
|
loading: false,
|
||||||
@@ -399,7 +414,7 @@ export default function SystemSettings({}: SystemSettingsProps) {
|
|||||||
}));
|
}));
|
||||||
|
|
||||||
setTimeout(() => {
|
setTimeout(() => {
|
||||||
setOpenSearchStatus(prev => ({
|
setSolrStatus(prev => ({
|
||||||
...prev,
|
...prev,
|
||||||
recreate: { loading: false, message: '', success: undefined }
|
recreate: { loading: false, message: '', success: undefined }
|
||||||
}));
|
}));
|
||||||
@@ -407,6 +422,57 @@ export default function SystemSettings({}: SystemSettingsProps) {
     }
   };
 
+  const handleLibraryMigration = async () => {
+    const confirmed = window.confirm(
+      'This will migrate Solr to support library separation. It will clear existing search data and reindex with library context. Continue?'
+    );
+
+    if (!confirmed) return;
+
+    setSolrStatus(prev => ({
+      ...prev,
+      migrate: { loading: true, message: 'Migrating to library-aware schema...', success: undefined }
+    }));
+
+    try {
+      const result = await searchAdminApi.migrateLibrarySchema();
+
+      setSolrStatus(prev => ({
+        ...prev,
+        migrate: {
+          loading: false,
+          message: result.success
+            ? `${result.message}${result.note ? ` Note: ${result.note}` : ''}`
+            : (result.error || result.details || 'Migration failed'),
+          success: result.success
+        }
+      }));
+
+      setTimeout(() => {
+        setSolrStatus(prev => ({
+          ...prev,
+          migrate: { loading: false, message: '', success: undefined }
+        }));
+      }, 10000); // Longer timeout for migration messages
+    } catch (error: any) {
+      setSolrStatus(prev => ({
+        ...prev,
+        migrate: {
+          loading: false,
+          message: error.message || 'Network error occurred',
+          success: false
+        }
+      }));
+
+      setTimeout(() => {
+        setSolrStatus(prev => ({
+          ...prev,
+          migrate: { loading: false, message: '', success: undefined }
+        }));
+      }, 10000);
+    }
+  };
+
   // Load status on component mount
   useEffect(() => {
     loadSearchEngineStatus();
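`handleLibraryMigration` above relies on `searchAdminApi.migrateLibrarySchema()`, whose wire format is not visible in this changeset. Judging only from how the handler consumes the result (`success`, `message`, `note`, `error`, `details`), a client sketch could look like the following; the endpoint URL is a guess, not taken from this diff:

```typescript
// Sketch only — response fields mirror how handleLibraryMigration reads the result.
// The request path is hypothetical.
export interface MigrateLibrarySchemaResult {
  success: boolean;
  message?: string;
  note?: string;
  error?: string;
  details?: string;
}

export async function migrateLibrarySchema(): Promise<MigrateLibrarySchemaResult> {
  const res = await fetch('/api/admin/search/migrate-library-schema', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
  });
  if (!res.ok) {
    // Surface HTTP-level failures in the same shape the component expects.
    return { success: false, error: `Migration request failed: ${res.status}` };
  }
  return res.json();
}
```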
@@ -418,7 +484,7 @@ export default function SystemSettings({}: SystemSettingsProps) {
       <div className="theme-card theme-shadow rounded-lg p-6">
         <h2 className="text-xl font-semibold theme-header mb-4">Search Management</h2>
         <p className="theme-text mb-6">
-          Manage OpenSearch indices for stories and authors. Use these tools if search isn't returning expected results.
+          Manage Solr indices for stories and authors. Use these tools if search isn't returning expected results.
         </p>
 
         <div className="space-y-6">
@@ -427,9 +493,9 @@ export default function SystemSettings({}: SystemSettingsProps) {
|
|||||||
<h3 className="text-lg font-semibold theme-header mb-3">Search Status</h3>
|
<h3 className="text-lg font-semibold theme-header mb-3">Search Status</h3>
|
||||||
<div className="grid grid-cols-1 sm:grid-cols-2 gap-3 text-sm">
|
<div className="grid grid-cols-1 sm:grid-cols-2 gap-3 text-sm">
|
||||||
<div className="flex justify-between">
|
<div className="flex justify-between">
|
||||||
<span>OpenSearch:</span>
|
<span>Solr:</span>
|
||||||
<span className={`font-medium ${searchEngineStatus.openSearchAvailable ? 'text-green-600 dark:text-green-400' : 'text-red-600 dark:text-red-400'}`}>
|
<span className={`font-medium ${searchEngineStatus.solrAvailable ? 'text-green-600 dark:text-green-400' : 'text-red-600 dark:text-red-400'}`}>
|
||||||
{searchEngineStatus.openSearchAvailable ? 'Available' : 'Unavailable'}
|
{searchEngineStatus.solrAvailable ? 'Available' : 'Unavailable'}
|
||||||
</span>
|
</span>
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
@@ -444,43 +510,70 @@ export default function SystemSettings({}: SystemSettingsProps) {
|
|||||||
|
|
||||||
<div className="flex flex-col sm:flex-row gap-3 mb-4">
|
<div className="flex flex-col sm:flex-row gap-3 mb-4">
|
||||||
<Button
|
<Button
|
||||||
onClick={handleOpenSearchReindex}
|
onClick={handleSolrReindex}
|
||||||
disabled={openSearchStatus.reindex.loading || openSearchStatus.recreate.loading || !searchEngineStatus.openSearchAvailable}
|
disabled={solrStatus.reindex.loading || solrStatus.recreate.loading || solrStatus.migrate.loading || !searchEngineStatus.solrAvailable}
|
||||||
loading={openSearchStatus.reindex.loading}
|
loading={solrStatus.reindex.loading}
|
||||||
variant="ghost"
|
variant="ghost"
|
||||||
className="flex-1"
|
className="flex-1"
|
||||||
>
|
>
|
||||||
{openSearchStatus.reindex.loading ? 'Reindexing...' : '🔄 Reindex All'}
|
{solrStatus.reindex.loading ? 'Reindexing...' : '🔄 Reindex All'}
|
||||||
</Button>
|
</Button>
|
||||||
<Button
|
<Button
|
||||||
onClick={handleOpenSearchRecreate}
|
onClick={handleSolrRecreate}
|
||||||
disabled={openSearchStatus.reindex.loading || openSearchStatus.recreate.loading || !searchEngineStatus.openSearchAvailable}
|
disabled={solrStatus.reindex.loading || solrStatus.recreate.loading || solrStatus.migrate.loading || !searchEngineStatus.solrAvailable}
|
||||||
loading={openSearchStatus.recreate.loading}
|
loading={solrStatus.recreate.loading}
|
||||||
variant="secondary"
|
variant="secondary"
|
||||||
className="flex-1"
|
className="flex-1"
|
||||||
>
|
>
|
||||||
{openSearchStatus.recreate.loading ? 'Recreating...' : '🏗️ Recreate Indices'}
|
{solrStatus.recreate.loading ? 'Recreating...' : '🏗️ Recreate Indices'}
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Library Migration Section */}
|
||||||
|
<div className="border-t theme-border pt-4">
|
||||||
|
<h4 className="text-md font-medium theme-header mb-2">Library Separation Migration</h4>
|
||||||
|
<p className="text-sm theme-text mb-3">
|
||||||
|
Migrate Solr to support proper library separation. This ensures search results are isolated between different libraries (password-based access).
|
||||||
|
</p>
|
||||||
|
<Button
|
||||||
|
onClick={handleLibraryMigration}
|
||||||
|
disabled={solrStatus.reindex.loading || solrStatus.recreate.loading || solrStatus.migrate.loading || !searchEngineStatus.solrAvailable}
|
||||||
|
loading={solrStatus.migrate.loading}
|
||||||
|
variant="primary"
|
||||||
|
className="w-full sm:w-auto"
|
||||||
|
>
|
||||||
|
{solrStatus.migrate.loading ? 'Migrating...' : '🔒 Migrate Library Schema'}
|
||||||
</Button>
|
</Button>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
{/* Status Messages */}
|
{/* Status Messages */}
|
||||||
{openSearchStatus.reindex.message && (
|
{solrStatus.reindex.message && (
|
||||||
<div className={`text-sm p-3 rounded mb-3 ${
|
<div className={`text-sm p-3 rounded mb-3 ${
|
||||||
openSearchStatus.reindex.success
|
solrStatus.reindex.success
|
||||||
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
||||||
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
||||||
}`}>
|
}`}>
|
||||||
{openSearchStatus.reindex.message}
|
{solrStatus.reindex.message}
|
||||||
</div>
|
</div>
|
||||||
)}
|
)}
|
||||||
|
|
||||||
{openSearchStatus.recreate.message && (
|
{solrStatus.recreate.message && (
|
||||||
<div className={`text-sm p-3 rounded mb-3 ${
|
<div className={`text-sm p-3 rounded mb-3 ${
|
||||||
openSearchStatus.recreate.success
|
solrStatus.recreate.success
|
||||||
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
||||||
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
||||||
}`}>
|
}`}>
|
||||||
{openSearchStatus.recreate.message}
|
{solrStatus.recreate.message}
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{solrStatus.migrate.message && (
|
||||||
|
<div className={`text-sm p-3 rounded mb-3 ${
|
||||||
|
solrStatus.migrate.success
|
||||||
|
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
||||||
|
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
||||||
|
}`}>
|
||||||
|
{solrStatus.migrate.message}
|
||||||
</div>
|
</div>
|
||||||
)}
|
)}
|
||||||
</div>
|
</div>
|
||||||
@@ -490,7 +583,12 @@ export default function SystemSettings({}: SystemSettingsProps) {
             <ul className="text-xs space-y-1 ml-4">
               <li>• <strong>Reindex All:</strong> Refresh all search data while keeping existing schemas (fixes data sync issues)</li>
               <li>• <strong>Recreate Indices:</strong> Delete and rebuild all search indexes from scratch (fixes schema and structure issues)</li>
+              <li>• <strong>Migrate Library Schema:</strong> One-time migration to enable library separation (isolates search results by library)</li>
             </ul>
+            <div className="mt-2 pt-2 border-t border-blue-200 dark:border-blue-700">
+              <p className="font-medium text-xs">⚠️ Library Migration:</p>
+              <p className="text-xs">Only run this once to enable library-aware search. Requires Solr schema to support libraryId field.</p>
+            </div>
           </div>
         </div>
       </div>
@@ -529,6 +627,18 @@ export default function SystemSettings({}: SystemSettingsProps) {
|
|||||||
>
|
>
|
||||||
{cleanupStatus.execute.loading ? 'Cleaning...' : 'Execute Cleanup'}
|
{cleanupStatus.execute.loading ? 'Cleaning...' : 'Execute Cleanup'}
|
||||||
</Button>
|
</Button>
|
||||||
|
{cleanupStatus.preview.message && (
|
||||||
|
<Button
|
||||||
|
onClick={() => setCleanupStatus(prev => ({
|
||||||
|
...prev,
|
||||||
|
preview: { loading: false, message: '', success: undefined, data: undefined }
|
||||||
|
}))}
|
||||||
|
variant="ghost"
|
||||||
|
className="px-4 py-2 text-sm"
|
||||||
|
>
|
||||||
|
Clear Preview
|
||||||
|
</Button>
|
||||||
|
)}
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
{/* Preview Results */}
|
{/* Preview Results */}
|
||||||
@@ -582,6 +692,76 @@ export default function SystemSettings({}: SystemSettingsProps) {
|
|||||||
<span className="font-medium">Referenced Images:</span> {cleanupStatus.preview.data.referencedImagesCount}
|
<span className="font-medium">Referenced Images:</span> {cleanupStatus.preview.data.referencedImagesCount}
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
|
{/* Detailed File List */}
|
||||||
|
{cleanupStatus.preview.data.orphanedFiles && cleanupStatus.preview.data.orphanedFiles.length > 0 && (
|
||||||
|
<div className="mt-4">
|
||||||
|
<details className="cursor-pointer">
|
||||||
|
<summary className="font-medium text-sm theme-header mb-2">
|
||||||
|
📁 View Files to be Deleted ({cleanupStatus.preview.data.orphanedFiles.length})
|
||||||
|
</summary>
|
||||||
|
<div className="mt-3 max-h-96 overflow-y-auto border theme-border rounded">
|
||||||
|
<table className="w-full text-xs">
|
||||||
|
<thead className="bg-gray-100 dark:bg-gray-800 sticky top-0">
|
||||||
|
<tr>
|
||||||
|
<th className="text-left p-2 font-medium">File Name</th>
|
||||||
|
<th className="text-left p-2 font-medium">Size</th>
|
||||||
|
<th className="text-left p-2 font-medium">Story</th>
|
||||||
|
<th className="text-left p-2 font-medium">Status</th>
|
||||||
|
</tr>
|
||||||
|
</thead>
|
||||||
|
<tbody>
|
||||||
|
{cleanupStatus.preview.data.orphanedFiles.map((file: any, index: number) => (
|
||||||
|
<tr key={index} className="border-t theme-border hover:bg-gray-50 dark:hover:bg-gray-800">
|
||||||
|
<td className="p-2">
|
||||||
|
<div
|
||||||
|
className={`truncate max-w-xs ${isImageFile(file.fileName) ? 'cursor-pointer text-blue-600 dark:text-blue-400' : ''}`}
|
||||||
|
title={file.fileName}
|
||||||
|
onMouseEnter={isImageFile(file.fileName) ? (e) => handleImageHover(file.filePath, file.fileName, e) : undefined}
|
||||||
|
onMouseMove={isImageFile(file.fileName) ? (e) => setMousePosition({ x: e.clientX, y: e.clientY }) : undefined}
|
||||||
|
onMouseLeave={isImageFile(file.fileName) ? handleImageLeave : undefined}
|
||||||
|
>
|
||||||
|
{isImageFile(file.fileName) && '🖼️ '}{file.fileName}
|
||||||
|
</div>
|
||||||
|
<div className="text-xs text-gray-500 truncate max-w-xs" title={file.filePath}>
|
||||||
|
{file.filePath}
|
||||||
|
</div>
|
||||||
|
</td>
|
||||||
|
<td className="p-2">{file.formattedSize}</td>
|
||||||
|
<td className="p-2">
|
||||||
|
{file.storyExists && file.storyTitle ? (
|
||||||
|
<a
|
||||||
|
href={`/stories/${file.storyId}`}
|
||||||
|
className="text-blue-600 dark:text-blue-400 hover:underline truncate max-w-xs block"
|
||||||
|
title={file.storyTitle}
|
||||||
|
>
|
||||||
|
{file.storyTitle}
|
||||||
|
</a>
|
||||||
|
) : file.storyId !== 'unknown' && file.storyId !== 'error' ? (
|
||||||
|
<span className="text-gray-500" title={`Story ID: ${file.storyId}`}>
|
||||||
|
Deleted Story
|
||||||
|
</span>
|
||||||
|
) : (
|
||||||
|
<span className="text-gray-400">Unknown</span>
|
||||||
|
)}
|
||||||
|
</td>
|
||||||
|
<td className="p-2">
|
||||||
|
{file.storyExists ? (
|
||||||
|
<span className="text-orange-600 dark:text-orange-400 text-xs">Orphaned</span>
|
||||||
|
) : file.storyId !== 'unknown' && file.storyId !== 'error' ? (
|
||||||
|
<span className="text-red-600 dark:text-red-400 text-xs">Story Deleted</span>
|
||||||
|
) : (
|
||||||
|
<span className="text-gray-500 text-xs">Unknown Folder</span>
|
||||||
|
)}
|
||||||
|
</td>
|
||||||
|
</tr>
|
||||||
|
))}
|
||||||
|
</tbody>
|
||||||
|
</table>
|
||||||
|
</div>
|
||||||
|
</details>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
</div>
|
</div>
|
||||||
)}
|
)}
|
||||||
</div>
|
</div>
|
||||||
@@ -702,6 +882,31 @@ export default function SystemSettings({}: SystemSettingsProps) {
|
|||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
|
{/* Image Preview Overlay */}
|
||||||
|
{hoveredImage && (
|
||||||
|
<div
|
||||||
|
className="fixed pointer-events-none z-50 bg-white dark:bg-gray-900 border border-gray-300 dark:border-gray-600 rounded-lg shadow-xl p-2 max-w-sm"
|
||||||
|
style={{
|
||||||
|
left: mousePosition.x + 10,
|
||||||
|
top: mousePosition.y - 100,
|
||||||
|
transform: mousePosition.x > window.innerWidth - 300 ? 'translateX(-100%)' : 'none'
|
||||||
|
}}
|
||||||
|
>
|
||||||
|
<img
|
||||||
|
src={hoveredImage.src}
|
||||||
|
alt={hoveredImage.alt}
|
||||||
|
className="max-w-full max-h-64 object-contain rounded"
|
||||||
|
onError={() => {
|
||||||
|
// Hide preview if image fails to load
|
||||||
|
setHoveredImage(null);
|
||||||
|
}}
|
||||||
|
/>
|
||||||
|
<div className="text-xs theme-text mt-1 truncate">
|
||||||
|
{hoveredImage.alt}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
</div>
|
</div>
|
||||||
);
|
);
|
||||||
}
|
}
|
||||||
@@ -1,610 +0,0 @@
|
|||||||
'use client';
|
|
||||||
|
|
||||||
import React, { useState, useEffect, useCallback, useRef } from 'react';
|
|
||||||
import { PortableText } from '@portabletext/react';
|
|
||||||
import type { PortableTextBlock } from '@portabletext/types';
|
|
||||||
import Button from '../ui/Button';
|
|
||||||
import { Textarea } from '../ui/Input';
|
|
||||||
import { sanitizeHtmlSync } from '../../lib/sanitization';
|
|
||||||
import { storyApi } from '../../lib/api';
|
|
||||||
import {
|
|
||||||
htmlToPortableText,
|
|
||||||
portableTextToHtml,
|
|
||||||
parseHtmlToBlocks
|
|
||||||
} from '../../lib/portabletext/conversion';
|
|
||||||
import {
|
|
||||||
createTextBlock,
|
|
||||||
createImageBlock,
|
|
||||||
emptyPortableTextContent,
|
|
||||||
portableTextSchema
|
|
||||||
} from '../../lib/portabletext/schema';
|
|
||||||
import type { CustomPortableTextBlock } from '../../lib/portabletext/schema';
|
|
||||||
|
|
||||||
interface PortableTextEditorProps {
|
|
||||||
value: string; // HTML value for compatibility
|
|
||||||
onChange: (value: string) => void; // Returns HTML for compatibility
|
|
||||||
placeholder?: string;
|
|
||||||
error?: string;
|
|
||||||
storyId?: string;
|
|
||||||
enableImageProcessing?: boolean;
|
|
||||||
}
|
|
||||||
|
|
||||||
export default function PortableTextEditor({
|
|
||||||
value,
|
|
||||||
onChange,
|
|
||||||
placeholder = 'Write your story here...',
|
|
||||||
error,
|
|
||||||
storyId,
|
|
||||||
enableImageProcessing = false
|
|
||||||
}: PortableTextEditorProps) {
|
|
||||||
console.log('🎯 PortableTextEditor loaded!', { value: value?.length, enableImageProcessing });
|
|
||||||
const [viewMode, setViewMode] = useState<'visual' | 'html'>('visual');
|
|
||||||
const [portableTextValue, setPortableTextValue] = useState<CustomPortableTextBlock[]>(emptyPortableTextContent);
|
|
||||||
const [htmlValue, setHtmlValue] = useState(value);
|
|
||||||
const [isMaximized, setIsMaximized] = useState(false);
|
|
||||||
const [containerHeight, setContainerHeight] = useState(300);
|
|
||||||
const containerRef = useRef<HTMLDivElement>(null);
|
|
||||||
const editableRef = useRef<HTMLDivElement>(null);
|
|
||||||
|
|
||||||
// Image processing state
|
|
||||||
const [imageProcessingQueue, setImageProcessingQueue] = useState<string[]>([]);
|
|
||||||
const [processedImages, setProcessedImages] = useState<Set<string>>(new Set());
|
|
||||||
const [imageWarnings, setImageWarnings] = useState<string[]>([]);
|
|
||||||
const imageProcessingTimeoutRef = useRef<NodeJS.Timeout | null>(null);
|
|
||||||
|
|
||||||
// Initialize Portable Text content from HTML value
|
|
||||||
useEffect(() => {
|
|
||||||
if (value && value !== htmlValue) {
|
|
||||||
const blocks = parseHtmlToBlocks(value);
|
|
||||||
setPortableTextValue(blocks);
|
|
||||||
setHtmlValue(value);
|
|
||||||
}
|
|
||||||
}, [value]);
|
|
||||||
|
|
||||||
// Convert Portable Text to HTML when content changes
|
|
||||||
const updateHtmlFromPortableText = useCallback((blocks: CustomPortableTextBlock[]) => {
|
|
||||||
const html = portableTextToHtml(blocks);
|
|
||||||
setHtmlValue(html);
|
|
||||||
onChange(html);
|
|
||||||
}, [onChange]);
|
|
||||||
|
|
||||||
// Image processing functionality (maintained from original)
|
|
||||||
const findImageUrlsInHtml = (html: string): string[] => {
|
|
||||||
const imgRegex = /<img[^>]+src=["']([^"']+)["'][^>]*>/gi;
|
|
||||||
const urls: string[] = [];
|
|
||||||
let match;
|
|
||||||
while ((match = imgRegex.exec(html)) !== null) {
|
|
||||||
const url = match[1];
|
|
||||||
if (!url.startsWith('/') && !url.startsWith('data:')) {
|
|
||||||
urls.push(url);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
return urls;
|
|
||||||
};
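The removed editor above extracted external image URLs from pasted HTML with a regex (`findImageUrlsInHtml`). For reference, a DOM-based extraction sidesteps attribute-quoting edge cases; this is a browser-only sketch, not code from the repository:

```typescript
// Sketch: extract external image URLs from an HTML string via DOMParser
// instead of a regex. Mirrors the same "skip local and data: URLs" rule.
function findExternalImageUrls(html: string): string[] {
  const doc = new DOMParser().parseFromString(html, 'text/html');
  return Array.from(doc.querySelectorAll('img'))
    .map(img => img.getAttribute('src') ?? '')
    .filter(src => src !== '' && !src.startsWith('/') && !src.startsWith('data:'));
}
```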
|
|
||||||
|
|
||||||
const processContentImagesDebounced = useCallback(async (content: string) => {
|
|
||||||
if (!enableImageProcessing || !storyId) return;
|
|
||||||
|
|
||||||
const imageUrls = findImageUrlsInHtml(content);
|
|
||||||
if (imageUrls.length === 0) return;
|
|
||||||
|
|
||||||
const newUrls = imageUrls.filter(url => !processedImages.has(url));
|
|
||||||
if (newUrls.length === 0) return;
|
|
||||||
|
|
||||||
setImageProcessingQueue(prev => [...prev, ...newUrls]);
|
|
||||||
|
|
||||||
try {
|
|
||||||
const result = await storyApi.processContentImages(storyId, content);
|
|
||||||
setProcessedImages(prev => new Set([...Array.from(prev), ...newUrls]));
|
|
||||||
setImageProcessingQueue(prev => prev.filter(url => !newUrls.includes(url)));
|
|
||||||
|
|
||||||
if (result.processedContent !== content) {
|
|
||||||
const newBlocks = parseHtmlToBlocks(result.processedContent);
|
|
||||||
setPortableTextValue(newBlocks);
|
|
||||||
onChange(result.processedContent);
|
|
||||||
setHtmlValue(result.processedContent);
|
|
||||||
}
|
|
||||||
|
|
||||||
if (result.hasWarnings && result.warnings) {
|
|
||||||
setImageWarnings(prev => [...prev, ...result.warnings!]);
|
|
||||||
}
|
|
||||||
} catch (error) {
|
|
||||||
console.error('Failed to process content images:', error);
|
|
||||||
setImageProcessingQueue(prev => prev.filter(url => !newUrls.includes(url)));
|
|
||||||
const errorMessage = error instanceof Error ? error.message : String(error);
|
|
||||||
setImageWarnings(prev => [...prev, `Failed to process some images: ${errorMessage}`]);
|
|
||||||
}
|
|
||||||
}, [enableImageProcessing, storyId, processedImages, onChange]);
|
|
||||||
|
|
||||||
const triggerImageProcessing = useCallback((content: string) => {
|
|
||||||
if (!enableImageProcessing || !storyId) return;
|
|
||||||
|
|
||||||
if (imageProcessingTimeoutRef.current) {
|
|
||||||
clearTimeout(imageProcessingTimeoutRef.current);
|
|
||||||
}
|
|
||||||
|
|
||||||
imageProcessingTimeoutRef.current = setTimeout(() => {
|
|
||||||
processContentImagesDebounced(content);
|
|
||||||
}, 2000);
|
|
||||||
}, [enableImageProcessing, storyId, processContentImagesDebounced]);
|
|
||||||
|
|
||||||
// Toolbar functionality
|
|
||||||
const insertTextWithFormat = (format: string) => {
|
|
||||||
const newBlock = createTextBlock('New ' + format, format === 'normal' ? 'normal' : format);
|
|
||||||
const newBlocks = [...portableTextValue, newBlock];
|
|
||||||
setPortableTextValue(newBlocks);
|
|
||||||
updateHtmlFromPortableText(newBlocks);
|
|
||||||
};
|
|
||||||
|
|
||||||
const formatText = useCallback((format: string) => {
|
|
||||||
if (viewMode === 'visual') {
|
|
||||||
// In visual mode, add a new formatted block
|
|
||||||
insertTextWithFormat(format);
|
|
||||||
} else {
|
|
||||||
// HTML mode - maintain original functionality
|
|
||||||
const textarea = document.querySelector('textarea') as HTMLTextAreaElement;
|
|
||||||
if (!textarea) return;
|
|
||||||
|
|
||||||
const start = textarea.selectionStart;
|
|
||||||
const end = textarea.selectionEnd;
|
|
||||||
const selectedText = htmlValue.substring(start, end);
|
|
||||||
|
|
||||||
if (selectedText) {
|
|
||||||
const beforeText = htmlValue.substring(0, start);
|
|
||||||
const afterText = htmlValue.substring(end);
|
|
||||||
const formattedText = `<${format}>${selectedText}</${format}>`;
|
|
||||||
const newValue = beforeText + formattedText + afterText;
|
|
||||||
|
|
||||||
setHtmlValue(newValue);
|
|
||||||
onChange(newValue);
|
|
||||||
|
|
||||||
setTimeout(() => {
|
|
||||||
textarea.focus();
|
|
||||||
textarea.setSelectionRange(start, start + formattedText.length);
|
|
||||||
}, 0);
|
|
||||||
} else {
|
|
||||||
const template = format === 'h1' ? '<h1>Heading 1</h1>' :
|
|
||||||
format === 'h2' ? '<h2>Heading 2</h2>' :
|
|
||||||
format === 'h3' ? '<h3>Heading 3</h3>' :
|
|
||||||
format === 'h4' ? '<h4>Heading 4</h4>' :
|
|
||||||
format === 'h5' ? '<h5>Heading 5</h5>' :
|
|
||||||
format === 'h6' ? '<h6>Heading 6</h6>' :
|
|
||||||
`<${format}>Formatted text</${format}>`;
|
|
||||||
|
|
||||||
const newValue = htmlValue.substring(0, start) + template + htmlValue.substring(start);
|
|
||||||
setHtmlValue(newValue);
|
|
||||||
onChange(newValue);
|
|
||||||
|
|
||||||
setTimeout(() => {
|
|
||||||
const tagLength = `<${format}>`.length;
|
|
||||||
const newPosition = start + tagLength;
|
|
||||||
textarea.focus();
|
|
||||||
textarea.setSelectionRange(newPosition, newPosition + (template.includes('Heading') ? template.split('>')[1].split('<')[0].length : 'Formatted text'.length));
|
|
||||||
}, 0);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}, [viewMode, htmlValue, onChange, portableTextValue, updateHtmlFromPortableText]);
|
|
||||||
|
|
||||||
// Handle HTML mode changes
|
|
||||||
const handleHtmlChange = (e: React.ChangeEvent<HTMLTextAreaElement>) => {
|
|
||||||
const html = e.target.value;
|
|
||||||
setHtmlValue(html);
|
|
||||||
onChange(html);
|
|
||||||
|
|
||||||
// Update Portable Text representation
|
|
||||||
const blocks = parseHtmlToBlocks(html);
|
|
||||||
setPortableTextValue(blocks);
|
|
||||||
|
|
||||||
triggerImageProcessing(html);
|
|
||||||
};
|
|
||||||
|
|
||||||
// Handle visual mode content changes
|
|
||||||
const handleVisualContentChange = () => {
|
|
||||||
if (editableRef.current) {
|
|
||||||
const html = editableRef.current.innerHTML;
|
|
||||||
const blocks = parseHtmlToBlocks(html);
|
|
||||||
setPortableTextValue(blocks);
|
|
||||||
updateHtmlFromPortableText(blocks);
|
|
||||||
triggerImageProcessing(html);
|
|
||||||
}
|
|
||||||
};
|
|
||||||
|
|
||||||
// Paste handling
|
|
||||||
const handlePaste = async (e: React.ClipboardEvent<HTMLDivElement>) => {
|
|
||||||
if (viewMode !== 'visual') return;
|
|
||||||
|
|
||||||
e.preventDefault();
|
|
||||||
|
|
||||||
try {
|
|
||||||
const clipboardData = e.clipboardData;
|
|
||||||
let htmlContent = '';
|
|
||||||
let plainText = '';
|
|
||||||
|
|
||||||
try {
|
|
||||||
htmlContent = clipboardData.getData('text/html');
|
|
||||||
plainText = clipboardData.getData('text/plain');
|
|
||||||
} catch (e) {
|
|
||||||
console.log('Direct getData failed:', e);
|
|
||||||
}
|
|
||||||
|
|
||||||
if (htmlContent && htmlContent.trim().length > 0) {
|
|
||||||
let processedHtml = htmlContent;
|
|
||||||
|
|
||||||
if (enableImageProcessing && storyId) {
|
|
||||||
const hasImages = /<img[^>]+src=['"'][^'"']*['"][^>]*>/i.test(htmlContent);
|
|
||||||
if (hasImages) {
|
|
||||||
try {
|
|
||||||
const result = await storyApi.processContentImages(storyId, htmlContent);
|
|
||||||
processedHtml = result.processedContent;
|
|
||||||
|
|
||||||
if (result.downloadedImages && result.downloadedImages.length > 0) {
|
|
||||||
setProcessedImages(prev => new Set([...Array.from(prev), ...result.downloadedImages]));
|
|
||||||
}
|
|
||||||
if (result.warnings && result.warnings.length > 0) {
|
|
||||||
setImageWarnings(prev => [...prev, ...result.warnings!]);
|
|
||||||
}
|
|
||||||
} catch (error) {
|
|
||||||
console.error('Image processing failed during paste:', error);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
const sanitizedHtml = sanitizeHtmlSync(processedHtml);
|
|
||||||
const blocks = parseHtmlToBlocks(sanitizedHtml);
|
|
||||||
|
|
||||||
// Insert at current position
|
|
||||||
const newBlocks = [...portableTextValue, ...blocks];
|
|
||||||
setPortableTextValue(newBlocks);
|
|
||||||
updateHtmlFromPortableText(newBlocks);
|
|
||||||
|
|
||||||
} else if (plainText && plainText.trim().length > 0) {
|
|
||||||
const textBlocks = plainText
|
|
||||||
.split('\n\n')
|
|
||||||
.filter(p => p.trim())
|
|
||||||
.map(p => createTextBlock(p.trim()));
|
|
||||||
|
|
||||||
const newBlocks = [...portableTextValue, ...textBlocks];
|
|
||||||
setPortableTextValue(newBlocks);
|
|
||||||
updateHtmlFromPortableText(newBlocks);
|
|
||||||
}
|
|
||||||
} catch (error) {
|
|
||||||
console.error('Error handling paste:', error);
|
|
||||||
}
|
|
||||||
};
|
|
||||||
|
|
||||||
// Maximize/minimize functionality
|
|
||||||
const toggleMaximize = () => {
|
|
||||||
if (!isMaximized) {
|
|
||||||
if (containerRef.current) {
|
|
||||||
setContainerHeight(containerRef.current.scrollHeight || containerHeight);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
setIsMaximized(!isMaximized);
|
|
||||||
};
|
|
||||||
|
|
||||||
// Keyboard shortcuts
|
|
||||||
useEffect(() => {
|
|
||||||
const handleKeyDown = (e: KeyboardEvent) => {
|
|
||||||
if (e.key === 'Escape' && isMaximized) {
|
|
||||||
setIsMaximized(false);
|
|
||||||
return;
|
|
||||||
}
|
|
||||||
|
|
||||||
if (e.ctrlKey && e.shiftKey && !e.altKey && !e.metaKey) {
|
|
||||||
const num = parseInt(e.key);
|
|
||||||
if (num >= 1 && num <= 6) {
|
|
||||||
e.preventDefault();
|
|
||||||
formatText(`h${num}`);
|
|
||||||
return;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
if (e.ctrlKey && !e.shiftKey && !e.altKey && !e.metaKey) {
|
|
||||||
switch (e.key.toLowerCase()) {
|
|
||||||
case 'b':
|
|
||||||
e.preventDefault();
|
|
||||||
formatText('strong');
|
|
||||||
return;
|
|
||||||
case 'i':
|
|
||||||
e.preventDefault();
|
|
||||||
formatText('em');
|
|
||||||
return;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
};
|
|
||||||
|
|
||||||
document.addEventListener('keydown', handleKeyDown);
|
|
||||||
|
|
||||||
if (isMaximized) {
|
|
||||||
document.body.style.overflow = 'hidden';
|
|
||||||
} else {
|
|
||||||
document.body.style.overflow = '';
|
|
||||||
}
|
|
||||||
|
|
||||||
return () => {
|
|
||||||
document.removeEventListener('keydown', handleKeyDown);
|
|
||||||
document.body.style.overflow = '';
|
|
||||||
};
|
|
||||||
}, [isMaximized, formatText]);
|
|
||||||
|
|
||||||
// Cleanup
|
|
||||||
useEffect(() => {
|
|
||||||
return () => {
|
|
||||||
if (imageProcessingTimeoutRef.current) {
|
|
||||||
clearTimeout(imageProcessingTimeoutRef.current);
|
|
||||||
}
|
|
||||||
};
|
|
||||||
}, []);
|
|
||||||
|
|
||||||
// Custom components for Portable Text rendering
|
|
||||||
const portableTextComponents = {
|
|
||||||
types: {
|
|
||||||
image: ({ value }: { value: any }) => (
|
|
||||||
<div className="image-block my-4">
|
|
||||||
<img
|
|
||||||
src={value.src}
|
|
||||||
alt={value.alt || ''}
|
|
||||||
className="max-w-full h-auto"
|
|
||||||
loading="lazy"
|
|
||||||
/>
|
|
||||||
{value.caption && (
|
|
||||||
<p className="text-sm text-gray-600 mt-2 italic">{value.caption}</p>
|
|
||||||
)}
|
|
||||||
</div>
|
|
||||||
),
|
|
||||||
},
|
|
||||||
block: {
|
|
||||||
normal: ({ children }: any) => <p className="mb-2">{children}</p>,
|
|
||||||
h1: ({ children }: any) => <h1 className="text-3xl font-bold mb-4">{children}</h1>,
|
|
||||||
h2: ({ children }: any) => <h2 className="text-2xl font-bold mb-3">{children}</h2>,
|
|
||||||
h3: ({ children }: any) => <h3 className="text-xl font-bold mb-3">{children}</h3>,
|
|
||||||
h4: ({ children }: any) => <h4 className="text-lg font-bold mb-2">{children}</h4>,
|
|
||||||
h5: ({ children }: any) => <h5 className="text-base font-bold mb-2">{children}</h5>,
|
|
||||||
h6: ({ children }: any) => <h6 className="text-sm font-bold mb-2">{children}</h6>,
|
|
||||||
blockquote: ({ children }: any) => (
|
|
||||||
<blockquote className="border-l-4 border-gray-300 pl-4 italic my-4">{children}</blockquote>
|
|
||||||
),
|
|
||||||
},
|
|
||||||
marks: {
|
|
||||||
strong: ({ children }: any) => <strong>{children}</strong>,
|
|
||||||
em: ({ children }: any) => <em>{children}</em>,
|
|
||||||
underline: ({ children }: any) => <u>{children}</u>,
|
|
||||||
strike: ({ children }: any) => <s>{children}</s>,
|
|
||||||
code: ({ children }: any) => (
|
|
||||||
<code className="bg-gray-100 px-1 py-0.5 rounded text-sm font-mono">{children}</code>
|
|
||||||
),
|
|
||||||
},
|
|
||||||
};

  return (
    <div className="space-y-2">
      {/* Toolbar */}
      <div className="flex items-center justify-between p-2 theme-card border theme-border rounded-t-lg">
        <div className="flex items-center gap-2">
          <div className="text-xs bg-green-100 text-green-800 px-2 py-1 rounded">
            ✨ Portable Text Editor
          </div>
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => setViewMode('visual')}
            className={viewMode === 'visual' ? 'theme-accent-bg text-white' : ''}
          >
            Visual
          </Button>
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => setViewMode('html')}
            className={viewMode === 'html' ? 'theme-accent-bg text-white' : ''}
          >
            HTML
          </Button>
        </div>

        <div className="flex items-center gap-1">
          {/* Image processing status */}
          {enableImageProcessing && (
            <>
              {imageProcessingQueue.length > 0 && (
                <div className="flex items-center gap-1 text-xs text-blue-600 dark:text-blue-400 mr-2">
                  <div className="animate-spin h-3 w-3 border-2 border-blue-600 border-t-transparent rounded-full"></div>
                  <span>Processing {imageProcessingQueue.length} image{imageProcessingQueue.length > 1 ? 's' : ''}...</span>
                </div>
              )}
              {imageWarnings.length > 0 && (
                <div className="flex items-center gap-1 text-xs text-orange-600 dark:text-orange-400 mr-2" title={imageWarnings.join('\n')}>
                  <span>⚠️</span>
                  <span>{imageWarnings.length} warning{imageWarnings.length > 1 ? 's' : ''}</span>
                </div>
              )}
            </>
          )}

          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={toggleMaximize}
            title={isMaximized ? "Minimize editor" : "Maximize editor"}
            className="font-mono"
          >
            {isMaximized ? "⊡" : "⊞"}
          </Button>
          <div className="w-px h-4 bg-gray-300 mx-1" />
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => formatText('strong')}
            title="Bold (Ctrl+B)"
            className="font-bold"
          >
            B
          </Button>
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => formatText('em')}
            title="Italic (Ctrl+I)"
            className="italic"
          >
            I
          </Button>
          <div className="w-px h-4 bg-gray-300 mx-1" />
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => formatText('h1')}
            title="Heading 1 (Ctrl+Shift+1)"
            className="text-lg font-bold"
          >
            H1
          </Button>
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => formatText('h2')}
            title="Heading 2 (Ctrl+Shift+2)"
            className="text-base font-bold"
          >
            H2
          </Button>
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => formatText('h3')}
            title="Heading 3 (Ctrl+Shift+3)"
            className="text-sm font-bold"
          >
            H3
          </Button>
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => formatText('h4')}
            title="Heading 4 (Ctrl+Shift+4)"
            className="text-xs font-bold"
          >
            H4
          </Button>
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => formatText('h5')}
            title="Heading 5 (Ctrl+Shift+5)"
            className="text-xs font-bold"
          >
            H5
          </Button>
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => formatText('h6')}
            title="Heading 6 (Ctrl+Shift+6)"
            className="text-xs font-bold"
          >
            H6
          </Button>
          <div className="w-px h-4 bg-gray-300 mx-1" />
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => formatText('p')}
            title="Paragraph"
          >
            P
          </Button>
        </div>
      </div>

      {/* Editor */}
      <div
        className={`relative border theme-border rounded-b-lg ${
          isMaximized ? 'fixed inset-4 z-50 bg-white dark:bg-gray-900 shadow-2xl' : ''
        }`}
        style={isMaximized ? {} : { height: containerHeight }}
      >
        <div
          ref={containerRef}
          className="h-full flex flex-col overflow-hidden"
        >
          {/* Editor content */}
          <div className="flex-1 overflow-hidden">
            {viewMode === 'visual' ? (
              <div className="relative h-full">
                <div
                  ref={editableRef}
                  contentEditable
                  onInput={handleVisualContentChange}
                  onPaste={handlePaste}
                  className="p-3 h-full overflow-y-auto focus:outline-none focus:ring-0 resize-none"
                  suppressContentEditableWarning={true}
                >
                  <PortableText
                    value={portableTextValue}
                    components={portableTextComponents}
                  />
                </div>
                {(!portableTextValue || portableTextValue.length === 0 ||
                  (portableTextValue.length === 1 && !portableTextValue[0])) && (
                  <div className="absolute top-3 left-3 text-gray-500 dark:text-gray-400 pointer-events-none select-none">
                    {placeholder}
                  </div>
                )}
              </div>
            ) : (
              <Textarea
                value={htmlValue}
                onChange={handleHtmlChange}
                placeholder="<p>Write your HTML content here...</p>"
                className="border-0 rounded-none focus:ring-0 font-mono text-sm h-full resize-none"
              />
            )}
          </div>
        </div>
      </div>

      {/* Preview for HTML mode */}
      {viewMode === 'html' && htmlValue && !isMaximized && (
        <div className="space-y-2">
          <h4 className="text-sm font-medium theme-header">Preview:</h4>
          <div className="p-4 border theme-border rounded-lg theme-card max-h-40 overflow-y-auto">
            <PortableText
              value={portableTextValue}
              components={portableTextComponents}
            />
          </div>
        </div>
      )}

      {error && (
        <p className="text-sm text-red-600 dark:text-red-400">{error}</p>
      )}

      <div className="text-xs theme-text">
        <p>
          <strong>Visual mode:</strong> Structured content editor with rich formatting.
          Paste content from websites and it will be converted to structured format.
        </p>
        <p>
          <strong>HTML mode:</strong> Edit HTML source directly for advanced formatting.
          Content is automatically sanitized for security.
        </p>
        <p>
          <strong>Keyboard shortcuts:</strong> Ctrl+B (Bold), Ctrl+I (Italic), Ctrl+Shift+1-6 (Headings 1-6).
        </p>
      </div>
    </div>
  );
}
@@ -1,671 +0,0 @@
'use client';

import React, { useState, useEffect, useCallback, useRef } from 'react';
import {
  EditorProvider,
  PortableTextEditable,
  useEditor,
  type PortableTextBlock,
  type RenderDecoratorFunction,
  type RenderStyleFunction,
  type RenderBlockFunction,
  type RenderListItemFunction,
  type RenderAnnotationFunction
} from '@portabletext/editor';
import { PortableText } from '@portabletext/react';
import Button from '../ui/Button';
import { sanitizeHtmlSync } from '../../lib/sanitization';
import { editorSchema } from '../../lib/portabletext/editorSchema';

interface PortableTextEditorProps {
  value: string; // HTML value for compatibility - will be converted
  onChange: (value: string) => void; // Returns HTML for compatibility
  placeholder?: string;
  error?: string;
  storyId?: string;
  enableImageProcessing?: boolean;
}

// Conversion utilities
function htmlToPortableTextBlocks(html: string): PortableTextBlock[] {
  if (!html || html.trim() === '') {
    return [{ _type: 'block', _key: generateKey(), style: 'normal', markDefs: [], children: [{ _type: 'span', _key: generateKey(), text: '', marks: [] }] }];
  }

  // Basic HTML to Portable Text conversion
  // This is a simplified implementation - you could enhance this
  const sanitizedHtml = sanitizeHtmlSync(html);
  const parser = new DOMParser();
  const doc = parser.parseFromString(sanitizedHtml, 'text/html');

  const blocks: PortableTextBlock[] = [];
  const paragraphs = doc.querySelectorAll('p, h1, h2, h3, h4, h5, h6, blockquote, div');

  if (paragraphs.length === 0) {
    // Fallback: treat as single paragraph
    return [{
      _type: 'block',
      _key: generateKey(),
      style: 'normal',
      markDefs: [],
      children: [{
        _type: 'span',
        _key: generateKey(),
        text: doc.body.textContent || '',
        marks: []
      }]
    }];
  }

  // Process all elements in document order to maintain sequence
  const allElements = Array.from(doc.body.querySelectorAll('*'));
  const processedElements = new Set<Element>();

  for (const element of allElements) {
    // Skip if already processed
    if (processedElements.has(element)) continue;

    // Handle images
    if (element.tagName === 'IMG') {
      const img = element as HTMLImageElement;
      blocks.push({
        _type: 'image',
        _key: generateKey(),
        src: img.getAttribute('src') || '',
        alt: img.getAttribute('alt') || '',
        caption: img.getAttribute('title') || '',
        width: img.getAttribute('width') ? parseInt(img.getAttribute('width')!) : undefined,
        height: img.getAttribute('height') ? parseInt(img.getAttribute('height')!) : undefined,
      });
      processedElements.add(element);
      continue;
    }

    // Handle code blocks
    if ((element.tagName === 'CODE' && element.parentElement?.tagName === 'PRE') ||
        (element.tagName === 'PRE' && element.querySelector('code'))) {
      const codeEl = element.tagName === 'CODE' ? element : element.querySelector('code');
      if (codeEl) {
        const code = codeEl.textContent || '';
        const language = codeEl.getAttribute('class')?.replace('language-', '') || '';

        if (code.trim()) {
          blocks.push({
            _type: 'codeBlock',
            _key: generateKey(),
            code,
            language,
          });
          processedElements.add(element);
          if (element.tagName === 'PRE') processedElements.add(codeEl);
        }
      }
      continue;
    }

    // Handle text blocks (paragraphs, headings, etc.)
    if (['P', 'H1', 'H2', 'H3', 'H4', 'H5', 'H6', 'BLOCKQUOTE', 'DIV'].includes(element.tagName)) {
      // Skip if this contains already processed elements
      if (element.querySelector('img') || (element.querySelector('code') && element.querySelector('pre'))) {
        processedElements.add(element);
        continue;
      }

      const style = getStyleFromElement(element);
      const text = element.textContent || '';

      if (text.trim()) {
        blocks.push({
          _type: 'block',
          _key: generateKey(),
          style,
          markDefs: [],
          children: [{
            _type: 'span',
            _key: generateKey(),
            text,
            marks: []
          }]
        });
        processedElements.add(element);
      }
    }
  }

  return blocks.length > 0 ? blocks : [{
    _type: 'block',
    _key: generateKey(),
    style: 'normal',
    markDefs: [],
    children: [{
      _type: 'span',
      _key: generateKey(),
      text: '',
      marks: []
    }]
  }];
}

function portableTextToHtml(blocks: PortableTextBlock[]): string {
  if (!blocks || blocks.length === 0) return '';

  const htmlParts: string[] = [];

  blocks.forEach(block => {
    if (block._type === 'block' && Array.isArray(block.children)) {
      const tag = getHtmlTagFromStyle((block.style as string) || 'normal');
      const children = block.children as PortableTextChild[];
      const text = children
        .map(child => child._type === 'span' ? child.text || '' : '')
        .join('') || '';

      if (text.trim() || block.style !== 'normal') {
        htmlParts.push(`<${tag}>${text}</${tag}>`);
      }
    } else if (block._type === 'image' && isImageBlock(block)) {
      // Convert image blocks back to HTML
      const attrs: string[] = [];
      if (block.src) attrs.push(`src="${block.src}"`);
      if (block.alt) attrs.push(`alt="${block.alt}"`);
      if (block.caption) attrs.push(`title="${block.caption}"`);
      if (block.width) attrs.push(`width="${block.width}"`);
      if (block.height) attrs.push(`height="${block.height}"`);

      htmlParts.push(`<img ${attrs.join(' ')} />`);
    } else if (block._type === 'codeBlock' && isCodeBlock(block)) {
      // Convert code blocks back to HTML
      const langClass = block.language ? ` class="language-${block.language}"` : '';
      htmlParts.push(`<pre><code${langClass}>${block.code || ''}</code></pre>`);
    }
  });

  const html = htmlParts.join('\n');
  return sanitizeHtmlSync(html);
}

function getStyleFromElement(element: Element): string {
  const tagName = element.tagName.toLowerCase();
  const styleMap: Record<string, string> = {
    'p': 'normal',
    'div': 'normal',
    'h1': 'h1',
    'h2': 'h2',
    'h3': 'h3',
    'h4': 'h4',
    'h5': 'h5',
    'h6': 'h6',
    'blockquote': 'blockquote',
  };
  return styleMap[tagName] || 'normal';
}

function getHtmlTagFromStyle(style: string): string {
  const tagMap: Record<string, string> = {
    'normal': 'p',
    'h1': 'h1',
    'h2': 'h2',
    'h3': 'h3',
    'h4': 'h4',
    'h5': 'h5',
    'h6': 'h6',
    'blockquote': 'blockquote',
  };
  return tagMap[style] || 'p';
}

interface PortableTextChild {
  _type: string;
  _key: string;
  text?: string;
  marks?: string[];
}

// Type guards for custom block types
function isImageBlock(value: any): value is {
  _type: 'image';
  src?: string;
  alt?: string;
  caption?: string;
  width?: number;
  height?: number;
} {
  return value && typeof value === 'object' && value._type === 'image';
}

function isCodeBlock(value: any): value is {
  _type: 'codeBlock';
  code?: string;
  language?: string;
} {
  return value && typeof value === 'object' && value._type === 'codeBlock';
}

function generateKey(): string {
  return Math.random().toString(36).substring(2, 11);
}

// Toolbar component
function EditorToolbar({
  isScrollable,
  onToggleScrollable
}: {
  isScrollable: boolean;
  onToggleScrollable: () => void;
}) {
  const editor = useEditor();

  const toggleDecorator = (decorator: string) => {
    editor.send({ type: 'decorator.toggle', decorator });
  };

  const setStyle = (style: string) => {
    editor.send({ type: 'style.toggle', style });
  };

  return (
    <div className="flex items-center justify-between p-2 theme-card border theme-border rounded-t-lg">
      <div className="flex items-center gap-2">
        <div className="text-xs bg-blue-100 text-blue-800 px-2 py-1 rounded">
          ✨ Portable Text Editor
        </div>

        {/* Style buttons */}
        <div className="flex items-center gap-1 border-r pr-2 mr-2">
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => setStyle('normal')}
            title="Normal paragraph"
          >
            P
          </Button>
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => setStyle('h1')}
            title="Heading 1"
            className="text-lg font-bold"
          >
            H1
          </Button>
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => setStyle('h2')}
            title="Heading 2"
            className="text-base font-bold"
          >
            H2
          </Button>
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => setStyle('h3')}
            title="Heading 3"
            className="text-sm font-bold"
          >
            H3
          </Button>
        </div>

        {/* Decorator buttons */}
        <div className="flex items-center gap-1">
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => toggleDecorator('strong')}
            title="Bold (Ctrl+B)"
            className="font-bold"
          >
            B
          </Button>
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => toggleDecorator('em')}
            title="Italic (Ctrl+I)"
            className="italic"
          >
            I
          </Button>
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => toggleDecorator('underline')}
            title="Underline"
            className="underline"
          >
            U
          </Button>
          <Button
            type="button"
            size="sm"
            variant="ghost"
            onClick={() => toggleDecorator('strike')}
            title="Strike-through"
            className="line-through"
          >
            S
          </Button>
        </div>
      </div>

      {/* Scrollable toggle */}
      <div className="flex items-center gap-2">
        <span className="text-xs theme-text">Scrollable:</span>
        <Button
          type="button"
          size="sm"
          variant="ghost"
          onClick={onToggleScrollable}
          className={isScrollable ? 'theme-accent-bg text-white' : ''}
          title={isScrollable ? 'Switch to auto-expand mode' : 'Switch to scrollable mode'}
        >
          {isScrollable ? '📜' : '📏'}
        </Button>
      </div>
    </div>
  );
}

// Simple component that uses Portable Text editor directly
function EditorContent({
  value,
  onChange,
  placeholder,
  error
}: {
  value: string;
  onChange: (value: string) => void;
  placeholder?: string;
  error?: string;
}) {
  const [portableTextValue, setPortableTextValue] = useState<PortableTextBlock[]>(() =>
    htmlToPortableTextBlocks(value)
  );
  const [isScrollable, setIsScrollable] = useState(true); // Default to scrollable

  // Sync HTML value with prop changes
  useEffect(() => {
    console.log('🔄 Editor value changed:', { valueLength: value?.length, valuePreview: value?.substring(0, 100) });
    setPortableTextValue(htmlToPortableTextBlocks(value));
  }, [value]);

  // Debug: log when portableTextValue changes
  useEffect(() => {
    console.log('📝 Portable text blocks updated:', { blockCount: portableTextValue.length, blocks: portableTextValue });
  }, [portableTextValue]);

  // Add a ref to the editor container for direct paste handling
  const editorContainerRef = useRef<HTMLDivElement>(null);

  // Global paste event listener to catch ALL paste events
  useEffect(() => {
    const handleGlobalPaste = (event: ClipboardEvent) => {
      console.log('🌍 Global paste event captured');

      // Check if the paste is happening within our editor
      const target = event.target as Element;
      const isInEditor = editorContainerRef.current?.contains(target);

      console.log('📋 Paste details:', {
        isInEditor,
        targetTag: target?.tagName,
        targetClasses: target?.className,
        hasClipboardData: !!event.clipboardData
      });

      if (isInEditor && event.clipboardData) {
        const htmlData = event.clipboardData.getData('text/html');
        const textData = event.clipboardData.getData('text/plain');

        console.log('📋 Clipboard contents:', {
          htmlLength: htmlData.length,
          textLength: textData.length,
          hasImages: htmlData.includes('<img'),
          htmlPreview: htmlData.substring(0, 300)
        });

        if (htmlData && htmlData.includes('<img')) {
          console.log('📋 Images detected in paste! Attempting to process...');

          // Prevent default paste to handle it completely ourselves
          event.preventDefault();
          event.stopPropagation();

          // Convert the pasted HTML to our blocks maintaining order
          const pastedBlocks = htmlToPortableTextBlocks(htmlData);

          console.log('📋 Converted blocks:', pastedBlocks.map(block => ({
            type: block._type,
            key: block._key,
            ...(block._type === 'image' ? { src: (block as any).src, alt: (block as any).alt } : {}),
            ...(block._type === 'block' ? { style: (block as any).style, text: (block as any).children?.[0]?.text?.substring(0, 50) } : {})
          })));

          if (pastedBlocks.length > 0) {
            // Insert the blocks at the end of current content (maintaining order within the paste)
            setTimeout(() => {
              setPortableTextValue(prev => {
                const updatedBlocks = [...prev, ...pastedBlocks];
                const html = portableTextToHtml(updatedBlocks);
                onChange(html);
                console.log('📋 Added structured blocks maintaining order:', { pastedCount: pastedBlocks.length, totalBlocks: updatedBlocks.length });
                return updatedBlocks;
              });
            }, 10);
          }
        }
      }
    };

    // Add global event listener with capture phase to catch events early
    document.addEventListener('paste', handleGlobalPaste, true);

    return () => {
      document.removeEventListener('paste', handleGlobalPaste, true);
    };
  }, [onChange]);

  // Handle paste events directly on the editor container (backup approach)
  const handleContainerPaste = useCallback((_event: React.ClipboardEvent) => {
    console.log('📦 Container paste handler triggered');
    // This might not be reached if global handler prevents default
  }, []);

  // Render functions for the editor
  const renderStyle: RenderStyleFunction = useCallback((props) => {
    const { schemaType, children } = props;

    switch (schemaType.value) {
      case 'h1':
        return <h1 className="text-3xl font-bold mb-4">{children}</h1>;
      case 'h2':
        return <h2 className="text-2xl font-bold mb-3">{children}</h2>;
      case 'h3':
        return <h3 className="text-xl font-bold mb-3">{children}</h3>;
      case 'h4':
        return <h4 className="text-lg font-bold mb-2">{children}</h4>;
      case 'h5':
        return <h5 className="text-base font-bold mb-2">{children}</h5>;
      case 'h6':
        return <h6 className="text-sm font-bold mb-2">{children}</h6>;
      case 'blockquote':
        return <blockquote className="border-l-4 border-gray-300 pl-4 italic my-4">{children}</blockquote>;
      default:
        return <p className="mb-2">{children}</p>;
    }
  }, []);

  const renderDecorator: RenderDecoratorFunction = useCallback((props) => {
    const { schemaType, children } = props;

    switch (schemaType.value) {
      case 'strong':
        return <strong>{children}</strong>;
      case 'em':
        return <em>{children}</em>;
      case 'underline':
        return <u>{children}</u>;
      case 'strike':
        return <s>{children}</s>;
      case 'code':
        return <code className="bg-gray-100 px-1 py-0.5 rounded text-sm font-mono">{children}</code>;
      default:
        return <>{children}</>;
    }
  }, []);

  const renderBlock: RenderBlockFunction = useCallback((props) => {
    const { schemaType, value, children } = props;

    console.log('🎨 Rendering block:', { schemaType: schemaType.name, valueType: value?._type, value });

    // Handle image blocks
    if (schemaType.name === 'image' && isImageBlock(value)) {
      console.log('🖼️ Rendering image block:', value);
      return (
        <div className="my-4 p-3 border border-dashed border-gray-300 rounded-lg bg-gray-50">
          <div className="flex items-center gap-2 mb-2">
            <span className="text-lg">🖼️</span>
            <span className="font-medium text-gray-700">Image Block</span>
          </div>
          <div className="text-sm text-gray-600 space-y-1">
            <p><strong>Source:</strong> {value.src || 'No source'}</p>
            {value.alt && <p><strong>Alt text:</strong> {value.alt}</p>}
            {value.caption && <p><strong>Caption:</strong> {value.caption}</p>}
            {(value.width || value.height) && (
              <p><strong>Dimensions:</strong> {value.width || '?'} × {value.height || '?'}</p>
            )}
          </div>
        </div>
      );
    }

    // Handle code blocks
    if (schemaType.name === 'codeBlock' && isCodeBlock(value)) {
      return (
        <div className="my-4 p-3 border border-dashed border-blue-300 rounded-lg bg-blue-50">
          <div className="flex items-center gap-2 mb-2">
            <span className="text-lg">💻</span>
            <span className="font-medium text-blue-700">Code Block</span>
            {value.language && (
              <span className="text-xs bg-blue-200 text-blue-800 px-2 py-1 rounded">
                {value.language}
              </span>
            )}
          </div>
          <pre className="text-sm text-gray-800 bg-white p-2 rounded border overflow-x-auto">
            <code>{value.code || '// No code'}</code>
          </pre>
        </div>
      );
    }

    // Default block rendering
    return <div>{children}</div>;
  }, []);

  const renderListItem: RenderListItemFunction = useCallback((props) => {
    return <li>{props.children}</li>;
  }, []);

  const renderAnnotation: RenderAnnotationFunction = useCallback((props) => {
    const { schemaType, children, value } = props;

    if (schemaType.name === 'link' && value && typeof value === 'object') {
      const linkValue = value as { href?: string; target?: string; title?: string };
      return (
        <a
          href={linkValue.href}
          target={linkValue.target || '_self'}
          title={linkValue.title}
          className="text-blue-600 hover:text-blue-800 underline"
        >
          {children}
        </a>
      );
    }

    return <>{children}</>;
  }, []);

  return (
    <div className="space-y-2">
      <EditorProvider
        key={`editor-${portableTextValue.length}-${Date.now()}`}
        initialConfig={{
          schemaDefinition: editorSchema,
          initialValue: portableTextValue,
        }}
      >
        <EditorToolbar
          isScrollable={isScrollable}
          onToggleScrollable={() => setIsScrollable(!isScrollable)}
        />
        <div
          ref={editorContainerRef}
          className="border theme-border rounded-b-lg overflow-hidden"
          onPaste={handleContainerPaste}
        >
          <PortableTextEditable
            className={`p-3 focus:outline-none focus:ring-0 resize-none ${
              isScrollable
                ? 'h-[400px] overflow-y-auto'
                : 'min-h-[300px]'
            }`}
            placeholder={placeholder}
            renderStyle={renderStyle}
            renderDecorator={renderDecorator}
            renderBlock={renderBlock}
            renderListItem={renderListItem}
            renderAnnotation={renderAnnotation}
          />
        </div>
      </EditorProvider>

      {error && (
        <p className="text-sm text-red-600 dark:text-red-400">{error}</p>
      )}

      <div className="text-xs theme-text">
        <p>
          <strong>Portable Text Editor:</strong> Rich text editor with structured content.
          {isScrollable ? ' Fixed height with scrolling.' : ' Auto-expanding height.'}
          📋 Paste detection active.
        </p>
      </div>
    </div>
  );
}

export default function PortableTextEditorNew({
  value,
  onChange,
  placeholder = 'Write your story here...',
  error,
  storyId,
  enableImageProcessing = false
}: PortableTextEditorProps) {
  console.log('🎯 Portable Text Editor loaded!', {
    valueLength: value?.length,
    enableImageProcessing,
    hasStoryId: !!storyId
  });

  return (
    <EditorContent
      value={value}
      onChange={onChange}
      placeholder={placeholder}
      error={error}
    />
  );
}
892
frontend/src/components/stories/SlateEditor.tsx
Normal file
@@ -0,0 +1,892 @@
'use client';

import React, { useState, useCallback, useMemo } from 'react';
import {
  createEditor,
  Descendant,
  Element as SlateElement,
  Node as SlateNode,
  Transforms,
  Editor,
  Range
} from 'slate';
import {
  Slate,
  Editable,
  withReact,
  ReactEditor,
  RenderElementProps,
  RenderLeafProps,
  useSlate as useEditor
} from 'slate-react';
import { withHistory } from 'slate-history';
import Button from '../ui/Button';
import { sanitizeHtmlSync } from '../../lib/sanitization';
import { debug } from '../../lib/debug';

interface SlateEditorProps {
  value: string; // HTML value for compatibility with existing code
  onChange: (value: string) => void; // Returns HTML for compatibility
  placeholder?: string;
  error?: string;
  storyId?: string;
  enableImageProcessing?: boolean;
}

// Custom types for our editor
type CustomElement = {
  type: 'paragraph' | 'heading-one' | 'heading-two' | 'heading-three' | 'blockquote' | 'image' | 'code-block';
  children: CustomText[];
  src?: string; // for images
  alt?: string; // for images
  caption?: string; // for images
  language?: string; // for code blocks
};

type CustomText = {
  text: string;
  bold?: boolean;
  italic?: boolean;
  underline?: boolean;
  strikethrough?: boolean;
  code?: boolean;
};

declare module 'slate' {
  interface CustomTypes {
    Editor: ReactEditor;
    Element: CustomElement;
    Text: CustomText;
  }
}

// HTML to Slate conversion - preserves mixed content order
const htmlToSlate = (html: string): Descendant[] => {
  if (!html || html.trim() === '') {
    return [{ type: 'paragraph', children: [{ text: '' }] }];
  }

  const sanitizedHtml = sanitizeHtmlSync(html);
  const parser = new DOMParser();
  const doc = parser.parseFromString(sanitizedHtml, 'text/html');

  const nodes: Descendant[] = [];

  // Process all nodes in document order to maintain sequence
  const processChildNodes = (parentNode: Node): Descendant[] => {
    const results: Descendant[] = [];

    Array.from(parentNode.childNodes).forEach(node => {
      if (node.nodeType === Node.ELEMENT_NODE) {
        const element = node as Element;

        switch (element.tagName.toLowerCase()) {
          case 'h1':
            results.push({
              type: 'heading-one',
              children: [{ text: element.textContent || '' }]
            });
            break;
          case 'h2':
            results.push({
              type: 'heading-two',
              children: [{ text: element.textContent || '' }]
            });
            break;
          case 'h3':
            results.push({
              type: 'heading-three',
              children: [{ text: element.textContent || '' }]
            });
            break;
          case 'blockquote':
            results.push({
              type: 'blockquote',
              children: [{ text: element.textContent || '' }]
            });
            break;
          case 'img': {
            const img = element as HTMLImageElement;
            results.push({
              type: 'image',
              src: img.src || img.getAttribute('src') || '',
              alt: img.alt || img.getAttribute('alt') || '',
              caption: img.title || img.getAttribute('title') || '',
              children: [{ text: '' }] // Images need children in Slate
            });
            break;
          }
          case 'pre': {
            const codeEl = element.querySelector('code');
            const code = codeEl ? codeEl.textContent || '' : element.textContent || '';
            const language = codeEl?.className?.replace('language-', '') || '';
            results.push({
              type: 'code-block',
              language,
              children: [{ text: code }]
            });
            break;
          }
          case 'p':
          case 'div': {
            // Check if this paragraph contains mixed content (text + images)
            if (element.querySelector('img')) {
              // Process mixed content - handle both text and images in order
              results.push(...processChildNodes(element));
            } else {
              const text = element.textContent || '';
              if (text.trim()) {
                results.push({
                  type: 'paragraph',
                  children: [{ text }]
                });
              }
            }
            break;
          }
          case 'br':
            // Handle line breaks by creating empty paragraphs
            results.push({
              type: 'paragraph',
              children: [{ text: '' }]
            });
            break;
          default: {
            // For other elements, try to extract text or recurse
            const text = element.textContent || '';
            if (text.trim()) {
              results.push({
                type: 'paragraph',
                children: [{ text }]
              });
            }
            break;
          }
        }
      } else if (node.nodeType === Node.TEXT_NODE) {
        const text = node.textContent || '';
        if (text.trim()) {
          results.push({
            type: 'paragraph',
            children: [{ text: text.trim() }]
          });
        }
      }
    });

    return results;
  };

  // Process all content
  nodes.push(...processChildNodes(doc.body));

  // Fallback for simple text content
  if (nodes.length === 0 && doc.body.textContent?.trim()) {
    const text = doc.body.textContent.trim();
    const lines = text.split('\n').filter(line => line.trim());
    lines.forEach(line => {
      nodes.push({
        type: 'paragraph',
        children: [{ text: line.trim() }]
      });
    });
  }

  return nodes.length > 0 ? nodes : [{ type: 'paragraph', children: [{ text: '' }] }];
};

// Slate to HTML conversion
const slateToHtml = (nodes: Descendant[]): string => {
  const htmlParts: string[] = [];

  nodes.forEach(node => {
    if (SlateElement.isElement(node)) {
      const element = node as CustomElement;
      const text = SlateNode.string(node);

      switch (element.type) {
        case 'heading-one':
          htmlParts.push(`<h1>${text}</h1>`);
          break;
        case 'heading-two':
          htmlParts.push(`<h2>${text}</h2>`);
          break;
        case 'heading-three':
          htmlParts.push(`<h3>${text}</h3>`);
          break;
        case 'blockquote':
          htmlParts.push(`<blockquote>${text}</blockquote>`);
          break;
        case 'image': {
          const attrs: string[] = [];
          if (element.src) attrs.push(`src="${element.src}"`);
          if (element.alt) attrs.push(`alt="${element.alt}"`);
          if (element.caption) attrs.push(`title="${element.caption}"`);
          htmlParts.push(`<img ${attrs.join(' ')} />`);
          break;
        }
        case 'code-block': {
          const langClass = element.language ? ` class="language-${element.language}"` : '';
          const escapedText = text
            .replace(/&/g, '&amp;')
            .replace(/</g, '&lt;')
            .replace(/>/g, '&gt;')
            .replace(/"/g, '&quot;')
            .replace(/'/g, '&#39;');
          htmlParts.push(`<pre><code${langClass}>${escapedText}</code></pre>`);
          break;
        }
        case 'paragraph':
        default:
          htmlParts.push(text ? `<p>${text}</p>` : '<p></p>');
          break;
      }
    }
  });

  const html = htmlParts.join('\n');
  return sanitizeHtmlSync(html);
};

// Custom plugin to handle images
const withImages = (editor: ReactEditor) => {
  const { insertData, isVoid } = editor;

  editor.isVoid = element => {
    return element.type === 'image' ? true : isVoid(element);
  };

  editor.insertData = (data) => {
    const html = data.getData('text/html');

    if (html && html.includes('<img')) {
      debug.log('📋 Image paste detected in Slate editor');

      // Convert HTML to Slate nodes maintaining order
      const slateNodes = htmlToSlate(html);

      // Insert all nodes in sequence
      slateNodes.forEach(node => {
        Transforms.insertNodes(editor, node);
      });

      debug.log(`📋 Inserted ${slateNodes.length} nodes from pasted HTML`);
      return;
    }

    insertData(data);
  };

  return editor;
};
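
// Note (added for clarity): marking 'image' as void tells Slate the node has no editable
// text of its own, so selection, rendering and deletion treat it as a single unit. The
// insertData override is what routes pasted HTML containing <img> tags through htmlToSlate;
// all other pastes fall through to the editor's original insertData behaviour.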

// Interactive Image Component
const ImageElement = ({ attributes, element, children }: {
  attributes: any;
  element: CustomElement;
  children: React.ReactNode;
}) => {
  const editor = useEditor();
  const [isEditing, setIsEditing] = useState(false);
  const [editUrl, setEditUrl] = useState(element.src || '');
  const [editAlt, setEditAlt] = useState(element.alt || '');
  const [editCaption, setEditCaption] = useState(element.caption || '');

  const handleDelete = () => {
    const path = ReactEditor.findPath(editor, element);
    Transforms.removeNodes(editor, { at: path });
  };

  const handleSave = () => {
    const path = ReactEditor.findPath(editor, element);
    const newProperties: Partial<CustomElement> = {
      src: editUrl,
      alt: editAlt,
      caption: editCaption,
    };
    Transforms.setNodes(editor, newProperties, { at: path });
    setIsEditing(false);
  };

  const handleCancel = () => {
    setEditUrl(element.src || '');
    setEditAlt(element.alt || '');
    setEditCaption(element.caption || '');
    setIsEditing(false);
  };

  if (isEditing) {
    return (
      <div {...attributes} contentEditable={false} className="my-4">
        <div className="border border-blue-300 rounded-lg p-4 bg-blue-50">
          <h4 className="font-medium text-blue-900 mb-3">Edit Image</h4>
          <div className="space-y-3">
            <div>
              <label className="block text-sm font-medium text-blue-800 mb-1">
                Image URL *
              </label>
              <input
                type="url"
                value={editUrl}
                onChange={(e) => setEditUrl(e.target.value)}
                className="w-full px-3 py-2 border border-blue-300 rounded-md focus:ring-2 focus:ring-blue-500 focus:border-blue-500"
                placeholder="https://example.com/image.jpg"
              />
            </div>
            <div>
              <label className="block text-sm font-medium text-blue-800 mb-1">
                Alt Text
              </label>
              <input
                type="text"
                value={editAlt}
                onChange={(e) => setEditAlt(e.target.value)}
                className="w-full px-3 py-2 border border-blue-300 rounded-md focus:ring-2 focus:ring-blue-500 focus:border-blue-500"
                placeholder="Describe the image"
              />
            </div>
            <div>
              <label className="block text-sm font-medium text-blue-800 mb-1">
                Caption
              </label>
              <input
                type="text"
                value={editCaption}
                onChange={(e) => setEditCaption(e.target.value)}
                className="w-full px-3 py-2 border border-blue-300 rounded-md focus:ring-2 focus:ring-blue-500 focus:border-blue-500"
                placeholder="Image caption"
              />
            </div>
          </div>
          <div className="flex gap-2 mt-4">
            <button
              onClick={handleSave}
              className="px-3 py-1 bg-blue-600 text-white text-sm rounded hover:bg-blue-700 focus:ring-2 focus:ring-blue-500"
            >
              Save
            </button>
            <button
              onClick={handleCancel}
              className="px-3 py-1 bg-gray-300 text-gray-700 text-sm rounded hover:bg-gray-400 focus:ring-2 focus:ring-gray-500"
            >
              Cancel
            </button>
          </div>
        </div>
        {children}
      </div>
    );
  }

  return (
    <div {...attributes} contentEditable={false} className="my-4">
      <div
        className="relative border border-gray-200 rounded-lg overflow-hidden bg-white shadow-sm group hover:shadow-md transition-shadow focus-within:ring-2 focus-within:ring-blue-500 focus-within:border-blue-500"
        tabIndex={0}
        onKeyDown={(event) => {
          // Handle delete/backspace on focused image
          if (event.key === 'Delete' || event.key === 'Backspace') {
            event.preventDefault();
            handleDelete();
          }
          // Handle Enter to edit
          if (event.key === 'Enter') {
            event.preventDefault();
            setIsEditing(true);
          }
        }}
        onClick={() => {
          // Focus the image element when clicked
          const path = ReactEditor.findPath(editor, element);
          const start = Editor.start(editor, path);
          Transforms.select(editor, start);
        }}
      >
        {/* Control buttons - show on hover */}
        <div className="absolute top-2 left-2 opacity-0 group-hover:opacity-100 transition-opacity z-10">
          <div className="flex gap-1">
            <button
              onClick={() => setIsEditing(true)}
              className="p-1 bg-white rounded-full shadow-sm hover:bg-blue-50 border border-gray-200 text-blue-600 hover:text-blue-700"
              title="Edit image"
            >
              <svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M11 5H6a2 2 0 00-2 2v11a2 2 0 002 2h11a2 2 0 002-2v-5m-1.414-9.414a2 2 0 112.828 2.828L11.828 15H9v-2.828l8.586-8.586z" />
              </svg>
            </button>
            <button
              onClick={handleDelete}
              className="p-1 bg-white rounded-full shadow-sm hover:bg-red-50 border border-gray-200 text-red-600 hover:text-red-700"
              title="Delete image"
            >
              <svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
                <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16" />
              </svg>
            </button>
          </div>
        </div>

        {element.src ? (
          <>
            <img
              src={element.src}
              alt={element.alt || ''}
              className="w-full h-auto max-h-96 object-contain cursor-pointer"
              onDoubleClick={() => setIsEditing(true)}
              onError={(e) => {
                // Fallback to text block if image fails to load
                const target = e.target as HTMLImageElement;
                const parent = target.parentElement;
                if (parent) {
                  parent.innerHTML = `
                    <div class="p-3 border border-dashed border-red-300 rounded-lg bg-red-50">
                      <div class="flex items-center gap-2 mb-2">
                        <span class="text-lg">⚠️</span>
                        <span class="font-medium text-red-700">Image failed to load</span>
                      </div>
                      <div class="text-sm text-red-600 space-y-1">
                        <p><strong>Source:</strong> ${element.src}</p>
                        ${element.alt ? `<p><strong>Alt:</strong> ${element.alt}</p>` : ''}
                        ${element.caption ? `<p><strong>Caption:</strong> ${element.caption}</p>` : ''}
                      </div>
                    </div>
                  `;
                }
              }}
            />
            {(element.alt || element.caption) && (
              <div className="p-2 bg-gray-50 border-t border-gray-200">
                <div className="text-sm text-gray-600">
                  {element.caption && (
                    <p className="font-medium">{element.caption}</p>
                  )}
                  {element.alt && element.alt !== element.caption && (
                    <p className="italic">{element.alt}</p>
                  )}
                </div>
              </div>
            )}

            {/* External image indicator */}
            {element.src.startsWith('http') && (
              <div className="absolute top-2 right-2">
                <div className="bg-blue-100 text-blue-800 text-xs px-2 py-1 rounded-full flex items-center gap-1">
                  <span>🌐</span>
                  <span>External</span>
                </div>
              </div>
            )}
          </>
        ) : (
          <div className="p-3 border border-dashed border-gray-300 rounded-lg bg-gray-50">
            <div className="flex items-center gap-2 mb-2">
              <span className="text-lg">🖼️</span>
              <span className="font-medium text-gray-700">Image (No Source)</span>
            </div>
            <div className="text-sm text-gray-600 space-y-1">
              {element.alt && <p><strong>Alt:</strong> {element.alt}</p>}
              {element.caption && <p><strong>Caption:</strong> {element.caption}</p>}
            </div>
          </div>
        )}
      </div>
      {children}
    </div>
  );
};

// Component for rendering elements
const Element = ({ attributes, children, element }: RenderElementProps) => {
  const customElement = element as CustomElement;

  switch (customElement.type) {
    case 'heading-one':
      return <h1 {...attributes} className="text-3xl font-bold mb-4">{children}</h1>;
    case 'heading-two':
      return <h2 {...attributes} className="text-2xl font-bold mb-3">{children}</h2>;
    case 'heading-three':
      return <h3 {...attributes} className="text-xl font-bold mb-3">{children}</h3>;
    case 'blockquote':
      return <blockquote {...attributes} className="border-l-4 border-gray-300 pl-4 italic my-4">{children}</blockquote>;
    case 'image':
      return (
        <ImageElement
          attributes={attributes}
          element={customElement}
          children={children}
        />
      );
    case 'code-block':
      return (
        <pre {...attributes} className="my-4 p-3 bg-gray-100 rounded-lg overflow-x-auto">
          <code className="text-sm font-mono">{children}</code>
        </pre>
      );
    default:
      return <p {...attributes} className="mb-2">{children}</p>;
  }
};

// Component for rendering leaves (text formatting)
const Leaf = ({ attributes, children, leaf }: RenderLeafProps) => {
  const customLeaf = leaf as CustomText;

  if (customLeaf.bold) {
    children = <strong>{children}</strong>;
  }

  if (customLeaf.italic) {
    children = <em>{children}</em>;
  }

  if (customLeaf.underline) {
    children = <u>{children}</u>;
  }

  if (customLeaf.strikethrough) {
    children = <s>{children}</s>;
  }

  if (customLeaf.code) {
    children = <code className="bg-gray-100 px-1 py-0.5 rounded text-sm font-mono">{children}</code>;
  }

  return <span {...attributes}>{children}</span>;
};

// Toolbar component
const Toolbar = ({ editor }: { editor: ReactEditor }) => {
  type MarkFormat = 'bold' | 'italic' | 'underline' | 'strikethrough' | 'code';

  const isMarkActive = (format: MarkFormat) => {
    const marks = Editor.marks(editor);
    return marks ? marks[format] === true : false;
  };

  const toggleMark = (format: MarkFormat) => {
    const isActive = isMarkActive(format);
    if (isActive) {
      Editor.removeMark(editor, format);
    } else {
      Editor.addMark(editor, format, true);
    }
  };

  const isBlockActive = (format: CustomElement['type']) => {
    const { selection } = editor;
    if (!selection) return false;

    const [match] = Array.from(
      Editor.nodes(editor, {
        at: Editor.unhangRange(editor, selection),
        match: n =>
          !Editor.isEditor(n) &&
          SlateElement.isElement(n) &&
          n.type === format,
      })
    );

    return !!match;
  };

  const toggleBlock = (format: CustomElement['type']) => {
    const isActive = isBlockActive(format);

    Transforms.setNodes(
      editor,
      { type: isActive ? 'paragraph' : format },
      { match: n => SlateElement.isElement(n) && Editor.isBlock(editor, n) }
    );
  };

  const insertImage = () => {
    const url = prompt('Enter image URL:', 'https://');
    if (url && url.trim() !== 'https://') {
      const imageNode: CustomElement = {
        type: 'image',
        src: url.trim(),
        alt: '',
        caption: '',
        children: [{ text: '' }],
      };

      Transforms.insertNodes(editor, imageNode);
      // Add a paragraph after the image
      Transforms.insertNodes(editor, {
        type: 'paragraph',
        children: [{ text: '' }],
      });
    }
  };

  return (
    <div className="flex items-center gap-2 p-2 theme-card border theme-border rounded-t-lg">
      <div className="text-xs bg-green-100 text-green-800 px-2 py-1 rounded">
        ✨ Slate.js Editor
      </div>

      {/* Block type buttons */}
      <div className="flex items-center gap-1 border-r pr-2 mr-2">
        <Button
          type="button"
          size="sm"
          variant="ghost"
          onClick={() => toggleBlock('paragraph')}
          className={isBlockActive('paragraph') ? 'theme-accent-bg text-white' : ''}
          title="Normal paragraph"
        >
          P
        </Button>
        <Button
          type="button"
          size="sm"
          variant="ghost"
          onClick={() => toggleBlock('heading-one')}
          className={`text-lg font-bold ${isBlockActive('heading-one') ? 'theme-accent-bg text-white' : ''}`}
          title="Heading 1"
        >
          H1
        </Button>
        <Button
          type="button"
          size="sm"
          variant="ghost"
          onClick={() => toggleBlock('heading-two')}
          className={`text-base font-bold ${isBlockActive('heading-two') ? 'theme-accent-bg text-white' : ''}`}
          title="Heading 2"
        >
          H2
        </Button>
        <Button
          type="button"
          size="sm"
          variant="ghost"
          onClick={() => toggleBlock('heading-three')}
          className={`text-sm font-bold ${isBlockActive('heading-three') ? 'theme-accent-bg text-white' : ''}`}
          title="Heading 3"
        >
          H3
        </Button>
      </div>

      {/* Text formatting buttons */}
      <div className="flex items-center gap-1">
        <Button
          type="button"
          size="sm"
          variant="ghost"
          onClick={() => toggleMark('bold')}
          className={`font-bold ${isMarkActive('bold') ? 'theme-accent-bg text-white' : ''}`}
          title="Bold (Ctrl+B)"
        >
          B
        </Button>
        <Button
          type="button"
          size="sm"
          variant="ghost"
          onClick={() => toggleMark('italic')}
          className={`italic ${isMarkActive('italic') ? 'theme-accent-bg text-white' : ''}`}
          title="Italic (Ctrl+I)"
        >
          I
        </Button>
        <Button
          type="button"
          size="sm"
          variant="ghost"
          onClick={() => toggleMark('underline')}
          className={`underline ${isMarkActive('underline') ? 'theme-accent-bg text-white' : ''}`}
          title="Underline"
        >
          U
        </Button>
        <Button
          type="button"
          size="sm"
          variant="ghost"
          onClick={() => toggleMark('strikethrough')}
          className={`line-through ${isMarkActive('strikethrough') ? 'theme-accent-bg text-white' : ''}`}
          title="Strike-through"
        >
          S
        </Button>
      </div>

      {/* Image insertion button */}
      <div className="flex items-center gap-1 border-l pl-2 ml-2">
        <Button
          type="button"
          size="sm"
          variant="ghost"
          onClick={insertImage}
          className="text-green-600 hover:bg-green-50"
          title="Insert Image"
        >
          🖼️
        </Button>
      </div>
    </div>
  );
};

export default function SlateEditor({
  value,
  onChange,
  placeholder = 'Write your story here...',
  error,
  storyId,
  enableImageProcessing = false
}: SlateEditorProps) {
  const [isScrollable, setIsScrollable] = useState(true);

  // Create editor with plugins
  const editor = useMemo(
    () => withImages(withHistory(withReact(createEditor()))),
    []
  );

  // Convert HTML to Slate format for initial value
  const initialValue = useMemo(() => {
    debug.log('🚀 Slate Editor initializing with HTML:', { htmlLength: value?.length });
    return htmlToSlate(value);
  }, [value]);

  // Handle changes
  const handleChange = useCallback((newValue: Descendant[]) => {
    // Convert back to HTML and call onChange
    const html = slateToHtml(newValue);
    onChange(html);

    debug.log('📝 Slate Editor changed:', {
      htmlLength: html.length,
      nodeCount: newValue.length
    });
  }, [onChange]);

  debug.log('🎯 Slate Editor loaded!', {
    valueLength: value?.length,
    enableImageProcessing,
    hasStoryId: !!storyId
  });

  return (
    <div className="space-y-2">
      <Slate editor={editor} initialValue={initialValue} onChange={handleChange}>
        <Toolbar editor={editor} />
        <div className="border theme-border rounded-b-lg overflow-hidden">
          <Editable
            className={`p-3 focus:outline-none focus:ring-0 resize-none ${
              isScrollable
                ? 'h-[400px] overflow-y-auto'
                : 'min-h-[300px]'
            }`}
            placeholder={placeholder}
            renderElement={Element}
            renderLeaf={Leaf}
            onKeyDown={(event) => {
              // Handle delete/backspace for selected content (including images)
              if (event.key === 'Delete' || event.key === 'Backspace') {
                const { selection } = editor;
                if (!selection) return;

                // If there's an expanded selection, let Slate handle it naturally
                // This will delete all selected content including images
                if (!Range.isCollapsed(selection)) {
                  // Slate will handle this automatically, including void elements
                  return;
                }

                // Handle single point deletions near images
                const { anchor } = selection;

                if (event.key === 'Delete') {
                  // Delete key - check if next node is an image
                  try {
                    const [nextNode] = Editor.next(editor, { at: anchor }) || [];
                    if (nextNode && SlateElement.isElement(nextNode) && nextNode.type === 'image') {
                      event.preventDefault();
                      const path = ReactEditor.findPath(editor, nextNode);
                      Transforms.removeNodes(editor, { at: path });
                      return;
                    }
                  } catch (error) {
                    // Ignore navigation errors at document boundaries
                  }
                } else if (event.key === 'Backspace') {
                  // Backspace key - check if previous node is an image
                  try {
                    const [prevNode] = Editor.previous(editor, { at: anchor }) || [];
                    if (prevNode && SlateElement.isElement(prevNode) && prevNode.type === 'image') {
|
||||||
|
event.preventDefault();
|
||||||
|
const path = ReactEditor.findPath(editor, prevNode);
|
||||||
|
Transforms.removeNodes(editor, { at: path });
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
} catch (error) {
|
||||||
|
// Ignore navigation errors at document boundaries
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Handle keyboard shortcuts
|
||||||
|
if (!event.ctrlKey && !event.metaKey) return;
|
||||||
|
|
||||||
|
switch (event.key) {
|
||||||
|
case 'b': {
|
||||||
|
event.preventDefault();
|
||||||
|
const marks = Editor.marks(editor);
|
||||||
|
const isActive = marks ? marks.bold === true : false;
|
||||||
|
if (isActive) {
|
||||||
|
Editor.removeMark(editor, 'bold');
|
||||||
|
} else {
|
||||||
|
Editor.addMark(editor, 'bold', true);
|
||||||
|
}
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
case 'i': {
|
||||||
|
event.preventDefault();
|
||||||
|
const marks = Editor.marks(editor);
|
||||||
|
const isActive = marks ? marks.italic === true : false;
|
||||||
|
if (isActive) {
|
||||||
|
Editor.removeMark(editor, 'italic');
|
||||||
|
} else {
|
||||||
|
Editor.addMark(editor, 'italic', true);
|
||||||
|
}
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
case 'a': {
|
||||||
|
// Handle Ctrl+A / Cmd+A to select all
|
||||||
|
event.preventDefault();
|
||||||
|
Transforms.select(editor, {
|
||||||
|
anchor: Editor.start(editor, []),
|
||||||
|
focus: Editor.end(editor, []),
|
||||||
|
});
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}}
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="flex justify-between items-center">
|
||||||
|
<div className="text-xs theme-text">
|
||||||
|
<p>
|
||||||
|
<strong>Slate.js Editor:</strong> Rich text editor with advanced image paste handling.
|
||||||
|
{isScrollable ? ' Fixed height with scrolling.' : ' Auto-expanding height.'}
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<Button
|
||||||
|
type="button"
|
||||||
|
size="sm"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => setIsScrollable(!isScrollable)}
|
||||||
|
className={isScrollable ? 'theme-accent-bg text-white' : ''}
|
||||||
|
title={isScrollable ? 'Switch to auto-expand mode' : 'Switch to scrollable mode'}
|
||||||
|
>
|
||||||
|
{isScrollable ? '📜' : '📏'}
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
</Slate>
|
||||||
|
|
||||||
|
{error && (
|
||||||
|
<p className="text-sm text-red-600 dark:text-red-400">{error}</p>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
}
|
||||||
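The editor above takes HTML in and hands serialized HTML back out through `onChange`. A minimal usage sketch follows; the import path and the surrounding form state are assumptions and not part of this diff.

```tsx
// Hypothetical host component; only the SlateEditor props shown in the diff are relied on.
import { useState } from 'react';
import SlateEditor from '@/components/SlateEditor'; // assumed path

export function StoryContentField({ storyId }: { storyId?: string }) {
  const [content, setContent] = useState('<p></p>');

  return (
    <SlateEditor
      value={content}               // HTML in
      onChange={setContent}         // HTML out on every edit
      placeholder="Write your story here..."
      storyId={storyId}
      enableImageProcessing={true}  // lets the backend pick up external images after save
    />
  );
}
```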
@@ -218,43 +218,91 @@ export const storyApi = {
     hiddenGemsOnly?: boolean;
   }): Promise<Story | null> => {
     try {
-      // Create URLSearchParams to properly handle array parameters like tags
-      const searchParams = new URLSearchParams();
-
-      if (filters?.searchQuery) {
-        searchParams.append('searchQuery', filters.searchQuery);
-      }
-      if (filters?.tags && filters.tags.length > 0) {
-        filters.tags.forEach(tag => searchParams.append('tags', tag));
-      }
-
-      // Advanced filters
-      if (filters?.minWordCount !== undefined) searchParams.append('minWordCount', filters.minWordCount.toString());
-      if (filters?.maxWordCount !== undefined) searchParams.append('maxWordCount', filters.maxWordCount.toString());
-      if (filters?.createdAfter) searchParams.append('createdAfter', filters.createdAfter);
-      if (filters?.createdBefore) searchParams.append('createdBefore', filters.createdBefore);
-      if (filters?.lastReadAfter) searchParams.append('lastReadAfter', filters.lastReadAfter);
-      if (filters?.lastReadBefore) searchParams.append('lastReadBefore', filters.lastReadBefore);
-      if (filters?.minRating !== undefined) searchParams.append('minRating', filters.minRating.toString());
-      if (filters?.maxRating !== undefined) searchParams.append('maxRating', filters.maxRating.toString());
-      if (filters?.unratedOnly !== undefined) searchParams.append('unratedOnly', filters.unratedOnly.toString());
-      if (filters?.readingStatus) searchParams.append('readingStatus', filters.readingStatus);
-      if (filters?.hasReadingProgress !== undefined) searchParams.append('hasReadingProgress', filters.hasReadingProgress.toString());
-      if (filters?.hasCoverImage !== undefined) searchParams.append('hasCoverImage', filters.hasCoverImage.toString());
-      if (filters?.sourceDomain) searchParams.append('sourceDomain', filters.sourceDomain);
-      if (filters?.seriesFilter) searchParams.append('seriesFilter', filters.seriesFilter);
-      if (filters?.minTagCount !== undefined) searchParams.append('minTagCount', filters.minTagCount.toString());
-      if (filters?.popularOnly !== undefined) searchParams.append('popularOnly', filters.popularOnly.toString());
-      if (filters?.hiddenGemsOnly !== undefined) searchParams.append('hiddenGemsOnly', filters.hiddenGemsOnly.toString());
-
-      const response = await api.get(`/stories/random?${searchParams.toString()}`);
-      return response.data;
+      // Use proper Solr RandomSortField with dynamic field random_* for true randomness
+      // Each call generates a different random seed to ensure different random results
+      const randomSeed = Math.floor(Math.random() * 1000000);
+      const searchResult = await searchApi.search({
+        query: filters?.searchQuery || '*:*',
+        page: 0,
+        size: 1, // Only get one result - Solr RandomSortField considers entire dataset
+        authors: [],
+        tags: filters?.tags || [],
+        minRating: filters?.minRating,
+        maxRating: filters?.maxRating,
+        sortBy: `random_${randomSeed}`, // Use proper dynamic field with random seed
+        sortDir: 'desc',
+
+        // Advanced filters - pass through all filter options
+        minWordCount: filters?.minWordCount,
+        maxWordCount: filters?.maxWordCount,
+        createdAfter: filters?.createdAfter,
+        createdBefore: filters?.createdBefore,
+        lastReadAfter: filters?.lastReadAfter,
+        lastReadBefore: filters?.lastReadBefore,
+        unratedOnly: filters?.unratedOnly,
+        readingStatus: filters?.readingStatus,
+        hasReadingProgress: filters?.hasReadingProgress,
+        hasCoverImage: filters?.hasCoverImage,
+        sourceDomain: filters?.sourceDomain,
+        seriesFilter: filters?.seriesFilter,
+        minTagCount: filters?.minTagCount,
+        popularOnly: filters?.popularOnly,
+        hiddenGemsOnly: filters?.hiddenGemsOnly,
+      });
+
+      return searchResult.results && searchResult.results.length > 0
+        ? searchResult.results[0]
+        : null;
+
     } catch (error: any) {
-      if (error.response?.status === 204) {
+      if (error.response?.status === 404 || error.response?.status === 204) {
         // No content - no stories match filters
         return null;
       }
-      throw error;
+
+      // If random sorting fails, fallback to client-side approach
+      console.warn('Solr random sorting failed, falling back to client-side selection:', error.message);
+
+      try {
+        // Fallback: get larger sample and pick randomly client-side
+        const fallbackResult = await searchApi.search({
+          query: filters?.searchQuery || '*:*',
+          page: 0,
+          size: 200, // Large enough sample for good randomness
+          authors: [],
+          tags: filters?.tags || [],
+          minRating: filters?.minRating,
+          maxRating: filters?.maxRating,
+          sortBy: 'createdAt',
+          sortDir: 'desc',
+
+          // Same advanced filters
+          minWordCount: filters?.minWordCount,
+          maxWordCount: filters?.maxWordCount,
+          createdAfter: filters?.createdAfter,
+          createdBefore: filters?.createdBefore,
+          lastReadAfter: filters?.lastReadAfter,
+          lastReadBefore: filters?.lastReadBefore,
+          unratedOnly: filters?.unratedOnly,
+          readingStatus: filters?.readingStatus,
+          hasReadingProgress: filters?.hasReadingProgress,
+          hasCoverImage: filters?.hasCoverImage,
+          sourceDomain: filters?.sourceDomain,
+          seriesFilter: filters?.seriesFilter,
+          minTagCount: filters?.minTagCount,
+          popularOnly: filters?.popularOnly,
+          hiddenGemsOnly: filters?.hiddenGemsOnly,
+        });
+
+        if (fallbackResult.results && fallbackResult.results.length > 0) {
+          const randomIndex = Math.floor(Math.random() * fallbackResult.results.length);
+          return fallbackResult.results[randomIndex];
+        }
+
+        return null;
+      } catch (fallbackError: any) {
+        throw fallbackError;
+      }
     }
   },
 };
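For context, a short sketch of how the Solr-backed random pick above might be exercised from a component. The method's exact name on `storyApi` is not visible in this hunk, so `getRandom` and the import path are assumptions.

```typescript
import { storyApi } from '@/lib/api'; // assumed path

async function surpriseMe() {
  // Assumed method name; the filters mirror the search options shown in the hunk above.
  const story = await storyApi.getRandom({
    tags: ['fantasy'],
    minRating: 4,
  });

  if (!story) {
    console.log('No stories matched the filters');
    return;
  }

  // Each call builds a fresh random_<seed> sort field, so repeated calls
  // return different stories even with identical filters.
  console.log('Random pick:', story.title);
}
```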
@@ -295,7 +343,34 @@ export const authorApi = {
   removeAvatar: async (id: string): Promise<void> => {
     await api.delete(`/authors/${id}/avatar`);
   },
+
+  searchAuthors: async (params: {
+    query?: string;
+    page?: number;
+    size?: number;
+    sortBy?: string;
+    sortDir?: string;
+  }): Promise<{
+    results: Author[];
+    totalHits: number;
+    page: number;
+    perPage: number;
+    query: string;
+    searchTimeMs: number;
+  }> => {
+    const searchParams = new URLSearchParams();
+
+    // Add query parameter
+    searchParams.append('q', params.query || '*');
+    if (params.page !== undefined) searchParams.append('page', params.page.toString());
+    if (params.size !== undefined) searchParams.append('size', params.size.toString());
+    if (params.sortBy) searchParams.append('sortBy', params.sortBy);
+    if (params.sortDir) searchParams.append('sortOrder', params.sortDir);
+
+    const response = await api.get(`/authors/search-typesense?${searchParams.toString()}`);
+    return response.data;
+  },
+
 };

 // Tag endpoints
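A minimal call sketch for the new author search wrapper above. The import path and the `name` sort field are assumptions; note that `sortDir` is forwarded to the backend as `sortOrder`.

```typescript
import { authorApi } from '@/lib/api'; // assumed path

async function findAuthors() {
  const { results, totalHits, searchTimeMs } = await authorApi.searchAuthors({
    query: 'gaiman',
    page: 0,
    size: 20,
    sortBy: 'name',  // assumed field name
    sortDir: 'asc',  // sent as `sortOrder` on the wire
  });

  console.log(`${totalHits} authors in ${searchTimeMs}ms`, results);
}
```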
@@ -548,6 +623,17 @@ export const configApi = {
     hasErrors: boolean;
     dryRun: boolean;
     error?: string;
+    orphanedFiles?: Array<{
+      filePath: string;
+      fileName: string;
+      fileSize: number;
+      formattedSize: string;
+      storyId: string;
+      storyTitle: string | null;
+      storyExists: boolean;
+      canAccessStory: boolean;
+      error?: string;
+    }>;
   }> => {
     const response = await api.post('/config/cleanup/images/preview');
     return response.data;
@@ -576,7 +662,7 @@ export const searchAdminApi = {
   getStatus: async (): Promise<{
     primaryEngine: string;
     dualWrite: boolean;
-    openSearchAvailable: boolean;
+    solrAvailable: boolean;
   }> => {
     const response = await api.get('/admin/search/status');
     return response.data;
@@ -600,8 +686,8 @@ export const searchAdminApi = {
   },

   // Switch engines
-  switchToOpenSearch: async (): Promise<{ message: string }> => {
-    const response = await api.post('/admin/search/switch/opensearch');
+  switchToSolr: async (): Promise<{ message: string }> => {
+    const response = await api.post('/admin/search/switch/solr');
     return response.data;
   },
@@ -612,8 +698,8 @@ export const searchAdminApi = {
     return response.data;
   },

-  // OpenSearch operations
-  reindexOpenSearch: async (): Promise<{
+  // Solr operations
+  reindexSolr: async (): Promise<{
     success: boolean;
     message: string;
     storiesCount?: number;
@@ -621,11 +707,11 @@ export const searchAdminApi = {
     totalCount?: number;
     error?: string;
   }> => {
-    const response = await api.post('/admin/search/opensearch/reindex');
+    const response = await api.post('/admin/search/solr/reindex');
     return response.data;
   },

-  recreateOpenSearchIndices: async (): Promise<{
+  recreateSolrIndices: async (): Promise<{
     success: boolean;
     message: string;
     storiesCount?: number;
@@ -633,7 +719,34 @@ export const searchAdminApi = {
     totalCount?: number;
     error?: string;
   }> => {
-    const response = await api.post('/admin/search/opensearch/recreate');
+    const response = await api.post('/admin/search/solr/recreate');
+    return response.data;
+  },
+
+  // Add libraryId field to schema
+  addLibraryField: async (): Promise<{
+    success: boolean;
+    message: string;
+    error?: string;
+    details?: string;
+    note?: string;
+  }> => {
+    const response = await api.post('/admin/search/solr/add-library-field');
+    return response.data;
+  },
+
+  // Migrate to library-aware schema
+  migrateLibrarySchema: async (): Promise<{
+    success: boolean;
+    message: string;
+    storiesCount?: number;
+    authorsCount?: number;
+    totalCount?: number;
+    error?: string;
+    details?: string;
+    note?: string;
+  }> => {
+    const response = await api.post('/admin/search/solr/migrate-library-schema');
     return response.data;
   },
 };
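One plausible way the new admin wrappers above could be combined from an admin screen; the ordering (add the field, migrate the schema, then reindex) is an assumption and is not documented in this diff.

```typescript
import { searchAdminApi } from '@/lib/api'; // assumed path

async function migrateToLibraryAwareSearch() {
  // Assumed sequence: schema field first, then the migration, then a reindex.
  const fieldResult = await searchAdminApi.addLibraryField();
  if (!fieldResult.success) throw new Error(fieldResult.error ?? fieldResult.message);

  const migration = await searchAdminApi.migrateLibrarySchema();
  console.log(migration.message, {
    stories: migration.storiesCount,
    authors: migration.authorsCount,
  });

  // A full reindex afterwards keeps the Solr cores in sync with the database.
  const reindex = await searchAdminApi.reindexSolr();
  console.log(reindex.message);
}
```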
frontend/src/lib/debug.ts (new file, 90 lines)
@@ -0,0 +1,90 @@
/**
 * Debug logging utility
 * Allows conditional logging based on environment or debug flags
 */

// Check if we're in development mode or debug is explicitly enabled
const isDebugEnabled = (): boolean => {
  if (typeof window === 'undefined') {
    // Server-side: check NODE_ENV
    return process.env.NODE_ENV === 'development' || process.env.DEBUG === 'true';
  }

  // Client-side: check localStorage flag or development mode
  try {
    return (
      process.env.NODE_ENV === 'development' ||
      localStorage.getItem('debug') === 'true' ||
      window.location.search.includes('debug=true')
    );
  } catch {
    return process.env.NODE_ENV === 'development';
  }
};

/**
 * Debug logger that only outputs in development or when debug is enabled
 */
export const debug = {
  log: (...args: any[]) => {
    if (isDebugEnabled()) {
      console.log('[DEBUG]', ...args);
    }
  },

  warn: (...args: any[]) => {
    if (isDebugEnabled()) {
      console.warn('[DEBUG]', ...args);
    }
  },

  error: (...args: any[]) => {
    if (isDebugEnabled()) {
      console.error('[DEBUG]', ...args);
    }
  },

  group: (label: string) => {
    if (isDebugEnabled()) {
      console.group(`[DEBUG] ${label}`);
    }
  },

  groupEnd: () => {
    if (isDebugEnabled()) {
      console.groupEnd();
    }
  },

  time: (label: string) => {
    if (isDebugEnabled()) {
      console.time(`[DEBUG] ${label}`);
    }
  },

  timeEnd: (label: string) => {
    if (isDebugEnabled()) {
      console.timeEnd(`[DEBUG] ${label}`);
    }
  }
};

/**
 * Enable debug mode (persists in localStorage)
 */
export const enableDebug = () => {
  if (typeof window !== 'undefined') {
    localStorage.setItem('debug', 'true');
    console.log('Debug mode enabled. Reload page to see debug output.');
  }
};

/**
 * Disable debug mode
 */
export const disableDebug = () => {
  if (typeof window !== 'undefined') {
    localStorage.removeItem('debug');
    console.log('Debug mode disabled. Reload page to hide debug output.');
  }
};
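A quick sketch of how the logger is meant to be used from application code (the `@/lib/debug` import alias is an assumption; appending `?debug=true` to the URL works as an alternative to `enableDebug()`):

```typescript
import { debug, enableDebug } from '@/lib/debug'; // assumed alias for frontend/src/lib/debug.ts

enableDebug();                // persists the flag in localStorage (browser only)

debug.time('image-paste');
debug.log('Processing pasted image', { size: 12345 });
debug.timeEnd('image-paste'); // output is prefixed with [DEBUG] and silently
                              // dropped outside development/debug mode
```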
@@ -1,274 +0,0 @@
|
|||||||
/**
|
|
||||||
* Conversion utilities between HTML and Portable Text
|
|
||||||
* Maintains compatibility with existing sanitization strategy
|
|
||||||
*/
|
|
||||||
|
|
||||||
import type { PortableTextBlock } from '@portabletext/types';
|
|
||||||
import type { CustomPortableTextBlock } from './schema';
|
|
||||||
import { createTextBlock, createImageBlock } from './schema';
|
|
||||||
import { sanitizeHtmlSync } from '../sanitization';
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Convert HTML to Portable Text
|
|
||||||
* This maintains backward compatibility with existing HTML content
|
|
||||||
*/
|
|
||||||
export function htmlToPortableText(html: string): CustomPortableTextBlock[] {
|
|
||||||
if (!html || html.trim() === '') {
|
|
||||||
return [createTextBlock()];
|
|
||||||
}
|
|
||||||
|
|
||||||
// First sanitize the HTML using existing strategy
|
|
||||||
const sanitizedHtml = sanitizeHtmlSync(html);
|
|
||||||
|
|
||||||
// Parse the sanitized HTML into Portable Text blocks
|
|
||||||
const parser = new DOMParser();
|
|
||||||
const doc = parser.parseFromString(sanitizedHtml, 'text/html');
|
|
||||||
|
|
||||||
const blocks: CustomPortableTextBlock[] = [];
|
|
||||||
|
|
||||||
// Process each child element in the body
|
|
||||||
const walker = doc.createTreeWalker(
|
|
||||||
doc.body,
|
|
||||||
NodeFilter.SHOW_ELEMENT | NodeFilter.SHOW_TEXT
|
|
||||||
);
|
|
||||||
|
|
||||||
let currentBlock: PortableTextBlock | null = null;
|
|
||||||
let node = walker.nextNode();
|
|
||||||
|
|
||||||
while (node) {
|
|
||||||
if (node.nodeType === Node.ELEMENT_NODE) {
|
|
||||||
const element = node as Element;
|
|
||||||
|
|
||||||
// Handle block-level elements
|
|
||||||
if (isBlockElement(element.tagName)) {
|
|
||||||
// Finish current block if any
|
|
||||||
if (currentBlock) {
|
|
||||||
blocks.push(currentBlock);
|
|
||||||
currentBlock = null;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Handle images separately
|
|
||||||
if (element.tagName === 'IMG') {
|
|
||||||
const img = element as HTMLImageElement;
|
|
||||||
blocks.push(createImageBlock(
|
|
||||||
img.src,
|
|
||||||
img.alt,
|
|
||||||
img.title || undefined
|
|
||||||
));
|
|
||||||
} else {
|
|
||||||
// Create new block for this element
|
|
||||||
const style = getBlockStyle(element.tagName);
|
|
||||||
const text = element.textContent || '';
|
|
||||||
currentBlock = createTextBlock(text, style);
|
|
||||||
}
|
|
||||||
} else {
|
|
||||||
// Handle inline elements - add to current block
|
|
||||||
if (!currentBlock) {
|
|
||||||
currentBlock = createTextBlock();
|
|
||||||
}
|
|
||||||
// Inline elements are handled by processing their text content
|
|
||||||
// Mark handling would go here for future enhancement
|
|
||||||
}
|
|
||||||
} else if (node.nodeType === Node.TEXT_NODE && node.textContent?.trim()) {
|
|
||||||
// Handle text nodes
|
|
||||||
if (!currentBlock) {
|
|
||||||
currentBlock = createTextBlock();
|
|
||||||
}
|
|
||||||
// Text content is already included in the parent element processing
|
|
||||||
}
|
|
||||||
|
|
||||||
node = walker.nextNode();
|
|
||||||
}
|
|
||||||
|
|
||||||
// Add final block if any
|
|
||||||
if (currentBlock) {
|
|
||||||
blocks.push(currentBlock);
|
|
||||||
}
|
|
||||||
|
|
||||||
// If no blocks were created, return empty content
|
|
||||||
if (blocks.length === 0) {
|
|
||||||
return [createTextBlock()];
|
|
||||||
}
|
|
||||||
|
|
||||||
return blocks;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Convert Portable Text to HTML
|
|
||||||
* This ensures compatibility with existing backend processing
|
|
||||||
*/
|
|
||||||
export function portableTextToHtml(blocks: CustomPortableTextBlock[]): string {
|
|
||||||
if (!blocks || blocks.length === 0) {
|
|
||||||
return '';
|
|
||||||
}
|
|
||||||
|
|
||||||
const htmlParts: string[] = [];
|
|
||||||
|
|
||||||
for (const block of blocks) {
|
|
||||||
if (block._type === 'block') {
|
|
||||||
const portableBlock = block as PortableTextBlock;
|
|
||||||
const tag = getHtmlTag(portableBlock.style || 'normal');
|
|
||||||
const text = extractTextFromBlock(portableBlock);
|
|
||||||
|
|
||||||
if (text.trim() || portableBlock.style !== 'normal') {
|
|
||||||
htmlParts.push(`<${tag}>${text}</${tag}>`);
|
|
||||||
}
|
|
||||||
} else if (block._type === 'image') {
|
|
||||||
const imgBlock = block as any; // Type assertion for custom image block
|
|
||||||
const alt = imgBlock.alt ? ` alt="${escapeHtml(imgBlock.alt)}"` : '';
|
|
||||||
const title = imgBlock.caption ? ` title="${escapeHtml(imgBlock.caption)}"` : '';
|
|
||||||
htmlParts.push(`<img src="${escapeHtml(imgBlock.src)}"${alt}${title} />`);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
const html = htmlParts.join('\n');
|
|
||||||
|
|
||||||
// Apply final sanitization to ensure security
|
|
||||||
return sanitizeHtmlSync(html);
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Extract plain text from a Portable Text block
|
|
||||||
*/
|
|
||||||
function extractTextFromBlock(block: PortableTextBlock): string {
|
|
||||||
if (!block.children) return '';
|
|
||||||
|
|
||||||
return block.children
|
|
||||||
.map(child => {
|
|
||||||
if (child._type === 'span') {
|
|
||||||
return child.text || '';
|
|
||||||
}
|
|
||||||
return '';
|
|
||||||
})
|
|
||||||
.join('');
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Determine if an HTML tag is a block-level element
|
|
||||||
*/
|
|
||||||
function isBlockElement(tagName: string): boolean {
|
|
||||||
const blockElements = [
|
|
||||||
'P', 'DIV', 'H1', 'H2', 'H3', 'H4', 'H5', 'H6',
|
|
||||||
'BLOCKQUOTE', 'UL', 'OL', 'LI', 'IMG', 'BR'
|
|
||||||
];
|
|
||||||
return blockElements.includes(tagName.toUpperCase());
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Get Portable Text block style from HTML tag
|
|
||||||
*/
|
|
||||||
function getBlockStyle(tagName: string): string {
|
|
||||||
const styleMap: Record<string, string> = {
|
|
||||||
'P': 'normal',
|
|
||||||
'DIV': 'normal',
|
|
||||||
'H1': 'h1',
|
|
||||||
'H2': 'h2',
|
|
||||||
'H3': 'h3',
|
|
||||||
'H4': 'h4',
|
|
||||||
'H5': 'h5',
|
|
||||||
'H6': 'h6',
|
|
||||||
'BLOCKQUOTE': 'blockquote',
|
|
||||||
};
|
|
||||||
|
|
||||||
return styleMap[tagName.toUpperCase()] || 'normal';
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Get HTML tag from Portable Text block style
|
|
||||||
*/
|
|
||||||
function getHtmlTag(style: string): string {
|
|
||||||
const tagMap: Record<string, string> = {
|
|
||||||
'normal': 'p',
|
|
||||||
'h1': 'h1',
|
|
||||||
'h2': 'h2',
|
|
||||||
'h3': 'h3',
|
|
||||||
'h4': 'h4',
|
|
||||||
'h5': 'h5',
|
|
||||||
'h6': 'h6',
|
|
||||||
'blockquote': 'blockquote',
|
|
||||||
};
|
|
||||||
|
|
||||||
return tagMap[style] || 'p';
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Escape HTML entities
|
|
||||||
*/
|
|
||||||
function escapeHtml(text: string): string {
|
|
||||||
const div = document.createElement('div');
|
|
||||||
div.textContent = text;
|
|
||||||
return div.innerHTML;
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Simple HTML parsing for converting existing content
|
|
||||||
* This is a basic implementation - could be enhanced with more sophisticated parsing
|
|
||||||
*/
|
|
||||||
export function parseHtmlToBlocks(html: string): CustomPortableTextBlock[] {
|
|
||||||
if (!html || html.trim() === '') {
|
|
||||||
return [createTextBlock()];
|
|
||||||
}
|
|
||||||
|
|
||||||
// Sanitize first
|
|
||||||
const sanitizedHtml = sanitizeHtmlSync(html);
|
|
||||||
|
|
||||||
// Split by block-level elements and convert
|
|
||||||
const blocks: CustomPortableTextBlock[] = [];
|
|
||||||
|
|
||||||
// Simple regex-based parsing for common elements
|
|
||||||
const blockElements = sanitizedHtml.split(/(<\/?(?:p|div|h[1-6]|blockquote|img)[^>]*>)/i)
|
|
||||||
.filter(part => part.trim().length > 0);
|
|
||||||
|
|
||||||
let currentText = '';
|
|
||||||
let currentStyle = 'normal';
|
|
||||||
|
|
||||||
for (const part of blockElements) {
|
|
||||||
if (part.match(/^<(h[1-6]|p|div|blockquote)/i)) {
|
|
||||||
// Start of block element
|
|
||||||
const match = part.match(/^<(h[1-6]|p|div|blockquote)/i);
|
|
||||||
if (match) {
|
|
||||||
currentStyle = getBlockStyle(match[1]);
|
|
||||||
}
|
|
||||||
} else if (part.match(/^<img/i)) {
|
|
||||||
// Image element
|
|
||||||
const srcMatch = part.match(/src=['"']([^'"']+)['"']/);
|
|
||||||
const altMatch = part.match(/alt=['"']([^'"']+)['"']/);
|
|
||||||
const titleMatch = part.match(/title=['"']([^'"']+)['"']/);
|
|
||||||
|
|
||||||
if (srcMatch) {
|
|
||||||
blocks.push(createImageBlock(
|
|
||||||
srcMatch[1],
|
|
||||||
altMatch?.[1],
|
|
||||||
titleMatch?.[1]
|
|
||||||
));
|
|
||||||
}
|
|
||||||
} else if (part.match(/^<\//)) {
|
|
||||||
// End tag - finalize current block
|
|
||||||
if (currentText.trim()) {
|
|
||||||
blocks.push(createTextBlock(currentText.trim(), currentStyle));
|
|
||||||
currentText = '';
|
|
||||||
currentStyle = 'normal';
|
|
||||||
}
|
|
||||||
} else if (!part.match(/^</)) {
|
|
||||||
// Text content
|
|
||||||
currentText += part;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
// Handle remaining text
|
|
||||||
if (currentText.trim()) {
|
|
||||||
blocks.push(createTextBlock(currentText.trim(), currentStyle));
|
|
||||||
}
|
|
||||||
|
|
||||||
// If no blocks created, return empty block
|
|
||||||
if (blocks.length === 0) {
|
|
||||||
return [createTextBlock()];
|
|
||||||
}
|
|
||||||
|
|
||||||
return blocks;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Helper function to generate unique keys
|
|
||||||
function generateKey(): string {
|
|
||||||
return Math.random().toString(36).substr(2, 9);
|
|
||||||
}
|
|
||||||
@@ -1,97 +0,0 @@
|
|||||||
/**
|
|
||||||
* Portable Text Editor Schema Definition
|
|
||||||
* Defines the structure and capabilities of the editor
|
|
||||||
*/
|
|
||||||
|
|
||||||
import { defineSchema } from '@portabletext/editor';
|
|
||||||
import type { SchemaDefinition } from '@portabletext/editor';
|
|
||||||
|
|
||||||
export const editorSchema: SchemaDefinition = defineSchema({
|
|
||||||
// Text decorators (inline formatting)
|
|
||||||
decorators: [
|
|
||||||
{ name: 'strong' },
|
|
||||||
{ name: 'em' },
|
|
||||||
{ name: 'underline' },
|
|
||||||
{ name: 'strike' },
|
|
||||||
{ name: 'code' },
|
|
||||||
],
|
|
||||||
|
|
||||||
// Block styles (paragraph types)
|
|
||||||
styles: [
|
|
||||||
{ name: 'normal' },
|
|
||||||
{ name: 'h1' },
|
|
||||||
{ name: 'h2' },
|
|
||||||
{ name: 'h3' },
|
|
||||||
{ name: 'h4' },
|
|
||||||
{ name: 'h5' },
|
|
||||||
{ name: 'h6' },
|
|
||||||
{ name: 'blockquote' },
|
|
||||||
],
|
|
||||||
|
|
||||||
// List types
|
|
||||||
lists: [
|
|
||||||
{ name: 'bullet' },
|
|
||||||
{ name: 'number' },
|
|
||||||
],
|
|
||||||
|
|
||||||
// Annotations (links, etc.)
|
|
||||||
annotations: [
|
|
||||||
{
|
|
||||||
name: 'link',
|
|
||||||
type: 'object',
|
|
||||||
fields: [
|
|
||||||
{
|
|
||||||
name: 'href',
|
|
||||||
type: 'string',
|
|
||||||
},
|
|
||||||
],
|
|
||||||
},
|
|
||||||
],
|
|
||||||
|
|
||||||
// Block objects (custom content types)
|
|
||||||
blockObjects: [
|
|
||||||
{
|
|
||||||
name: 'image',
|
|
||||||
type: 'object',
|
|
||||||
fields: [
|
|
||||||
{
|
|
||||||
name: 'src',
|
|
||||||
type: 'string',
|
|
||||||
},
|
|
||||||
{
|
|
||||||
name: 'alt',
|
|
||||||
type: 'string',
|
|
||||||
},
|
|
||||||
{
|
|
||||||
name: 'caption',
|
|
||||||
type: 'string',
|
|
||||||
},
|
|
||||||
{
|
|
||||||
name: 'width',
|
|
||||||
type: 'number',
|
|
||||||
},
|
|
||||||
{
|
|
||||||
name: 'height',
|
|
||||||
type: 'number',
|
|
||||||
},
|
|
||||||
],
|
|
||||||
},
|
|
||||||
{
|
|
||||||
name: 'codeBlock',
|
|
||||||
type: 'object',
|
|
||||||
fields: [
|
|
||||||
{
|
|
||||||
name: 'code',
|
|
||||||
type: 'string',
|
|
||||||
},
|
|
||||||
{
|
|
||||||
name: 'language',
|
|
||||||
type: 'string',
|
|
||||||
},
|
|
||||||
],
|
|
||||||
},
|
|
||||||
],
|
|
||||||
});
|
|
||||||
|
|
||||||
// Type exports for use in components
|
|
||||||
export type EditorSchema = typeof editorSchema;
|
|
||||||
@@ -1,169 +0,0 @@
|
|||||||
/**
|
|
||||||
* Portable Text schema definition matching current RichTextEditor functionality
|
|
||||||
*/
|
|
||||||
|
|
||||||
import type {
|
|
||||||
PortableTextBlock,
|
|
||||||
ArbitraryTypedObject,
|
|
||||||
PortableTextMarkDefinition,
|
|
||||||
PortableTextSpan
|
|
||||||
} from '@portabletext/types';
|
|
||||||
|
|
||||||
// Define custom marks (inline formatting)
|
|
||||||
export interface StrongMark extends PortableTextMarkDefinition {
|
|
||||||
_type: 'strong';
|
|
||||||
}
|
|
||||||
|
|
||||||
export interface EmMark extends PortableTextMarkDefinition {
|
|
||||||
_type: 'em';
|
|
||||||
}
|
|
||||||
|
|
||||||
export interface UnderlineMark extends PortableTextMarkDefinition {
|
|
||||||
_type: 'underline';
|
|
||||||
}
|
|
||||||
|
|
||||||
export interface StrikeMark extends PortableTextMarkDefinition {
|
|
||||||
_type: 'strike';
|
|
||||||
}
|
|
||||||
|
|
||||||
export interface CodeMark extends PortableTextMarkDefinition {
|
|
||||||
_type: 'code';
|
|
||||||
}
|
|
||||||
|
|
||||||
// Custom block types for images (future enhancement)
|
|
||||||
export interface ImageBlock extends ArbitraryTypedObject {
|
|
||||||
_type: 'image';
|
|
||||||
src: string;
|
|
||||||
alt?: string;
|
|
||||||
caption?: string;
|
|
||||||
isProcessing?: boolean;
|
|
||||||
originalUrl?: string;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Define the schema configuration
|
|
||||||
export const portableTextSchema = {
|
|
||||||
// Block styles (paragraph, headings)
|
|
||||||
styles: [
|
|
||||||
{ title: 'Normal', value: 'normal' },
|
|
||||||
{ title: 'Heading 1', value: 'h1' },
|
|
||||||
{ title: 'Heading 2', value: 'h2' },
|
|
||||||
{ title: 'Heading 3', value: 'h3' },
|
|
||||||
{ title: 'Heading 4', value: 'h4' },
|
|
||||||
{ title: 'Heading 5', value: 'h5' },
|
|
||||||
{ title: 'Heading 6', value: 'h6' },
|
|
||||||
{ title: 'Quote', value: 'blockquote' },
|
|
||||||
],
|
|
||||||
|
|
||||||
// List types
|
|
||||||
lists: [
|
|
||||||
{ title: 'Bullet', value: 'bullet' },
|
|
||||||
{ title: 'Number', value: 'number' },
|
|
||||||
],
|
|
||||||
|
|
||||||
// Marks (inline formatting)
|
|
||||||
marks: {
|
|
||||||
// Decorators
|
|
||||||
decorators: [
|
|
||||||
{ title: 'Strong', value: 'strong' },
|
|
||||||
{ title: 'Emphasis', value: 'em' },
|
|
||||||
{ title: 'Underline', value: 'underline' },
|
|
||||||
{ title: 'Strike', value: 'strike' },
|
|
||||||
{ title: 'Code', value: 'code' },
|
|
||||||
],
|
|
||||||
// Annotations (links, etc.)
|
|
||||||
annotations: [
|
|
||||||
{
|
|
||||||
title: 'URL',
|
|
||||||
name: 'link',
|
|
||||||
type: 'object',
|
|
||||||
fields: [
|
|
||||||
{
|
|
||||||
title: 'URL',
|
|
||||||
name: 'href',
|
|
||||||
type: 'url',
|
|
||||||
},
|
|
||||||
],
|
|
||||||
},
|
|
||||||
],
|
|
||||||
},
|
|
||||||
|
|
||||||
// Custom block types
|
|
||||||
blockTypes: [
|
|
||||||
{
|
|
||||||
title: 'Image',
|
|
||||||
name: 'image',
|
|
||||||
type: 'object',
|
|
||||||
fields: [
|
|
||||||
{ name: 'src', type: 'string', title: 'Image URL' },
|
|
||||||
{ name: 'alt', type: 'string', title: 'Alt Text' },
|
|
||||||
{ name: 'caption', type: 'string', title: 'Caption' },
|
|
||||||
{ name: 'isProcessing', type: 'boolean', title: 'Processing' },
|
|
||||||
{ name: 'originalUrl', type: 'string', title: 'Original URL' },
|
|
||||||
],
|
|
||||||
},
|
|
||||||
],
|
|
||||||
};
|
|
||||||
|
|
||||||
// Type definitions for our Portable Text content
|
|
||||||
export type CustomPortableTextBlock = PortableTextBlock | ImageBlock;
|
|
||||||
|
|
||||||
export type CustomMarkDefinition =
|
|
||||||
| StrongMark
|
|
||||||
| EmMark
|
|
||||||
| UnderlineMark
|
|
||||||
| StrikeMark
|
|
||||||
| CodeMark;
|
|
||||||
|
|
||||||
export type CustomPortableTextSpan = PortableTextSpan & {
|
|
||||||
marks?: string[];
|
|
||||||
};
|
|
||||||
|
|
||||||
// Helper function to create a basic block
|
|
||||||
export function createTextBlock(
|
|
||||||
text: string = '',
|
|
||||||
style: string = 'normal'
|
|
||||||
): PortableTextBlock {
|
|
||||||
return {
|
|
||||||
_type: 'block',
|
|
||||||
_key: generateKey(),
|
|
||||||
style,
|
|
||||||
markDefs: [],
|
|
||||||
children: [
|
|
||||||
{
|
|
||||||
_type: 'span',
|
|
||||||
_key: generateKey(),
|
|
||||||
text,
|
|
||||||
marks: [],
|
|
||||||
},
|
|
||||||
],
|
|
||||||
};
|
|
||||||
}
|
|
||||||
|
|
||||||
// Helper function to create an image block
|
|
||||||
export function createImageBlock(
|
|
||||||
src: string,
|
|
||||||
alt?: string,
|
|
||||||
caption?: string,
|
|
||||||
isProcessing?: boolean,
|
|
||||||
originalUrl?: string
|
|
||||||
): ImageBlock {
|
|
||||||
return {
|
|
||||||
_type: 'image',
|
|
||||||
_key: generateKey(),
|
|
||||||
src,
|
|
||||||
alt,
|
|
||||||
caption,
|
|
||||||
isProcessing,
|
|
||||||
originalUrl,
|
|
||||||
};
|
|
||||||
}
|
|
||||||
|
|
||||||
// Helper function to generate unique keys
|
|
||||||
function generateKey(): string {
|
|
||||||
return Math.random().toString(36).substr(2, 9);
|
|
||||||
}
|
|
||||||
|
|
||||||
// Default empty content
|
|
||||||
export const emptyPortableTextContent: CustomPortableTextBlock[] = [
|
|
||||||
createTextBlock('', 'normal')
|
|
||||||
];
|
|
||||||
@@ -1,5 +1,6 @@
 import DOMPurify from 'dompurify';
 import { configApi } from './api';
+import { debug } from './debug';

 interface SanitizationConfig {
   allowedTags: string[];
@@ -28,7 +29,7 @@ function filterCssProperties(styleValue: string, allowedProperties: string[]): s
     const isAllowed = allowedProperties.includes(property);

     if (!isAllowed) {
-      console.log(`CSS property "${property}" was filtered out (not in allowed list)`);
+      debug.log(`CSS property "${property}" was filtered out (not in allowed list)`);
     }

     return isAllowed;
@@ -37,9 +38,9 @@ function filterCssProperties(styleValue: string, allowedProperties: string[]): s
   const result = filteredDeclarations.join('; ');

   if (declarations.length !== filteredDeclarations.length) {
-    console.log(`CSS filtering: ${declarations.length} -> ${filteredDeclarations.length} properties`);
-    console.log('Original:', styleValue);
-    console.log('Filtered:', result);
+    debug.log(`CSS filtering: ${declarations.length} -> ${filteredDeclarations.length} properties`);
+    debug.log('Original:', styleValue);
+    debug.log('Filtered:', result);
   }

   return result;
@@ -152,7 +153,8 @@ function createDOMPurifyConfig(config: SanitizationConfig) {
   const domPurifyConfig: DOMPurify.Config = {
     ALLOWED_TAGS: allowedTags,
     ALLOWED_ATTR: uniqueAttributes,
-    ALLOWED_URI_REGEXP: /^(?:(?:https?|#|\/):?\/?)[\w.\-#/?=&%]+$/i,
+    // More permissive URL regex to allow complex query strings and tokens
+    ALLOWED_URI_REGEXP: /^(?:(?:https?|data|#|\/):)?[\s\S]*$/i,
     ALLOW_UNKNOWN_PROTOCOLS: false,
     SANITIZE_DOM: true,
     KEEP_CONTENT: true,
@@ -179,6 +181,75 @@ function createDOMPurifyConfig(config: SanitizationConfig) {
   return domPurifyConfig;
 }

+/**
+ * Preprocess HTML to extract images from figure tags before sanitization
+ */
+function preprocessFigureTags(html: string): string {
+  if (!html || html.trim() === '') {
+    return html;
+  }
+
+  try {
+    const parser = new DOMParser();
+    const doc = parser.parseFromString(html, 'text/html');
+    const figures = doc.querySelectorAll('figure');
+
+    figures.forEach((figure) => {
+      // Find img tags anywhere within the figure (deep search)
+      const images = figure.querySelectorAll('img');
+
+      if (images.length > 0) {
+        // Extract the first image
+        const img = images[0];
+
+        // Get the src attribute - it might be in the src attribute or data-src
+        const imgSrc = img.getAttribute('src') || img.getAttribute('data-src') || img.src || '';
+
+        if (!imgSrc || imgSrc.trim() === '') {
+          figure.remove();
+          return;
+        }
+
+        // Create a clean img element with just the essential attributes
+        const cleanImg = doc.createElement('img');
+        cleanImg.setAttribute('src', imgSrc);
+
+        // Preserve alt text
+        const existingAlt = img.getAttribute('alt') || img.alt;
+        if (existingAlt) {
+          cleanImg.setAttribute('alt', existingAlt);
+        } else {
+          // Check if there's a figcaption to use as alt text
+          const figcaption = figure.querySelector('figcaption');
+          if (figcaption) {
+            const captionText = figcaption.textContent?.trim();
+            if (captionText) {
+              cleanImg.setAttribute('alt', captionText);
+            }
+          }
+        }
+
+        // Preserve other useful attributes if they exist
+        const width = img.getAttribute('width') || img.width;
+        const height = img.getAttribute('height') || img.height;
+        if (width) cleanImg.setAttribute('width', width.toString());
+        if (height) cleanImg.setAttribute('height', height.toString());
+
+        // Replace the figure element with just the clean img
+        figure.replaceWith(cleanImg);
+      } else {
+        // No images in figure, remove it entirely
+        figure.remove();
+      }
+    });
+
+    return doc.body.innerHTML;
+  } catch (error) {
+    console.warn('Failed to preprocess figure tags, returning original HTML:', error);
+    return html;
+  }
+}
+
 /**
  * Sanitize HTML content using shared configuration from backend
  */
@@ -188,12 +259,15 @@ export async function sanitizeHtml(html: string): Promise<string> {
   }

   try {
+    // Preprocess to extract images from figure tags
+    const preprocessed = preprocessFigureTags(html);
+
     const config = await fetchSanitizationConfig();
     const domPurifyConfig = createDOMPurifyConfig(config);

     // Configure DOMPurify with our settings
-    const cleanHtml = DOMPurify.sanitize(html, domPurifyConfig as any);
+    const cleanHtml = DOMPurify.sanitize(preprocessed, domPurifyConfig as any);

     return cleanHtml.toString();
   } catch (error) {
     console.error('Error during HTML sanitization:', error);
@@ -211,15 +285,18 @@ export function sanitizeHtmlSync(html: string): string {
     return '';
   }

+  // Preprocess to extract images from figure tags
+  const preprocessed = preprocessFigureTags(html);
+
   // If we have cached config, use it
   if (cachedConfig) {
     const domPurifyConfig = createDOMPurifyConfig(cachedConfig);
-    return DOMPurify.sanitize(html, domPurifyConfig as any).toString();
+    return DOMPurify.sanitize(preprocessed, domPurifyConfig as any).toString();
   }

   // If we don't have cached config but there's an ongoing request, wait for it
   if (configPromise) {
-    console.log('Sanitization config loading in progress, using fallback for now');
+    debug.log('Sanitization config loading in progress, using fallback for now');
   } else {
     // No config and no ongoing request - try to load it for next time
     console.warn('No cached sanitization config available, triggering load for future use');
@@ -229,7 +306,7 @@ export function sanitizeHtmlSync(html: string): string {
   }

   // Use comprehensive fallback configuration that preserves formatting
-  console.log('Using fallback sanitization configuration with formatting support');
+  debug.log('Using fallback sanitization configuration with formatting support');
   const fallbackAllowedCssProperties = [
     'color', 'font-size', 'font-weight',
     'font-style', 'text-align', 'text-decoration', 'margin',
@@ -246,8 +323,10 @@ export function sanitizeHtmlSync(html: string): string {
       'blockquote', 'cite', 'q', 'hr', 'details', 'summary'
     ],
     ALLOWED_ATTR: [
-      'class', 'style', 'colspan', 'rowspan', 'src', 'alt', 'width', 'height'
+      'class', 'style', 'colspan', 'rowspan', 'src', 'alt', 'width', 'height', 'href', 'title'
     ],
+    // More permissive URL regex to allow complex query strings and tokens
+    ALLOWED_URI_REGEXP: /^(?:(?:https?|data|#|\/):)?[\s\S]*$/i,
     ALLOW_UNKNOWN_PROTOCOLS: false,
     SANITIZE_DOM: true,
     KEEP_CONTENT: true,
@@ -269,8 +348,8 @@ export function sanitizeHtmlSync(html: string): string {
       }
     }
   });

-  return DOMPurify.sanitize(html, fallbackConfig as any).toString();
+  return DOMPurify.sanitize(preprocessed, fallbackConfig as any).toString();
 }

 /**
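To make the new figure handling concrete, a small illustration of the intended transformation (assumed input and output, since `preprocessFigureTags` is not exported and is only called internally by the sanitizers):

```typescript
// Pasted content wrapped in a <figure>, with the URL hidden in data-src:
const pasted = `
  <figure>
    <img data-src="https://example.com/cover.jpg">
    <figcaption>Cover art</figcaption>
  </figure>`;

// After preprocessing + sanitization the figure wrapper is gone and the
// figcaption text survives as alt text, roughly:
//   <img src="https://example.com/cover.jpg" alt="Cover art">
const clean = sanitizeHtmlSync(pasted);
```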
|||||||
246
frontend/src/utils/imageProcessingProgress.ts
Normal file
246
frontend/src/utils/imageProcessingProgress.ts
Normal file
@@ -0,0 +1,246 @@
|
|||||||
|
/**
|
||||||
|
* Utility for tracking image processing progress
|
||||||
|
*
|
||||||
|
* Usage example:
|
||||||
|
*
|
||||||
|
* // After saving a story, start polling for progress
|
||||||
|
* const progressTracker = new ImageProcessingProgressTracker(storyId);
|
||||||
|
*
|
||||||
|
* progressTracker.onProgress((progress) => {
|
||||||
|
* console.log(`Processing ${progress.processedImages}/${progress.totalImages} images`);
|
||||||
|
* console.log(`Current: ${progress.currentImageUrl}`);
|
||||||
|
* console.log(`Status: ${progress.status}`);
|
||||||
|
* });
|
||||||
|
*
|
||||||
|
* progressTracker.onComplete((finalProgress) => {
|
||||||
|
* console.log('Image processing completed!');
|
||||||
|
* });
|
||||||
|
*
|
||||||
|
* progressTracker.onError((error) => {
|
||||||
|
* console.error('Image processing failed:', error);
|
||||||
|
* });
|
||||||
|
*
|
||||||
|
* progressTracker.start();
|
||||||
|
*/
|
||||||
|
|
||||||
|
export interface ImageProcessingProgress {
|
||||||
|
isProcessing: boolean;
|
||||||
|
totalImages: number;
|
||||||
|
processedImages: number;
|
||||||
|
currentImageUrl: string;
|
||||||
|
status: string;
|
||||||
|
progressPercentage: number;
|
||||||
|
completed: boolean;
|
||||||
|
error: string;
|
||||||
|
message?: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
export type ProgressCallback = (progress: ImageProcessingProgress) => void;
|
||||||
|
export type CompleteCallback = (finalProgress: ImageProcessingProgress) => void;
|
||||||
|
export type ErrorCallback = (error: string) => void;
|
||||||
|
|
||||||
|
export class ImageProcessingProgressTracker {
|
||||||
|
private storyId: string;
|
||||||
|
private pollInterval: number;
|
||||||
|
private timeoutMs: number;
|
||||||
|
private isPolling: boolean = false;
|
||||||
|
private pollTimer: NodeJS.Timeout | null = null;
|
||||||
|
private startTime: number = 0;
|
||||||
|
|
||||||
|
private progressCallbacks: ProgressCallback[] = [];
|
||||||
|
private completeCallbacks: CompleteCallback[] = [];
|
||||||
|
private errorCallbacks: ErrorCallback[] = [];
|
||||||
|
|
||||||
|
constructor(
|
||||||
|
storyId: string,
|
||||||
|
pollInterval: number = 1000, // Poll every 1 second
|
||||||
|
timeoutMs: number = 300000 // 5 minute timeout
|
||||||
|
) {
|
||||||
|
this.storyId = storyId;
|
||||||
|
this.pollInterval = pollInterval;
|
||||||
|
this.timeoutMs = timeoutMs;
|
||||||
|
}
|
||||||
|
|
||||||
|
public onProgress(callback: ProgressCallback): void {
|
||||||
|
this.progressCallbacks.push(callback);
|
||||||
|
}
|
||||||
|
|
||||||
|
public onComplete(callback: CompleteCallback): void {
|
||||||
|
this.completeCallbacks.push(callback);
|
||||||
|
}
|
||||||
|
|
||||||
|
public onError(callback: ErrorCallback): void {
|
||||||
|
this.errorCallbacks.push(callback);
|
||||||
|
}
|
||||||
|
|
||||||
|
public async start(): Promise<void> {
|
||||||
|
if (this.isPolling) {
|
||||||
|
console.warn('Progress tracking already started');
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
this.isPolling = true;
|
||||||
|
this.startTime = Date.now();
|
||||||
|
|
||||||
|
console.log(`Starting image processing progress tracking for story ${this.storyId}`);
|
||||||
|
this.poll();
|
||||||
|
}
|
||||||
|
|
||||||
|
public stop(): void {
|
||||||
|
this.isPolling = false;
|
||||||
|
if (this.pollTimer) {
|
||||||
|
clearTimeout(this.pollTimer);
|
||||||
|
this.pollTimer = null;
|
||||||
|
}
|
||||||
|
console.log(`Stopped progress tracking for story ${this.storyId}`);
|
||||||
|
}
|
||||||
|
|
||||||
|
private async poll(): Promise<void> {
|
||||||
|
if (!this.isPolling) {
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check for timeout
|
||||||
|
const elapsed = Date.now() - this.startTime;
|
||||||
|
if (elapsed > this.timeoutMs) {
|
||||||
|
this.handleError('Image processing timed out');
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
try {
|
||||||
|
const response = await fetch(`/api/stories/${this.storyId}/image-processing-progress`);
|
||||||
|
|
||||||
|
if (!response.ok) {
|
||||||
|
throw new Error(`HTTP ${response.status}: ${response.statusText}`);
|
||||||
|
}
|
||||||
|
|
||||||
|
const progress: ImageProcessingProgress = await response.json();
|
||||||
|
|
||||||
|
// Call progress callbacks
|
||||||
|
this.progressCallbacks.forEach(callback => {
|
||||||
|
try {
|
||||||
|
callback(progress);
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error in progress callback:', error);
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Check if processing is complete
|
||||||
|
if (progress.completed) {
|
||||||
|
this.handleComplete(progress);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check for errors
|
||||||
|
if (progress.error) {
|
||||||
|
this.handleError(progress.error);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Continue polling if still processing
|
||||||
|
if (progress.isProcessing) {
|
||||||
|
this.pollTimer = setTimeout(() => this.poll(), this.pollInterval);
|
||||||
|
} else {
|
||||||
|
// No active processing - might have finished or never started
|
||||||
|
this.handleComplete(progress);
|
||||||
|
}
|
||||||
|
|
||||||
|
} catch (error) {
|
||||||
|
this.handleError(`Failed to fetch progress: ${error instanceof Error ? error.message : 'Unknown error'}`);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private handleComplete(finalProgress: ImageProcessingProgress): void {
|
||||||
|
this.stop();
|
||||||
|
console.log(`Image processing completed for story ${this.storyId}`);
|
||||||
|
|
||||||
|
this.completeCallbacks.forEach(callback => {
|
||||||
|
try {
|
||||||
|
callback(finalProgress);
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error in complete callback:', error);
|
||||||
|
}
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
private handleError(error: string): void {
|
||||||
|
this.stop();
|
||||||
|
console.error(`Image processing error for story ${this.storyId}:`, error);
|
||||||
|
|
||||||
|
this.errorCallbacks.forEach(callback => {
|
||||||
|
try {
|
||||||
|
callback(error);
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error in error callback:', error);
|
||||||
|
}
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* React hook for image processing progress
|
||||||
|
*
|
||||||
|
* Note: This hook requires React to be imported in the file where it's used.
|
||||||
|
* To use this hook, import React in your component file:
|
||||||
|
*
|
||||||
|
* import React from 'react';
|
||||||
|
* import { useImageProcessingProgress } from '../utils/imageProcessingProgress';
|
||||||
|
*
|
||||||
|
* Usage:
|
||||||
|
* const { progress, isTracking, startTracking } = useImageProcessingProgress(storyId);
|
||||||
|
*/
|
||||||
|
import React from 'react';
|
||||||
|
|
||||||
|
export function useImageProcessingProgress(storyId: string) {
|
||||||
|
const [progress, setProgress] = React.useState<ImageProcessingProgress | null>(null);
|
||||||
|
const [isTracking, setIsTracking] = React.useState(false);
|
||||||
|
const [tracker, setTracker] = React.useState<ImageProcessingProgressTracker | null>(null);
|
||||||
|
|
||||||
|
const startTracking = React.useCallback(() => {
|
||||||
|
if (tracker) {
|
||||||
|
tracker.stop();
|
||||||
|
}
|
||||||
|
|
||||||
|
const newTracker = new ImageProcessingProgressTracker(storyId);
|
||||||
|
|
||||||
|
newTracker.onProgress((progress) => {
|
||||||
|
setProgress(progress);
|
||||||
|
});
|
||||||
|
|
||||||
|
newTracker.onComplete((finalProgress) => {
|
||||||
|
setProgress(finalProgress);
|
||||||
|
setIsTracking(false);
|
||||||
|
});
|
||||||
|
|
||||||
|
newTracker.onError((error) => {
|
||||||
|
console.error('Image processing error:', error);
|
||||||
|
setIsTracking(false);
|
||||||
|
});
|
||||||
|
|
||||||
|
setTracker(newTracker);
|
||||||
|
setIsTracking(true);
|
||||||
|
newTracker.start();
|
||||||
|
}, [storyId, tracker]);
|
||||||
|
|
||||||
|
const stopTracking = React.useCallback(() => {
|
||||||
|
if (tracker) {
|
||||||
|
tracker.stop();
|
||||||
|
setIsTracking(false);
|
||||||
|
}
|
||||||
|
}, [tracker]);
|
||||||
|
|
||||||
|
React.useEffect(() => {
|
||||||
|
return () => {
|
||||||
|
if (tracker) {
|
||||||
|
tracker.stop();
|
||||||
|
}
|
||||||
|
};
|
||||||
|
}, [tracker]);
|
||||||
|
|
||||||
|
return {
|
||||||
|
progress,
|
||||||
|
isTracking,
|
||||||
|
startTracking,
|
||||||
|
stopTracking
|
||||||
|
};
|
||||||
|
}
|
||||||
File diff suppressed because one or more lines are too long
@@ -13,7 +13,7 @@ http {

     server {
         listen 80;
-        client_max_body_size 256M;
+        client_max_body_size 600M;

         # Frontend routes
         location / {
@@ -55,6 +55,10 @@ http {
             proxy_connect_timeout 900s;
             proxy_send_timeout 900s;
             proxy_read_timeout 900s;
+            # Large upload settings
+            client_max_body_size 600M;
+            proxy_request_buffering off;
+            proxy_max_temp_file_size 0;
         }

         # Static image serving
|
|||||||
94 opensearch.Dockerfile Normal file
@@ -0,0 +1,94 @@
# Custom OpenSearch Dockerfile with Java 21 for compatibility
FROM amazoncorretto:21-alpine AS java-base

# Download and extract OpenSearch
FROM java-base AS opensearch-builder
WORKDIR /tmp
RUN apk add --no-cache curl tar && \
    curl -L https://artifacts.opensearch.org/releases/bundle/opensearch/3.2.0/opensearch-3.2.0-linux-x64.tar.gz | \
    tar -xz && \
    mv opensearch-3.2.0 /usr/share/opensearch

# Final runtime image
FROM java-base
WORKDIR /usr/share/opensearch

# Create opensearch user
RUN addgroup -g 1000 opensearch && \
    adduser -u 1000 -G opensearch -s /bin/sh -D opensearch

# Copy OpenSearch from builder stage
COPY --from=opensearch-builder --chown=opensearch:opensearch /usr/share/opensearch /usr/share/opensearch

# Install necessary packages
RUN apk add --no-cache bash curl

# Debug: Check Java installation and set correct paths
RUN which java && java -version && \
    ls -la /usr/lib/jvm/ && \
    ln -sf /usr/lib/jvm/java-21-amazon-corretto /usr/lib/jvm/default-jvm

# Set environment variables
ENV JAVA_HOME=/usr/lib/jvm/java-21-amazon-corretto
ENV OPENSEARCH_JAVA_HOME=/usr/lib/jvm/java-21-amazon-corretto
ENV PATH=$PATH:$JAVA_HOME/bin

# Create required directories and disable security plugin
RUN mkdir -p /usr/share/opensearch/data && \
    mkdir -p /usr/share/opensearch/logs && \
    echo "plugins.security.disabled: true" >> /usr/share/opensearch/config/opensearch.yml && \
    echo "discovery.type: single-node" >> /usr/share/opensearch/config/opensearch.yml && \
    echo "cluster.name: storycove-opensearch" >> /usr/share/opensearch/config/opensearch.yml && \
    echo "node.name: opensearch-node" >> /usr/share/opensearch/config/opensearch.yml && \
    echo "bootstrap.memory_lock: false" >> /usr/share/opensearch/config/opensearch.yml && \
    echo "network.host: 0.0.0.0" >> /usr/share/opensearch/config/opensearch.yml && \
    echo "logger.level: DEBUG" >> /usr/share/opensearch/config/opensearch.yml && \
    echo "node.processors: 1" >> /usr/share/opensearch/config/opensearch.yml && \
    rm -rf /usr/share/opensearch/plugins/opensearch-performance-analyzer && \
    rm -rf /usr/share/opensearch/agent && \
    echo "# Custom JVM options for Synology NAS compatibility" > /usr/share/opensearch/config/jvm.options.d/synology.options && \
    echo "-Dlucene.useVectorAPI=false" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
    echo "-Dorg.apache.lucene.store.MMapDirectory.enableMemorySegments=false" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
    echo "-Djava.awt.headless=true" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
    echo "-XX:+UseContainerSupport" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
    echo "-Dorg.opensearch.bootstrap.start_timeout=300s" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
    echo "-Dopensearch.logger.level=INFO" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
    echo "--add-opens=jdk.unsupported/sun.misc=ALL-UNNAMED" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
    echo "--add-opens=java.base/java.util=ALL-UNNAMED" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
    echo "--add-opens=java.base/java.lang=ALL-UNNAMED" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
    echo "--add-modules=jdk.unsupported" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
    echo "-XX:+UnlockExperimentalVMOptions" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
    echo "-XX:-UseVectorApi" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
    echo "-Djdk.incubator.vector.VECTOR_ACCESS_OOB_CHECK=0" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
    sed -i '/javaagent/d' /usr/share/opensearch/config/jvm.options && \
    echo '#!/bin/bash' > /usr/share/opensearch/start-opensearch.sh && \
    echo 'echo "Starting OpenSearch with Java 21..."' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'echo "Java version:"' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'java -version' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'echo "Memory info:"' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'free -h 2>/dev/null || echo "Memory info not available"' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'echo "Starting OpenSearch process..."' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'echo "Architecture info:"' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'uname -a' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'echo "CPU info:"' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'grep -E "^(processor|model name|flags)" /proc/cpuinfo | head -10 || echo "CPU info not available"' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'echo "Using JVM options file: /usr/share/opensearch/config/jvm.options.d/synology.options"' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'cat /usr/share/opensearch/config/jvm.options.d/synology.options' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'echo "Environment OPENSEARCH_JAVA_OPTS: $OPENSEARCH_JAVA_OPTS"' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'echo "Attempting to force disable vector operations..."' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'export OPENSEARCH_JAVA_OPTS="$OPENSEARCH_JAVA_OPTS -Dlucene.useVectorAPI=false -Dorg.apache.lucene.store.MMapDirectory.enableMemorySegments=false --limit-modules=java.base,java.logging,java.xml,java.management,java.naming,java.desktop,java.security.jgss,jdk.unsupported"' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'echo "Final OPENSEARCH_JAVA_OPTS: $OPENSEARCH_JAVA_OPTS"' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'echo "Starting OpenSearch binary..."' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'timeout 300s /usr/share/opensearch/bin/opensearch &' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'OPENSEARCH_PID=$!' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'echo "OpenSearch started with PID: $OPENSEARCH_PID"' >> /usr/share/opensearch/start-opensearch.sh && \
    echo 'wait $OPENSEARCH_PID' >> /usr/share/opensearch/start-opensearch.sh && \
    chmod +x /usr/share/opensearch/start-opensearch.sh && \
    chown -R opensearch:opensearch /usr/share/opensearch

USER opensearch

EXPOSE 9200 9300

# Use startup script for better debugging
ENTRYPOINT ["/usr/share/opensearch/start-opensearch.sh"]
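As a quick sanity check (a hedged sketch, not part of the diff): the image exposes port 9200 with the security plugin disabled, so once the container is running the standard cluster endpoints should answer. The localhost:9200 mapping is an assumption about how the container is published.

// Hypothetical smoke test -- assumes the container is reachable on localhost:9200.
async function checkOpenSearch(baseUrl: string = 'http://localhost:9200'): Promise<void> {
  // Root endpoint returns basic node/version info.
  const info = await fetch(baseUrl);
  console.log('Info:', await info.json());

  // _cluster/health reports green/yellow/red for the single-node cluster.
  const health = await fetch(`${baseUrl}/_cluster/health`);
  console.log('Health:', await health.json());
}

checkOpenSearch().catch((err) => console.error('OpenSearch not reachable:', err));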
31 solr.Dockerfile Normal file
@@ -0,0 +1,31 @@
FROM solr:9.9.0

# Switch to root to set up configuration
USER root

# Copy Solr configurations into the image
COPY ./solr/stories /opt/solr-9.9.0/server/solr/configsets/storycove_stories
COPY ./solr/authors /opt/solr-9.9.0/server/solr/configsets/storycove_authors

# Create initialization script using the precreate-core pattern
COPY <<EOF /docker-entrypoint-initdb.d/init-cores.sh
#!/bin/bash
echo "StoryCove: Initializing cores..."

# Use solr's built-in precreate-core functionality
precreate-core storycove_stories /opt/solr-9.9.0/server/solr/configsets/storycove_stories
precreate-core storycove_authors /opt/solr-9.9.0/server/solr/configsets/storycove_authors

echo "StoryCove: Core initialization complete!"
EOF

# Ensure proper permissions and make script executable
RUN chown -R solr:solr /opt/solr-9.9.0/server/solr/configsets/ && \
    chmod +x /docker-entrypoint-initdb.d/init-cores.sh && \
    chown solr:solr /docker-entrypoint-initdb.d/init-cores.sh

# Switch back to solr user
USER solr

# Use the default Solr entrypoint
CMD ["solr-foreground"]
104 solr/authors/conf/managed-schema Executable file
@@ -0,0 +1,104 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!--
  Solr Schema for StoryCove Authors Core
  Based on AuthorSearchDto data model
-->
<schema name="storycove-authors" version="1.6">

  <!-- Field Types -->

  <!-- String field type for exact matching -->
  <fieldType name="string" class="solr.StrField" sortMissingLast="true" />

  <!-- Text field type for full-text search -->
  <fieldType name="text_general" class="solr.TextField" positionIncrementGap="100">
    <analyzer type="index">
      <tokenizer class="solr.StandardTokenizerFactory"/>
      <filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" />
      <filter class="solr.LowerCaseFilterFactory"/>
    </analyzer>
    <analyzer type="query">
      <tokenizer class="solr.StandardTokenizerFactory"/>
      <filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" />
      <filter class="solr.SynonymGraphFilterFactory" synonyms="synonyms.txt" ignoreCase="true" expand="true"/>
      <filter class="solr.LowerCaseFilterFactory"/>
    </analyzer>
  </fieldType>

  <!-- Enhanced text field for names -->
  <fieldType name="text_enhanced" class="solr.TextField" positionIncrementGap="100">
    <analyzer type="index">
      <tokenizer class="solr.StandardTokenizerFactory"/>
      <filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" />
      <filter class="solr.LowerCaseFilterFactory"/>
      <filter class="solr.EdgeNGramFilterFactory" minGramSize="2" maxGramSize="15"/>
    </analyzer>
    <analyzer type="query">
      <tokenizer class="solr.StandardTokenizerFactory"/>
      <filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" />
      <filter class="solr.LowerCaseFilterFactory"/>
    </analyzer>
  </fieldType>

  <!-- Integer field type -->
  <fieldType name="pint" class="solr.IntPointField" docValues="true"/>

  <!-- Long field type -->
  <fieldType name="plong" class="solr.LongPointField" docValues="true"/>

  <!-- Double field type -->
  <fieldType name="pdouble" class="solr.DoublePointField" docValues="true"/>

  <!-- Date field type -->
  <fieldType name="pdate" class="solr.DatePointField" docValues="true"/>

  <!-- Multi-valued string for URLs -->
  <fieldType name="strings" class="solr.StrField" sortMissingLast="true" multiValued="true"/>

  <!-- Random sort field type for random ordering -->
  <fieldType name="random" class="solr.RandomSortField" indexed="true"/>

  <!-- Fields -->

  <!-- Required Fields -->
  <field name="id" type="string" indexed="true" stored="true" required="true" multiValued="false" />
  <field name="_version_" type="plong" indexed="false" stored="false"/>

  <!-- Core Author Fields -->
  <field name="name" type="text_enhanced" indexed="true" stored="true" required="true"/>
  <field name="notes" type="text_general" indexed="true" stored="true"/>
  <field name="authorRating" type="pint" indexed="true" stored="true"/>
  <field name="averageStoryRating" type="pdouble" indexed="true" stored="true"/>
  <field name="storyCount" type="pint" indexed="true" stored="true"/>
  <field name="urls" type="strings" indexed="true" stored="true"/>
  <field name="avatarImagePath" type="string" indexed="false" stored="true"/>

  <!-- Multi-tenant Library Separation -->
  <field name="libraryId" type="string" indexed="true" stored="true" required="false" default="default"/>

  <!-- Timestamp Fields -->
  <field name="createdAt" type="pdate" indexed="true" stored="true"/>
  <field name="updatedAt" type="pdate" indexed="true" stored="true"/>

  <!-- Search-specific Fields -->
  <field name="searchScore" type="plong" indexed="false" stored="true"/>

  <!-- Combined search field for general queries -->
  <field name="text" type="text_general" indexed="true" stored="false" multiValued="true"/>

  <!-- Copy Fields for comprehensive search -->
  <copyField source="name" dest="text"/>
  <copyField source="notes" dest="text"/>
  <copyField source="urls" dest="text"/>

  <!-- Default Search Field -->

  <!-- Dynamic Fields -->

  <!-- Random sort dynamic field for generating random orderings -->
  <dynamicField name="random_*" type="random" indexed="true" stored="false"/>

  <!-- UniqueKey -->
  <uniqueKey>id</uniqueKey>

</schema>
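For illustration only (not part of the changeset): a sketch of indexing one author document against the schema above from TypeScript. The field names follow the managed-schema; the base URL is an assumption (8983 is Solr's default standalone port), while the core name matches the precreate-core storycove_authors call in solr.Dockerfile.

// Hypothetical indexing sketch -- http://localhost:8983 is an assumed host/port.
interface AuthorDoc {
  id: string;
  name: string;
  notes?: string;
  authorRating?: number;
  storyCount?: number;
  urls?: string[];
  libraryId?: string;
}

async function indexAuthor(doc: AuthorDoc): Promise<void> {
  const res = await fetch('http://localhost:8983/solr/storycove_authors/update?commit=true', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    // The JSON update handler accepts an array of documents.
    body: JSON.stringify([doc]),
  });
  if (!res.ok) {
    throw new Error(`Solr update failed: ${res.status}`);
  }
}

indexAuthor({ id: 'author-1', name: 'Example Author', storyCount: 3 })
  .catch((err) => console.error(err));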
134 solr/authors/conf/solrconfig.xml Executable file
@@ -0,0 +1,134 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!--
  Solr Configuration for StoryCove Authors Core
  Optimized for author search with highlighting and faceting
-->
<config>
  <luceneMatchVersion>9.9.0</luceneMatchVersion>

  <!-- DataDir configuration -->
  <dataDir>${solr.data.dir:}</dataDir>

  <!-- Directory Factory -->
  <directoryFactory name="DirectoryFactory"
                    class="${solr.directoryFactory:solr.NRTCachingDirectoryFactory}"/>

  <!-- CodecFactory -->
  <codecFactory class="solr.SchemaCodecFactory"/>

  <!-- Index Configuration -->
  <indexConfig>
    <lockType>${solr.lock.type:native}</lockType>
    <infoStream>true</infoStream>
  </indexConfig>

  <!-- JMX Configuration -->
  <jmx />

  <!-- Update Handler -->
  <updateHandler class="solr.DirectUpdateHandler2">
    <updateLog>
      <str name="dir">${solr.ulog.dir:}</str>
      <int name="numVersionBuckets">${solr.ulog.numVersionBuckets:65536}</int>
    </updateLog>

    <autoCommit>
      <maxTime>15000</maxTime>
      <openSearcher>false</openSearcher>
    </autoCommit>

    <autoSoftCommit>
      <maxTime>1000</maxTime>
    </autoSoftCommit>
  </updateHandler>

  <!-- Query Configuration -->
  <query>
    <maxBooleanClauses>1024</maxBooleanClauses>
    <filterCache class="solr.CaffeineCache"
                 size="512"
                 initialSize="512"
                 autowarmCount="0"/>
    <queryResultCache class="solr.CaffeineCache"
                      size="512"
                      initialSize="512"
                      autowarmCount="0"/>
    <documentCache class="solr.CaffeineCache"
                   size="512"
                   initialSize="512"
                   autowarmCount="0"/>
    <enableLazyFieldLoading>true</enableLazyFieldLoading>
  </query>

  <!-- Request Dispatcher -->
  <requestDispatcher handleSelect="false" >
    <requestParsers enableRemoteStreaming="true"
                    multipartUploadLimitInKB="2048000"
                    formdataUploadLimitInKB="2048"
                    addHttpRequestToContext="false"/>
    <httpCaching never304="true" />
  </requestDispatcher>

  <!-- Request Handlers -->

  <!-- Standard Select Handler -->
  <requestHandler name="/select" class="solr.SearchHandler">
    <lst name="defaults">
      <str name="echoParams">explicit</str>
      <int name="rows">10</int>
      <str name="df">text</str>
      <str name="wt">json</str>
      <str name="indent">true</str>
      <str name="hl">true</str>
      <str name="hl.fl">name,notes</str>
      <str name="hl.simple.pre">&lt;em&gt;</str>
      <str name="hl.simple.post">&lt;/em&gt;</str>
      <str name="hl.fragsize">150</str>
      <str name="hl.maxAnalyzedChars">51200</str>
    </lst>
  </requestHandler>

  <!-- Update Handler -->
  <requestHandler name="/update" class="solr.UpdateRequestHandler" />

  <!-- Admin Handlers -->
  <requestHandler name="/admin/ping" class="solr.PingRequestHandler">
    <lst name="invariants">
      <str name="q">*:*</str>
    </lst>
    <lst name="defaults">
      <str name="echoParams">all</str>
    </lst>
  </requestHandler>

  <!-- Suggester Handler -->
  <requestHandler name="/suggest" class="solr.SearchHandler" startup="lazy">
    <lst name="defaults">
      <str name="suggest">true</str>
      <str name="suggest.count">10</str>
    </lst>
    <arr name="components">
      <str>suggest</str>
    </arr>
  </requestHandler>

  <!-- Search Components -->
  <searchComponent name="suggest" class="solr.SuggestComponent">
    <lst name="suggester">
      <str name="name">authorSuggester</str>
      <str name="lookupImpl">AnalyzingInfixLookupFactory</str>
      <str name="dictionaryImpl">DocumentDictionaryFactory</str>
      <str name="field">name</str>
      <str name="weightField">storyCount</str>
      <str name="suggestAnalyzerFieldType">text_general</str>
      <str name="buildOnStartup">false</str>
      <str name="buildOnCommit">false</str>
    </lst>
  </searchComponent>

  <!-- Response Writers -->
  <queryResponseWriter name="json" class="solr.JSONResponseWriter">
    <str name="content-type">application/json; charset=UTF-8</str>
  </queryResponseWriter>

</config>
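As an aside (not in the diff), the /suggest handler wired to authorSuggester above can be exercised as sketched below. The host/port is again an assumption, and because buildOnStartup and buildOnCommit are disabled, the suggester dictionary has to be built at least once via suggest.build.

// Hypothetical autocomplete call against the authors core; localhost:8983 is assumed.
async function suggestAuthors(prefix: string): Promise<string[]> {
  const params = new URLSearchParams({
    'suggest.q': prefix,
    'suggest.dictionary': 'authorSuggester',
    // Rebuilds the suggester index on each call in this sketch; in practice build once.
    'suggest.build': 'true',
  });
  const res = await fetch(`http://localhost:8983/solr/storycove_authors/suggest?${params}`);
  const body = await res.json();
  // Standard Solr suggester response shape: suggest.<dictionary>.<query>.suggestions[].term
  const suggestions = body.suggest?.authorSuggester?.[prefix]?.suggestions ?? [];
  return suggestions.map((s: { term: string }) => s.term);
}

suggestAuthors('jan').then((terms) => console.log(terms));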
34 solr/authors/conf/stopwords.txt Executable file
@@ -0,0 +1,34 @@
# English stopwords for author search
a
an
and
are
as
at
be
but
by
for
if
in
into
is
it
no
not
of
on
or
such
that
the
their
then
there
these
they
this
to
was
will
with
9 solr/authors/conf/synonyms.txt Executable file
@@ -0,0 +1,9 @@
# Synonyms for author search
# Format: word1,word2,word3 => synonym1,synonym2
writer,author,novelist
pen name,pseudonym,alias
prolific,productive
acclaimed,famous,renowned
bestselling,popular
contemporary,modern
classic,traditional
143 solr/stories/conf/managed-schema Executable file
@@ -0,0 +1,143 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!--
  Solr Schema for StoryCove Stories Core
  Based on StorySearchDto data model
-->
<schema name="storycove-stories" version="1.6">

  <!-- Field Types -->

  <!-- String field type for exact matching -->
  <fieldType name="string" class="solr.StrField" sortMissingLast="true" />

  <!-- Text field type for full-text search -->
  <fieldType name="text_general" class="solr.TextField" positionIncrementGap="100">
    <analyzer type="index">
      <tokenizer class="solr.StandardTokenizerFactory"/>
      <filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" />
      <filter class="solr.LowerCaseFilterFactory"/>
    </analyzer>
    <analyzer type="query">
      <tokenizer class="solr.StandardTokenizerFactory"/>
      <filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" />
      <filter class="solr.SynonymGraphFilterFactory" synonyms="synonyms.txt" ignoreCase="true" expand="true"/>
      <filter class="solr.LowerCaseFilterFactory"/>
    </analyzer>
  </fieldType>

  <!-- Enhanced text field for titles and important content -->
  <fieldType name="text_enhanced" class="solr.TextField" positionIncrementGap="100">
    <analyzer type="index">
      <tokenizer class="solr.StandardTokenizerFactory"/>
      <filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" />
      <filter class="solr.LowerCaseFilterFactory"/>
      <filter class="solr.EdgeNGramFilterFactory" minGramSize="2" maxGramSize="15"/>
    </analyzer>
    <analyzer type="query">
      <tokenizer class="solr.StandardTokenizerFactory"/>
      <filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" />
      <filter class="solr.LowerCaseFilterFactory"/>
    </analyzer>
  </fieldType>

  <!-- Integer field type -->
  <fieldType name="pint" class="solr.IntPointField" docValues="true"/>

  <!-- Long field type -->
  <fieldType name="plong" class="solr.LongPointField" docValues="true"/>

  <!-- Double field type -->
  <fieldType name="pdouble" class="solr.DoublePointField" docValues="true"/>

  <!-- Boolean field type -->
  <fieldType name="boolean" class="solr.BoolField" sortMissingLast="true"/>

  <!-- Date field type -->
  <fieldType name="pdate" class="solr.DatePointField" docValues="true"/>

  <!-- Multi-valued string for tags and faceting -->
  <fieldType name="strings" class="solr.StrField" sortMissingLast="true" multiValued="true" docValues="true"/>

  <!-- Single string for exact matching and faceting -->
  <fieldType name="string_facet" class="solr.StrField" sortMissingLast="true" docValues="true"/>

  <!-- Random sort field type for random ordering -->
  <fieldType name="random" class="solr.RandomSortField" indexed="true"/>

  <!-- Fields -->

  <!-- Required Fields -->
  <field name="id" type="string" indexed="true" stored="true" required="true" multiValued="false" />
  <field name="_version_" type="plong" indexed="false" stored="false"/>

  <!-- Core Story Fields -->
  <field name="title" type="text_enhanced" indexed="true" stored="true" required="true"/>
  <field name="description" type="text_general" indexed="true" stored="true"/>
  <field name="sourceUrl" type="string" indexed="true" stored="true"/>
  <field name="coverPath" type="string" indexed="false" stored="true"/>
  <field name="wordCount" type="pint" indexed="true" stored="true"/>
  <field name="rating" type="pint" indexed="true" stored="true"/>
  <field name="averageStoryRating" type="pdouble" indexed="true" stored="true"/>
  <field name="volume" type="pint" indexed="true" stored="true"/>

  <!-- Multi-tenant Library Separation -->
  <field name="libraryId" type="string" indexed="true" stored="true" required="false" default="default"/>

  <!-- Reading Status Fields -->
  <field name="isRead" type="boolean" indexed="true" stored="true"/>
  <field name="readingPosition" type="pint" indexed="true" stored="true"/>
  <field name="lastReadAt" type="pdate" indexed="true" stored="true"/>
  <field name="lastRead" type="pdate" indexed="true" stored="true"/>

  <!-- Author Fields -->
  <field name="authorId" type="string" indexed="true" stored="true"/>
  <field name="authorName" type="text_enhanced" indexed="true" stored="true"/>
  <field name="authorName_facet" type="string_facet" indexed="true" stored="false"/>

  <!-- Series Fields -->
  <field name="seriesId" type="string" indexed="true" stored="true"/>
  <field name="seriesName" type="text_enhanced" indexed="true" stored="true"/>
  <field name="seriesName_facet" type="string_facet" indexed="true" stored="false"/>

  <!-- Tag Fields -->
  <field name="tagNames" type="strings" indexed="true" stored="true"/>
  <field name="tagNames_facet" type="strings" indexed="true" stored="false"/>

  <!-- Timestamp Fields -->
  <field name="createdAt" type="pdate" indexed="true" stored="true"/>
  <field name="updatedAt" type="pdate" indexed="true" stored="true"/>
  <field name="dateAdded" type="pdate" indexed="true" stored="true"/>

  <!-- Search-specific Fields -->
  <field name="searchScore" type="pdouble" indexed="false" stored="true"/>
  <field name="highlights" type="strings" indexed="false" stored="true"/>

  <!-- Combined search field for general queries -->
  <field name="text" type="text_general" indexed="true" stored="false" multiValued="true"/>

  <!-- Copy Fields for comprehensive search -->
  <copyField source="title" dest="text"/>
  <copyField source="description" dest="text"/>
  <copyField source="authorName" dest="text"/>
  <copyField source="seriesName" dest="text"/>
  <copyField source="tagNames" dest="text"/>

  <!-- Copy Fields for faceting -->
  <copyField source="authorName" dest="authorName_facet"/>
  <copyField source="seriesName" dest="seriesName_facet"/>
  <copyField source="tagNames" dest="tagNames_facet"/>

  <!-- Copy field for lastRead sorting compatibility -->
  <copyField source="lastReadAt" dest="lastRead"/>

  <!-- Default Search Field -->

  <!-- Dynamic Fields -->

  <!-- Random sort dynamic field for generating random orderings -->
  <dynamicField name="random_*" type="random" indexed="true" stored="false"/>

  <!-- UniqueKey -->
  <uniqueKey>id</uniqueKey>

</schema>
153 solr/stories/conf/solrconfig.xml Executable file
@@ -0,0 +1,153 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!--
  Solr Configuration for StoryCove Stories Core
  Optimized for story search with highlighting and faceting
-->
<config>
  <luceneMatchVersion>9.9.0</luceneMatchVersion>

  <!-- DataDir configuration -->
  <dataDir>${solr.data.dir:}</dataDir>

  <!-- Directory Factory -->
  <directoryFactory name="DirectoryFactory"
                    class="${solr.directoryFactory:solr.NRTCachingDirectoryFactory}"/>

  <!-- CodecFactory -->
  <codecFactory class="solr.SchemaCodecFactory"/>

  <!-- Index Configuration -->
  <indexConfig>
    <lockType>${solr.lock.type:native}</lockType>
    <infoStream>true</infoStream>
  </indexConfig>

  <!-- JMX Configuration -->
  <jmx />

  <!-- Update Handler -->
  <updateHandler class="solr.DirectUpdateHandler2">
    <updateLog>
      <str name="dir">${solr.ulog.dir:}</str>
      <int name="numVersionBuckets">${solr.ulog.numVersionBuckets:65536}</int>
    </updateLog>

    <autoCommit>
      <maxTime>15000</maxTime>
      <openSearcher>false</openSearcher>
    </autoCommit>

    <autoSoftCommit>
      <maxTime>1000</maxTime>
    </autoSoftCommit>
  </updateHandler>

  <!-- Query Configuration -->
  <query>
    <maxBooleanClauses>1024</maxBooleanClauses>
    <filterCache class="solr.CaffeineCache"
                 size="512"
                 initialSize="512"
                 autowarmCount="0"/>
    <queryResultCache class="solr.CaffeineCache"
                      size="512"
                      initialSize="512"
                      autowarmCount="0"/>
    <documentCache class="solr.CaffeineCache"
                   size="512"
                   initialSize="512"
                   autowarmCount="0"/>
    <enableLazyFieldLoading>true</enableLazyFieldLoading>
  </query>

  <!-- Request Dispatcher -->
  <requestDispatcher handleSelect="false" >
    <requestParsers enableRemoteStreaming="true"
                    multipartUploadLimitInKB="2048000"
                    formdataUploadLimitInKB="2048"
                    addHttpRequestToContext="false"/>
    <httpCaching never304="true" />
  </requestDispatcher>

  <!-- Request Handlers -->

  <!-- Standard Select Handler -->
  <requestHandler name="/select" class="solr.SearchHandler">
    <lst name="defaults">
      <str name="echoParams">explicit</str>
      <int name="rows">10</int>
      <str name="df">text</str>
      <str name="wt">json</str>
      <str name="indent">true</str>
      <str name="hl">true</str>
      <str name="hl.fl">title,description</str>
      <str name="hl.simple.pre">&lt;em&gt;</str>
      <str name="hl.simple.post">&lt;/em&gt;</str>
      <str name="hl.fragsize">150</str>
      <str name="hl.maxAnalyzedChars">51200</str>
      <str name="facet">true</str>
      <str name="facet.field">authorName</str>
      <str name="facet.field">tagNames</str>
      <str name="facet.field">seriesName</str>
      <str name="facet.field">rating</str>
      <str name="facet.field">isRead</str>
      <str name="facet.mincount">1</str>
      <str name="facet.sort">count</str>
    </lst>
  </requestHandler>

  <!-- Update Handler -->
  <requestHandler name="/update" class="solr.UpdateRequestHandler" />

  <!-- Admin Handlers -->
  <requestHandler name="/admin/ping" class="solr.PingRequestHandler">
    <lst name="invariants">
      <str name="q">*:*</str>
    </lst>
    <lst name="defaults">
      <str name="echoParams">all</str>
    </lst>
  </requestHandler>

  <!-- More Like This Handler -->
  <requestHandler name="/mlt" class="solr.MoreLikeThisHandler">
    <lst name="defaults">
      <str name="mlt.fl">title,description</str>
      <int name="mlt.mindf">2</int>
      <int name="mlt.mintf">2</int>
      <str name="mlt.qf">title^2.0 description^1.0</str>
      <int name="rows">5</int>
    </lst>
  </requestHandler>

  <!-- Suggester Handler -->
  <requestHandler name="/suggest" class="solr.SearchHandler" startup="lazy">
    <lst name="defaults">
      <str name="suggest">true</str>
      <str name="suggest.count">10</str>
    </lst>
    <arr name="components">
      <str>suggest</str>
    </arr>
  </requestHandler>

  <!-- Search Components -->
  <searchComponent name="suggest" class="solr.SuggestComponent">
    <lst name="suggester">
      <str name="name">storySuggester</str>
      <str name="lookupImpl">AnalyzingInfixLookupFactory</str>
      <str name="dictionaryImpl">DocumentDictionaryFactory</str>
      <str name="field">title</str>
      <str name="weightField">rating</str>
      <str name="suggestAnalyzerFieldType">text_general</str>
      <str name="buildOnStartup">false</str>
      <str name="buildOnCommit">false</str>
    </lst>
  </searchComponent>

  <!-- Response Writers -->
  <queryResponseWriter name="json" class="solr.JSONResponseWriter">
    <str name="content-type">application/json; charset=UTF-8</str>
  </queryResponseWriter>

</config>
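Purely illustrative (not in the diff): a faceted story query against the /select defaults above, filtered by the multi-tenant libraryId field from the stories schema. The host/port is an assumption; the core name matches precreate-core storycove_stories in solr.Dockerfile.

// Hypothetical faceted search against the stories core; localhost:8983 is assumed.
async function searchStories(term: string, libraryId: string = 'default') {
  const params = new URLSearchParams({
    q: term,                      // searched against the default field "text"
    fq: `libraryId:${libraryId}`, // multi-tenant filter from the schema
    rows: '10',
  });
  const res = await fetch(`http://localhost:8983/solr/storycove_stories/select?${params}`);
  const body = await res.json();

  // Facet counts for authorName, tagNames, seriesName, rating and isRead come back
  // automatically because faceting is enabled in the handler defaults.
  console.log('Hits:', body.response?.numFound);
  console.log('Tag facets:', body.facet_counts?.facet_fields?.tagNames);
  return body.response?.docs ?? [];
}

searchStories('dragons').then((docs) => console.log(docs.length, 'stories'));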
34 solr/stories/conf/stopwords.txt Executable file
@@ -0,0 +1,34 @@
# English stopwords for story search
a
an
and
are
as
at
be
but
by
for
if
in
into
is
it
no
not
of
on
or
such
that
the
their
then
there
these
they
this
to
was
will
with
16 solr/stories/conf/synonyms.txt Executable file
@@ -0,0 +1,16 @@
# Synonyms for story search
# Format: word1,word2,word3 => synonym1,synonym2
fantasy,magical,magic
sci-fi,science fiction,scifi
romance,romantic,love
mystery,detective,crime
adventure,action
horror,scary,frightening
drama,dramatic
comedy,funny,humor
thriller,suspense
historical,history
contemporary,modern
short,brief
novel,book
story,tale,narrative