54 Commits

| SHA | Message | Author | Date |
|-----|---------|--------|------|
| 4e02cd8eaa | fix image | Stefan Hardegger | 2025-09-30 17:03:49 +02:00 |
| 48b0087b01 | fix embedded images on deviantart | Stefan Hardegger | 2025-09-30 16:18:05 +02:00 |
| c291559366 | Fix Image Processing | Stefan Hardegger | 2025-09-28 20:06:52 +02:00 |
| 622cf9ac76 | fix image processing | Stefan Hardegger | 2025-09-27 09:29:40 +02:00 |
| df5e124115 | fix image processing | Stefan Hardegger | 2025-09-27 09:15:01 +02:00 |
| 2b4cb1456f | fix orphaned file discovery | Stefan Hardegger | 2025-09-27 08:46:17 +02:00 |
| c2e5445196 | fix | Stefan Hardegger | 2025-09-27 08:32:11 +02:00 |
| 360b69effc | fix cleanup | Stefan Hardegger | 2025-09-27 08:15:09 +02:00 |
| 3bc8bb9e0c | backup / restore improvement | Stefan Hardegger | 2025-09-26 22:34:21 +02:00 |
| 7ca4823573 | backup / restore improvement | Stefan Hardegger | 2025-09-26 22:26:26 +02:00 |
| 5325169495 | maintenance improvements | Stefan Hardegger | 2025-09-26 21:41:33 +02:00 |
| 74cdd5dc57 | solr random fix | Stefan Hardegger | 2025-09-26 15:05:27 +02:00 |
| 574f20bfd7 | dependency | Stefan Hardegger | 2025-09-26 08:28:32 +02:00 |
| c8249c94d6 | new editor | Stefan Hardegger | 2025-09-26 08:22:54 +02:00 |
| 51a1a69b45 | solr migration button | Stefan Hardegger | 2025-09-23 14:57:16 +02:00 |
| 6ee2d67027 | solr migration button | Stefan Hardegger | 2025-09-23 14:42:38 +02:00 |
| 9472210d8b | solr migration button | Stefan Hardegger | 2025-09-23 14:18:56 +02:00 |
| 62f017c4ca | solr fix | Stefan Hardegger | 2025-09-23 13:58:49 +02:00 |
| 857871273d | fix pre formatting | Stefan Hardegger | 2025-09-22 15:43:25 +02:00 |
| a9521a9da1 | fix saving stories. | Stefan Hardegger | 2025-09-22 13:52:48 +02:00 |
| 1f41974208 | ff | Stefan Hardegger | 2025-09-22 12:43:05 +02:00 |
| b68fde71c0 | ff | Stefan Hardegger | 2025-09-22 12:28:31 +02:00 |
| f61be90d5c | ff | Stefan Hardegger | 2025-09-22 10:13:49 +02:00 |
| 87f37567fb | replacing opensearch with solr | Stefan Hardegger | 2025-09-22 09:44:50 +02:00 |
| 9e684a956b | ff | Stefan Hardegger | 2025-09-21 19:25:11 +02:00 |
| 379ef0d209 | ff | Stefan Hardegger | 2025-09-21 19:21:26 +02:00 |
| b1ff684df6 | asd | Stefan Hardegger | 2025-09-21 19:18:03 +02:00 |
| 0032590030 | fix? | Stefan Hardegger | 2025-09-21 19:13:39 +02:00 |
| db38d68399 | fix? | Stefan Hardegger | 2025-09-21 19:10:06 +02:00 |
| 48a0865199 | fa | Stefan Hardegger | 2025-09-21 18:04:36 +02:00 |
| 7daed22d2d | another try | Stefan Hardegger | 2025-09-21 17:53:52 +02:00 |
| 6c02b8831f | asd | Stefan Hardegger | 2025-09-21 17:47:03 +02:00 |
| 042f80dd2a | another try | Stefan Hardegger | 2025-09-21 17:38:57 +02:00 |
| a472c11ac8 | fix | Stefan Hardegger | 2025-09-21 17:30:15 +02:00 |
| a037dd92af | fix | Stefan Hardegger | 2025-09-21 17:21:49 +02:00 |
| 634de0b6a5 | fix | Stefan Hardegger | 2025-09-21 16:43:47 +02:00 |
| b4635b56a3 | fix | Stefan Hardegger | 2025-09-21 16:39:41 +02:00 |
| bfb68e81a8 | fix | Stefan Hardegger | 2025-09-21 16:34:28 +02:00 |
| 1247a3420e | fix | Stefan Hardegger | 2025-09-21 16:23:44 +02:00 |
| 6caee8a007 | config | Stefan Hardegger | 2025-09-21 16:21:53 +02:00 |
| cf93d3b3a6 | opensearch config | Stefan Hardegger | 2025-09-21 16:14:20 +02:00 |
| 53cb296adc | opensearch config | Stefan Hardegger | 2025-09-21 16:10:07 +02:00 |
| f71b70d03b | opensearch config | Stefan Hardegger | 2025-09-21 16:07:48 +02:00 |
| 0bdc3f4731 | adjustment | Stefan Hardegger | 2025-09-21 15:59:15 +02:00 |
| 345065c03b | missing dependencies | Stefan Hardegger | 2025-09-21 15:53:03 +02:00 |
| c50dc618bf | build adjustment | Stefan Hardegger | 2025-09-21 15:47:14 +02:00 |
| 96e6ced8da | adjustment | Stefan Hardegger | 2025-09-21 15:37:48 +02:00 |
| 4738ae3a75 | opefully build fix | Stefan Hardegger | 2025-09-21 15:30:27 +02:00 |
| 591ca5a149 | disable opensearch security | Stefan Hardegger | 2025-09-21 15:08:20 +02:00 |
| 41ff3a9961 | correction | Stefan Hardegger | 2025-09-21 14:55:43 +02:00 |
| 0101c0ca2c | bugfixes, and logging cleanup | Stefan Hardegger | 2025-09-21 14:55:43 +02:00 |
| 58bb7f8229 | revert a5628019f8 (revert revert b1dbd85346: revert richtext replacement) | | 2025-09-21 14:54:39 +02:00 |
| a5628019f8 | revert b1dbd85346 (revert richtext replacement) | | 2025-09-21 10:13:48 +02:00 |
| b1dbd85346 | richtext replacement | Stefan Hardegger | 2025-09-21 10:10:04 +02:00 |
90 changed files with 8114 additions and 8233 deletions

```diff
@@ -19,7 +19,7 @@ JWT_SECRET=REPLACE_WITH_SECURE_JWT_SECRET_MINIMUM_32_CHARS
 APP_PASSWORD=REPLACE_WITH_SECURE_APP_PASSWORD
 # OpenSearch Configuration
-OPENSEARCH_PASSWORD=REPLACE_WITH_SECURE_OPENSEARCH_PASSWORD
+#OPENSEARCH_PASSWORD=REPLACE_WITH_SECURE_OPENSEARCH_PASSWORD
 SEARCH_ENGINE=opensearch
 # Image Storage
```

ASYNC_IMAGE_PROCESSING.md (new file, 220 lines):

@@ -0,0 +1,220 @@
# Async Image Processing Implementation
## Overview
The image processing system now handles external images asynchronously, preventing timeouts when processing stories with many images, and gives users real-time progress updates showing which image is currently being processed.
## Backend Components
### 1. `ImageProcessingProgressService`
- Tracks progress for individual story image processing sessions
- Thread-safe with `ConcurrentHashMap` for multi-user support
- Provides progress information: total images, processed count, current image, status, errors
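As a rough illustration of this design, a progress service can be little more than a `ConcurrentHashMap` keyed by story id. The sketch below is plain Java with hypothetical names, not the project's actual code:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal sketch of a thread-safe progress tracker. One Progress record per
// story; volatile fields so pollers on other threads see updates promptly.
class ImageProcessingProgressService {
    static class Progress {
        volatile int totalImages;
        volatile int processedImages;
        volatile String currentImageUrl = "";
        volatile String status = "";
        volatile boolean completed;
    }

    private final Map<String, Progress> sessions = new ConcurrentHashMap<>();

    void start(String storyId, int totalImages) {
        Progress p = new Progress();
        p.totalImages = totalImages;
        p.status = "Starting";
        sessions.put(storyId, p);
    }

    void markProcessed(String storyId, String nextImageUrl) {
        Progress p = sessions.get(storyId);
        if (p == null) return;
        p.processedImages++;
        p.currentImageUrl = nextImageUrl;
        if (p.processedImages >= p.totalImages) {
            p.completed = true;
            p.status = "Completed: " + p.totalImages + " images processed";
        } else {
            p.status = "Processing image " + (p.processedImages + 1) + " of " + p.totalImages;
        }
    }

    Progress get(String storyId) { return sessions.get(storyId); }

    void clear(String storyId) { sessions.remove(storyId); } // cleanup after completion
}
```

The real service additionally carries error details, but the map-per-session shape is the core of the multi-user support described above.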
### 2. `AsyncImageProcessingService`
- Handles asynchronous image processing using Spring's `@Async` annotation
- Counts external images before processing
- Provides progress callbacks during processing
- Updates story content when processing completes
- Automatic cleanup of progress data after completion
### 3. Enhanced `ImageService`
- Added `processContentImagesWithProgress()` method with callback support
- Progress callbacks provide real-time updates during image download/processing
- Maintains compatibility with existing synchronous processing
### 4. Updated `StoryController`
- `POST /api/stories` and `PUT /api/stories/{id}` now trigger async image processing
- `GET /api/stories/{id}/image-processing-progress` endpoint for progress polling
- Processing starts immediately after story save and returns control to user
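The hand-off itself is simply "persist synchronously, then submit the image work to an executor and return". A minimal sketch using a plain `ExecutorService` in place of Spring's `@Async` (all names here are hypothetical):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

// Sketch of the save-then-process hand-off: the HTTP thread returns as soon
// as the story is saved, while image processing continues on a pool thread.
class AsyncSaveSketch {
    private final ExecutorService pool = Executors.newFixedThreadPool(2);
    private Future<?> lastTask;

    String saveStory(String storyId, Runnable imageProcessing) {
        // 1. persist the story synchronously (omitted here)
        // 2. queue the slow image work and return control immediately
        lastTask = pool.submit(imageProcessing);
        return storyId;
    }

    // helper: block until the queued work finishes (used in tests/shutdown)
    boolean waitForProcessing() {
        try {
            lastTask.get(5, TimeUnit.SECONDS);
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    void shutdown() { pool.shutdown(); }
}
```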
## Frontend Components
### 1. `ImageProcessingProgressTracker` (Utility Class)
```typescript
const tracker = new ImageProcessingProgressTracker(storyId);

tracker.onProgress((progress) => {
  console.log(`Processing ${progress.processedImages}/${progress.totalImages}`);
});
tracker.onComplete(() => console.log('Done!'));
tracker.start();
```
### 2. `ImageProcessingProgressComponent` (React Component)
```tsx
<ImageProcessingProgressComponent
  storyId={storyId}
  autoStart={true}
  onComplete={() => refreshStory()}
/>
```
## User Experience
### Before (Synchronous)
1. User saves story with external images
2. Request hangs for 30+ seconds processing images
3. Browser may timeout
4. No feedback about progress
5. User doesn't know if it's working
### After (Asynchronous)
1. User saves story with external images
2. Save completes immediately
3. Progress indicator appears: "Processing 5 images. Currently image 2 of 5..."
4. User can continue using the application
5. Progress updates every second
6. Story automatically refreshes when processing completes
## API Endpoints
### Progress Endpoint
```
GET /api/stories/{id}/image-processing-progress
```
**Response when processing:**
```json
{
  "isProcessing": true,
  "totalImages": 5,
  "processedImages": 2,
  "currentImageUrl": "https://example.com/image.jpg",
  "status": "Processing image 3 of 5",
  "progressPercentage": 40.0,
  "completed": false,
  "error": ""
}
```
**Response when completed:**
```json
{
  "isProcessing": false,
  "totalImages": 5,
  "processedImages": 5,
  "currentImageUrl": "",
  "status": "Completed: 5 images processed",
  "progressPercentage": 100.0,
  "completed": true,
  "error": ""
}
```
**Response when no processing:**
```json
{
  "isProcessing": false,
  "message": "No active image processing"
}
```
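All three payload shapes can be derived from a single progress record. A hypothetical mapper (plain Java; the field and method names are illustrative, not the project's actual API):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Maps an in-memory progress record to the polling response shapes above.
// A null total means no tracked session exists for this story.
class ProgressResponseMapper {
    static Map<String, Object> toResponse(Integer total, Integer processed,
                                          String currentUrl, String status,
                                          String error) {
        if (total == null) {
            Map<String, Object> none = new LinkedHashMap<>();
            none.put("isProcessing", false);
            none.put("message", "No active image processing");
            return none;
        }
        boolean completed = processed >= total;
        Map<String, Object> body = new LinkedHashMap<>();
        body.put("isProcessing", !completed);
        body.put("totalImages", total);
        body.put("processedImages", processed);
        body.put("currentImageUrl", completed ? "" : currentUrl);
        body.put("status", status);
        body.put("progressPercentage", total == 0 ? 100.0 : processed * 100.0 / total);
        body.put("completed", completed);
        body.put("error", error == null ? "" : error);
        return body;
    }
}
```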
## Integration Examples
### React Hook Usage
```tsx
import { useImageProcessingProgress } from '../utils/imageProcessingProgress';

function StoryEditor({ storyId }) {
  const { progress, isTracking, startTracking } = useImageProcessingProgress(storyId);

  const handleSave = async () => {
    await saveStory();
    startTracking(); // Start monitoring progress
  };

  return (
    <div>
      {isTracking && progress && (
        <div className="progress-indicator">
          Processing {progress.processedImages}/{progress.totalImages} images...
        </div>
      )}
      <button onClick={handleSave}>Save Story</button>
    </div>
  );
}
```
### Manual Progress Tracking
```typescript
// After saving a story with external images
const tracker = new ImageProcessingProgressTracker(storyId);

tracker.onProgress((progress) => {
  updateProgressBar(progress.progressPercentage);
  showStatus(progress.status);
  if (progress.currentImageUrl) {
    showCurrentImage(progress.currentImageUrl);
  }
});

tracker.onComplete((finalProgress) => {
  hideProgressBar();
  showNotification('Image processing completed!');
  refreshStoryContent(); // Reload story with processed images
});

tracker.onError((error) => {
  hideProgressBar();
  showError(`Image processing failed: ${error}`);
});

tracker.start();
```
## Configuration
### Polling Interval
Default: 1 second (1000ms)
```typescript
const tracker = new ImageProcessingProgressTracker(storyId, 500); // Poll every 500ms
```
### Timeout
Default: 5 minutes (300000ms)
```typescript
const tracker = new ImageProcessingProgressTracker(storyId, 1000, 600000); // 10 minute timeout
```
### Spring Async Configuration
The backend uses Spring's default async executor. For production, consider configuring a custom thread pool in your application properties:
```yaml
spring:
  task:
    execution:
      pool:
        core-size: 4
        max-size: 8
        queue-capacity: 100
```
## Error Handling
### Backend Errors
- Network timeouts downloading images
- Invalid image formats
- Disk space issues
- All errors are logged and returned in progress status
### Frontend Errors
- Network failures during progress polling
- Timeout if processing takes too long
- Graceful degradation - user can continue working
## Benefits
1. **No More Timeouts**: Large image processing jobs no longer cause HTTP request timeouts
2. **Better UX**: Users get real-time feedback about processing progress
3. **Improved Performance**: Users can continue using the app while images process
4. **Error Visibility**: Clear error messages when image processing fails
5. **Scalability**: Multiple users can process images simultaneously without blocking
## Future Enhancements
1. **WebSocket Support**: Replace polling with WebSocket for real-time push updates
2. **Batch Processing**: Queue multiple stories for batch image processing
3. **Retry Logic**: Automatic retry for failed image downloads
4. **Progress Persistence**: Save progress to database for recovery after server restart
5. **Image Optimization**: Automatic resize/compress images during processing

The following specification file was deleted in its entirety (889 lines, `@@ -1,889 +0,0 @@`):
# StoryCove Search Migration Specification: Typesense to OpenSearch
## Executive Summary
This document specifies the migration from Typesense to OpenSearch for the StoryCove application. The migration will be implemented using a parallel approach, maintaining Typesense functionality while gradually transitioning to OpenSearch, ensuring zero downtime and the ability to roll back if needed.
**Migration Goals:**
- Solve random query reliability issues
- Improve complex filtering performance
- Maintain feature parity during transition
- Zero downtime migration
- Improved developer experience
---
## Current State Analysis
### Typesense Implementation Overview
**Service Architecture:**
- `TypesenseService.java` (~2000 lines) - Primary search service
- 3 search indexes: Stories, Authors, Collections
- Multi-library support with dynamic collection names
- Integration with Spring Boot backend
**Core Functionality:**
1. **Full-text Search**: Stories, Authors with complex query building
2. **Random Story Selection**: `_rand()` function with fallback logic
3. **Advanced Filtering**: 15+ filter conditions with boolean logic
4. **Faceting**: Tag aggregations and counts
5. **Autocomplete**: Search suggestions with typeahead
6. **CRUD Operations**: Index/update/delete for all entity types
**Current Issues Identified:**
- `_rand()` function unreliability requiring complex fallback logic
- Complex filter query building with escaping issues
- Limited aggregation capabilities
- Inconsistent API behavior across query patterns
- Multi-collection management complexity
### Data Models and Schema
**Story Index Fields:**
```java
// Core fields
UUID id, String title, String description, String sourceUrl
Integer wordCount, Integer rating, Integer volume
Boolean isRead, LocalDateTime lastReadAt, Integer readingPosition
// Relationships
UUID authorId, String authorName
UUID seriesId, String seriesName
List<String> tagNames
// Metadata
LocalDateTime createdAt, LocalDateTime updatedAt
String coverPath, String sourceDomain
```
**Author Index Fields:**
```java
UUID id, String name, String notes
Integer authorRating, Double averageStoryRating, Integer storyCount
List<String> urls, String avatarImagePath
LocalDateTime createdAt, LocalDateTime updatedAt
```
**Collection Index Fields:**
```java
UUID id, String name, String description
List<String> tagNames, Boolean archived
LocalDateTime createdAt, LocalDateTime updatedAt
Integer storyCount, Integer currentPosition
```
### API Endpoints Current State
**Search Endpoints Analysis:**
**✅ USED by Frontend (Must Implement):**
- `GET /api/stories/search` - Main story search with complex filtering (CRITICAL)
- `GET /api/stories/random` - Random story selection with filters (CRITICAL)
- `GET /api/authors/search-typesense` - Author search (HIGH)
- `GET /api/tags/autocomplete` - Tag suggestions (MEDIUM)
- `POST /api/stories/reindex-typesense` - Admin reindex operations (MEDIUM)
- `POST /api/authors/reindex-typesense` - Admin reindex operations (MEDIUM)
- `POST /api/stories/recreate-typesense-collection` - Admin recreate (MEDIUM)
- `POST /api/authors/recreate-typesense-collection` - Admin recreate (MEDIUM)
**❌ UNUSED by Frontend (Skip Implementation):**
- `GET /api/stories/search/suggestions` - Not used by frontend
- `GET /api/authors/search` - Superseded by typesense version
- `GET /api/series/search` - Not used by frontend
- `GET /api/tags/search` - Superseded by autocomplete
- `POST /api/search/reindex` - Not used by frontend
- `GET /api/search/health` - Not used by frontend
**Scope Reduction: ~40% fewer endpoints to implement**
**Search Parameters (Stories):**
```
query, page, size, authors[], tags[], minRating, maxRating
sortBy, sortDir, facetBy[]
minWordCount, maxWordCount, createdAfter, createdBefore
lastReadAfter, lastReadBefore, unratedOnly, readingStatus
hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter
minTagCount, popularOnly, hiddenGemsOnly
```
---
## Target OpenSearch Architecture
### Service Layer Design
**New Components:**
```
OpenSearchService.java - Primary search service (mirrors TypesenseService API)
OpenSearchConfig.java - Configuration and client setup
SearchMigrationService.java - Handles parallel operation during migration
SearchServiceAdapter.java - Abstraction layer for service switching
```
**Index Strategy:**
- **Single-node deployment** for development/small installations
- **Index-per-library** approach: `stories-{libraryId}`, `authors-{libraryId}`, `collections-{libraryId}`
- **Index templates** for consistent mapping across libraries
- **Aliases** for easy switching and zero-downtime updates
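Aliases use the standard OpenSearch `_aliases` API, which lets a per-library alias be repointed at a rebuilt index in one atomic call. The index names below are hypothetical:

```
POST /_aliases
{
  "actions": [
    { "remove": { "index": "stories-lib1-v1", "alias": "stories-lib1" } },
    { "add":    { "index": "stories-lib1-v2", "alias": "stories-lib1" } }
  ]
}
```

Searches always target the alias, so the swap is invisible to clients.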
### OpenSearch Index Mappings
**Stories Index Mapping:**
```json
{
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0,
    "analysis": {
      "analyzer": {
        "story_analyzer": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": ["lowercase", "stop", "snowball"]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "id": {"type": "keyword"},
      "title": {
        "type": "text",
        "analyzer": "story_analyzer",
        "fields": {"keyword": {"type": "keyword"}}
      },
      "description": {"type": "text", "analyzer": "story_analyzer"},
      "authorName": {
        "type": "text",
        "analyzer": "story_analyzer",
        "fields": {"keyword": {"type": "keyword"}}
      },
      "seriesName": {"type": "text", "fields": {"keyword": {"type": "keyword"}}},
      "tagNames": {"type": "keyword"},
      "wordCount": {"type": "integer"},
      "rating": {"type": "integer"},
      "volume": {"type": "integer"},
      "isRead": {"type": "boolean"},
      "readingPosition": {"type": "integer"},
      "lastReadAt": {"type": "date"},
      "createdAt": {"type": "date"},
      "updatedAt": {"type": "date"},
      "coverPath": {"type": "keyword"},
      "sourceUrl": {"type": "keyword"},
      "sourceDomain": {"type": "keyword"}
    }
  }
}
```
**Authors Index Mapping:**
```json
{
  "mappings": {
    "properties": {
      "id": {"type": "keyword"},
      "name": {
        "type": "text",
        "analyzer": "story_analyzer",
        "fields": {"keyword": {"type": "keyword"}}
      },
      "notes": {"type": "text"},
      "authorRating": {"type": "integer"},
      "averageStoryRating": {"type": "float"},
      "storyCount": {"type": "integer"},
      "urls": {"type": "keyword"},
      "avatarImagePath": {"type": "keyword"},
      "createdAt": {"type": "date"},
      "updatedAt": {"type": "date"}
    }
  }
}
```
**Collections Index Mapping:**
```json
{
  "mappings": {
    "properties": {
      "id": {"type": "keyword"},
      "name": {
        "type": "text",
        "fields": {"keyword": {"type": "keyword"}}
      },
      "description": {"type": "text"},
      "tagNames": {"type": "keyword"},
      "archived": {"type": "boolean"},
      "storyCount": {"type": "integer"},
      "currentPosition": {"type": "integer"},
      "createdAt": {"type": "date"},
      "updatedAt": {"type": "date"}
    }
  }
}
```
### Query Translation Strategy
**Random Story Queries:**
```java
// Typesense (problematic)
String sortBy = seed != null ? "_rand(" + seed + ")" : "_rand()";

// OpenSearch (reliable): random_score, optionally seeded.
// Note: a seeded random_score also needs a field to hash (e.g. _seq_no).
RandomScoreFunctionBuilder randomFn = ScoreFunctionBuilders.randomFunction();
if (seed != null) {
    randomFn.seed(seed.intValue()).setField("_seq_no");
}
QueryBuilder randomQuery = QueryBuilders.functionScoreQuery(
    QueryBuilders.boolQuery().must(filters), randomFn);
```
**Complex Filtering:**
```java
// Build bool query with multiple filter conditions
BoolQueryBuilder boolQuery = QueryBuilders.boolQuery()
    .must(QueryBuilders.multiMatchQuery(query, "title", "description", "authorName"))
    .filter(QueryBuilders.termsQuery("tagNames", tags))
    .filter(QueryBuilders.rangeQuery("wordCount").gte(minWords).lte(maxWords))
    .filter(QueryBuilders.rangeQuery("rating").gte(minRating).lte(maxRating));
```
**Faceting/Aggregations:**
```java
// Tags aggregation
AggregationBuilder tagsAgg = AggregationBuilders
    .terms("tags")
    .field("tagNames")
    .size(100);

// Rating ranges
AggregationBuilder ratingRanges = AggregationBuilders
    .range("rating_ranges")
    .field("rating")
    .addRange("unrated", 0, 1)
    .addRange("low", 1, 3)
    .addRange("high", 4, 6);
```
---
## Revised Implementation Phases (Scope Reduced by 40%)
### Phase 1: Infrastructure Setup (Week 1)
**Objectives:**
- Add OpenSearch to Docker Compose
- Create basic OpenSearch service
- Establish index templates and mappings
- **Focus**: Only stories, authors, and tags indexes (skip series, collections)
**Deliverables:**
1. **Docker Compose Updates:**
```yaml
opensearch:
  image: opensearchproject/opensearch:2.11.0
  environment:
    - discovery.type=single-node
    - DISABLE_SECURITY_PLUGIN=true
    - OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx1g
  ports:
    - "9200:9200"
  volumes:
    - opensearch_data:/usr/share/opensearch/data
```
2. **OpenSearchConfig.java:**
```java
@Configuration
@ConditionalOnProperty(name = "storycove.opensearch.enabled", havingValue = "true")
public class OpenSearchConfig {

    @Bean
    public OpenSearchClient openSearchClient() {
        // Client configuration
    }
}
```
3. **Basic Index Creation:**
- Create index templates for stories, authors, collections
- Implement index creation with proper mappings
- Add health check endpoint
**Success Criteria:**
- OpenSearch container starts successfully
- Basic connectivity established
- Index templates created and validated
### Phase 2: Core Service Implementation (Week 2)
**Objectives:**
- Implement OpenSearchService with core functionality
- Create service abstraction layer
- Implement basic search operations
- **Focus**: Only critical endpoints (stories search, random, authors)
**Deliverables:**
1. **OpenSearchService.java** - Core service implementing:
- `indexStory()`, `updateStory()`, `deleteStory()`
- `searchStories()` with basic query support (CRITICAL)
- `getRandomStoryId()` with reliable seed support (CRITICAL)
- `indexAuthor()`, `updateAuthor()`, `deleteAuthor()`
- `searchAuthors()` for authors page (HIGH)
- `bulkIndexStories()`, `bulkIndexAuthors()` for initial data loading
2. **SearchServiceAdapter.java** - Abstraction layer:
```java
@Service
public class SearchServiceAdapter {

    @Autowired(required = false)
    private TypesenseService typesenseService;

    @Autowired(required = false)
    private OpenSearchService openSearchService;

    @Value("${storycove.search.provider:typesense}")
    private String searchProvider;

    public SearchResultDto<StorySearchDto> searchStories(...) {
        return "opensearch".equals(searchProvider)
            ? openSearchService.searchStories(...)
            : typesenseService.searchStories(...);
    }
}
```
3. **Basic Query Implementation:**
- Full-text search across title/description/author
- Basic filtering (tags, rating, word count)
- Pagination and sorting
**Success Criteria:**
- Basic search functionality working
- Service abstraction layer functional
- Can switch between Typesense and OpenSearch via configuration
### Phase 3: Advanced Features Implementation (Week 3)
**Objectives:**
- Implement complex filtering (all 15+ filter types)
- Add random story functionality
- Implement faceting/aggregations
- Add autocomplete/suggestions
**Deliverables:**
1. **Complex Query Builder:**
- All filter conditions from original implementation
- Date range filtering with proper timezone handling
- Boolean logic for reading status, coverage, series filters
2. **Random Story Implementation:**
```java
public Optional<UUID> getRandomStoryId(String searchQuery, List<String> tags, Long seed, ...) {
    BoolQueryBuilder baseQuery = buildFilterQuery(searchQuery, tags, ...);

    // Seeded random_score needs a field to hash; unseeded works as-is.
    RandomScoreFunctionBuilder randomFn = ScoreFunctionBuilders.randomFunction();
    if (seed != null) {
        randomFn.seed(seed.intValue()).setField("_seq_no");
    }
    QueryBuilder randomQuery = QueryBuilders.functionScoreQuery(baseQuery, randomFn);

    SearchRequest request = new SearchRequest("stories-" + getCurrentLibraryId())
        .source(new SearchSourceBuilder()
            .query(randomQuery)
            .size(1)
            .fetchSource(new String[]{"id"}, null));
    // Execute and return result
}
```
3. **Faceting Implementation:**
- Tag aggregations with counts
- Rating range aggregations
- Author aggregations
- Custom facet builders
4. **Autocomplete Service:**
- Suggest-based implementation using completion fields
- Prefix matching for story titles and author names
**Success Criteria:**
- All filter conditions working correctly
- Random story selection reliable with seed support
- Faceting returns accurate counts
- Autocomplete responsive and accurate
### Phase 4: Data Migration & Parallel Operation (Week 4)
**Objectives:**
- Implement bulk data migration from database
- Enable parallel operation (write to both systems)
- Comprehensive testing of OpenSearch functionality
**Deliverables:**
1. **Migration Service:**
```java
@Service
public class SearchMigrationService {

    public void performFullMigration() {
        // Migrate all libraries
        List<Library> libraries = libraryService.findAll();
        for (Library library : libraries) {
            migrateLibraryData(library);
        }
    }

    private void migrateLibraryData(Library library) {
        // Create indexes for library
        // Bulk load stories, authors, collections
        // Verify data integrity
    }
}
```
2. **Dual-Write Implementation:**
- Modify all entity update operations to write to both systems
- Add configuration flag for dual-write mode
- Error handling for partial failures
3. **Data Validation Tools:**
- Compare search result counts between systems
- Validate random story selection consistency
- Check faceting accuracy
**Success Criteria:**
- Complete data migration with 100% accuracy
- Dual-write operations working without errors
- Search result parity between systems verified
### Phase 5: API Integration & Testing (Week 5)
**Objectives:**
- Update controller endpoints to use OpenSearch
- Comprehensive integration testing
- Performance testing and optimization
**Deliverables:**
1. **Controller Updates:**
- Modify controllers to use SearchServiceAdapter
- Add migration controls for gradual rollout
- Implement A/B testing capability
2. **Integration Tests:**
```java
@SpringBootTest
@TestMethodOrder(OrderAnnotation.class)
public class OpenSearchIntegrationTest {

    @Test
    @Order(1)
    void testBasicSearch() {
        // Test basic story search functionality
    }

    @Test
    @Order(2)
    void testComplexFiltering() {
        // Test all 15+ filter conditions
    }

    @Test
    @Order(3)
    void testRandomStory() {
        // Test random story with and without seed
    }

    @Test
    @Order(4)
    void testFaceting() {
        // Test aggregation accuracy
    }
}
```
3. **Performance Testing:**
- Load testing with realistic data volumes
- Query performance benchmarking
- Memory usage monitoring
**Success Criteria:**
- All integration tests passing
- Performance meets or exceeds Typesense baseline
- Memory usage within acceptable limits (< 2GB)
### Phase 6: Production Rollout & Monitoring (Week 6)
**Objectives:**
- Production deployment with feature flags
- Gradual user migration with monitoring
- Rollback capability testing
**Deliverables:**
1. **Feature Flag Implementation:**
```java
@Component
public class SearchFeatureFlags {

    @Value("${storycove.search.opensearch.enabled:false}")
    private boolean openSearchEnabled;

    @Value("${storycove.search.opensearch.percentage:0}")
    private int rolloutPercentage;

    public boolean shouldUseOpenSearch(String userId) {
        if (!openSearchEnabled) return false;
        // floorMod avoids negative results when hashCode() is negative
        return Math.floorMod(userId.hashCode(), 100) < rolloutPercentage;
    }
}
```
2. **Monitoring & Alerting:**
- Query performance metrics
- Error rate monitoring
- Search result accuracy validation
- User experience metrics
3. **Rollback Procedures:**
- Immediate rollback to Typesense capability
- Data consistency verification
- Performance rollback triggers
**Success Criteria:**
- Successful production deployment
- Zero user-facing issues during rollout
- Monitoring showing improved performance
- Rollback procedures validated
### Phase 7: Cleanup & Documentation (Week 7)
**Objectives:**
- Remove Typesense dependencies
- Update documentation
- Performance optimization
**Deliverables:**
1. **Code Cleanup:**
- Remove TypesenseService and related classes
- Clean up Docker Compose configuration
- Remove unused dependencies
2. **Documentation Updates:**
- Update deployment documentation
- Search API documentation
- Troubleshooting guides
3. **Performance Tuning:**
- Index optimization
- Query performance tuning
- Resource allocation optimization
**Success Criteria:**
- Typesense completely removed
- Documentation up to date
- Optimized performance in production
---
## Data Migration Strategy
### Pre-Migration Validation
**Data Integrity Checks:**
1. Count validation: Ensure all stories/authors/collections are present
2. Field validation: Verify all required fields are populated
3. Relationship validation: Check author-story and series-story relationships
4. Library separation: Ensure proper multi-library data isolation
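The count validation in step 1 is mechanical enough to sketch. A hypothetical helper that compares per-entity counts from the database against the search index (plain Java; names assumed for illustration):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Compares per-entity-type document counts (e.g. "stories", "authors",
// "collections") between the database and the search index, returning a
// human-readable line per mismatch. An entity type missing from the index
// shows up as a mismatch against a null count.
class CountValidator {
    static List<String> mismatches(Map<String, Long> dbCounts, Map<String, Long> indexCounts) {
        List<String> bad = new ArrayList<>();
        for (Map.Entry<String, Long> e : dbCounts.entrySet()) {
            Long indexed = indexCounts.get(e.getKey());
            if (!e.getValue().equals(indexed)) {
                bad.add(e.getKey() + ": db=" + e.getValue() + " index=" + indexed);
            }
        }
        return bad;
    }
}
```

An empty result satisfies the "count validation" check; any entry is a migration defect to investigate before cutover.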
**Migration Process:**
1. **Index Creation:**
```java
// Create indexes with proper mappings for each library
for (Library library : libraries) {
    String storiesIndex = "stories-" + library.getId();
    createIndexWithMapping(storiesIndex, getStoriesMapping());
    createIndexWithMapping("authors-" + library.getId(), getAuthorsMapping());
    createIndexWithMapping("collections-" + library.getId(), getCollectionsMapping());
}
```
2. **Bulk Data Loading:**
```java
// Load in batches to manage memory usage
int batchSize = 1000;
List<Story> allStories = storyService.findByLibraryId(libraryId);

for (int i = 0; i < allStories.size(); i += batchSize) {
    List<Story> batch = allStories.subList(i, Math.min(i + batchSize, allStories.size()));
    List<StoryDocument> documents = batch.stream()
        .map(this::convertToSearchDocument)
        .collect(Collectors.toList());
    bulkIndexStories(documents, "stories-" + libraryId);
}
```
3. **Post-Migration Validation:**
- Count comparison between database and OpenSearch
- Spot-check random records for field accuracy
- Test search functionality with known queries
- Verify faceting counts match expected values
### Rollback Strategy
**Immediate Rollback Triggers:**
- Search error rate > 1%
- Query performance degradation > 50%
- Data inconsistency detected
- Memory usage > 4GB sustained
**Rollback Process:**
1. Update feature flag to disable OpenSearch
2. Verify Typesense still operational
3. Clear OpenSearch indexes to free resources
4. Investigate and document issues
**Data Consistency During Rollback:**
- Continue dual-write during investigation
- Re-sync any missed updates to OpenSearch
- Validate data integrity before retry
---
## Testing Strategy
### Unit Tests
**OpenSearchService Unit Tests:**
```java
@ExtendWith(MockitoExtension.class)
class OpenSearchServiceTest {

    @Mock private OpenSearchClient client;
    @InjectMocks private OpenSearchService service;

    @Test
    void testSearchStoriesBasicQuery() {
        // Mock OpenSearch response
        // Test basic search functionality
    }

    @Test
    void testComplexFilterQuery() {
        // Test complex boolean query building
    }

    @Test
    void testRandomStorySelection() {
        // Test random query with seed
    }
}
```
**Query Builder Tests:**
- Test all 15+ filter conditions
- Validate query structure and parameters
- Test edge cases and null handling
### Integration Tests
**Full Search Integration:**
```java
@SpringBootTest
@Testcontainers
class OpenSearchIntegrationTest {

    @Container
    static OpenSearchContainer opensearch =
        new OpenSearchContainer("opensearchproject/opensearch:2.11.0");

    @Test
    void testEndToEndStorySearch() {
        // Insert test data
        // Perform search via controller
        // Validate results
    }
}
```
### Performance Tests
**Load Testing Scenarios:**
1. **Concurrent Search Load:**
- 50 concurrent users performing searches
- Mixed query complexity
- Duration: 10 minutes
2. **Bulk Indexing Performance:**
- Index 10,000 stories in batches
- Measure throughput and memory usage
3. **Random Query Performance:**
- 1000 random story requests with different seeds
- Compare with Typesense baseline
### Acceptance Tests
**Functional Requirements:**
- All existing search functionality preserved
- Random story selection improved reliability
- Faceting accuracy maintained
- Multi-library separation working
**Performance Requirements:**
- Search response time < 100ms for 95th percentile
- Random story selection < 50ms
- Index update operations < 10ms
- Memory usage < 2GB in production
---
## Risk Analysis & Mitigation
### Technical Risks
**Risk: OpenSearch Memory Usage**
- *Probability: Medium*
- *Impact: High*
- *Mitigation: Resource monitoring, index optimization, container limits*
**Risk: Query Performance Regression**
- *Probability: Low*
- *Impact: High*
- *Mitigation: Performance testing, query optimization, caching layer*
**Risk: Data Migration Accuracy**
- *Probability: Low*
- *Impact: Critical*
- *Mitigation: Comprehensive validation, dual-write verification, rollback procedures*
**Risk: Complex Filter Compatibility**
- *Probability: Medium*
- *Impact: Medium*
- *Mitigation: Extensive testing, gradual rollout, feature flags*
### Operational Risks
**Risk: Production Deployment Issues**
- *Probability: Medium*
- *Impact: High*
- *Mitigation: Staging environment testing, gradual rollout, immediate rollback capability*
**Risk: Team Learning Curve**
- *Probability: High*
- *Impact: Low*
- *Mitigation: Documentation, training, gradual responsibility transfer*
### Business Continuity
**Zero-Downtime Requirements:**
- Maintain Typesense during entire migration
- Feature flag-based switching
- Immediate rollback capability
- Health monitoring with automated alerts
---
## Success Criteria
### Functional Requirements ✅
- [ ] All search functionality migrated successfully
- [ ] Random story selection working reliably with seeds
- [ ] Complex filtering (15+ conditions) working accurately
- [ ] Faceting/aggregation results match expected values
- [ ] Multi-library support maintained
- [ ] Autocomplete functionality preserved
### Performance Requirements ✅
- [ ] Search response time < 100ms (95th percentile)
- [ ] Random story selection < 50ms
- [ ] Index operations < 10ms
- [ ] Memory usage < 2GB sustained
- [ ] Zero search downtime during migration
### Technical Requirements ✅
- [ ] Code quality maintained (test coverage ≥ 80%)
- [ ] Documentation updated and comprehensive
- [ ] Monitoring and alerting implemented
- [ ] Rollback procedures tested and validated
- [ ] Typesense dependencies cleanly removed
---
## Timeline Summary
| Phase | Duration | Key Deliverables | Risk Level |
|-------|----------|------------------|------------|
| 1. Infrastructure | 1 week | Docker setup, basic service | Low |
| 2. Core Service | 1 week | Basic search operations | Medium |
| 3. Advanced Features | 1 week | Complex filtering, random queries | High |
| 4. Data Migration | 1 week | Full data migration, dual-write | High |
| 5. API Integration | 1 week | Controller updates, testing | Medium |
| 6. Production Rollout | 1 week | Gradual deployment, monitoring | High |
| 7. Cleanup | 1 week | Remove Typesense, documentation | Low |
**Total Estimated Duration: 7 weeks**
---
## Configuration Management
### Environment Variables
```bash
# OpenSearch Configuration
OPENSEARCH_HOST=opensearch
OPENSEARCH_PORT=9200
OPENSEARCH_USERNAME=admin
OPENSEARCH_PASSWORD=${OPENSEARCH_PASSWORD}
# Feature Flags
STORYCOVE_OPENSEARCH_ENABLED=true
STORYCOVE_SEARCH_PROVIDER=opensearch
STORYCOVE_SEARCH_DUAL_WRITE=true
STORYCOVE_OPENSEARCH_ROLLOUT_PERCENTAGE=100
# Performance Tuning
OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx2g
STORYCOVE_SEARCH_BATCH_SIZE=1000
STORYCOVE_SEARCH_TIMEOUT=30s
```
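The `STORYCOVE_SEARCH_DUAL_WRITE` flag above implies writing every document to both engines while only one is authoritative. A minimal sketch of that pattern, under the assumption of a hypothetical `SearchIndexer` interface (not StoryCove's real service API): the primary write must succeed, while secondary failures are counted rather than propagated, so the engine under validation cannot break indexing.

```java
// Hedged sketch of dual-write indexing; SearchIndexer and DualWriter
// are illustrative names, not the actual StoryCove classes.
public class DualWriter {

    public interface SearchIndexer {
        void index(String docId, String body);
    }

    private final SearchIndexer primary;   // e.g. Typesense while it stays authoritative
    private final SearchIndexer secondary; // e.g. OpenSearch being validated
    private int secondaryFailures = 0;

    public DualWriter(SearchIndexer primary, SearchIndexer secondary) {
        this.primary = primary;
        this.secondary = secondary;
    }

    // Primary failures propagate; secondary failures are only counted,
    // feeding the mismatch/health metrics used during rollout.
    public void index(String docId, String body) {
        primary.index(docId, body);
        try {
            secondary.index(docId, body);
        } catch (RuntimeException e) {
            secondaryFailures++;
        }
    }

    public int getSecondaryFailures() { return secondaryFailures; }
}
```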
### Docker Compose Updates
```yaml
# Add to docker-compose.yml
opensearch:
image: opensearchproject/opensearch:2.11.0
environment:
- discovery.type=single-node
- DISABLE_SECURITY_PLUGIN=true
- OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx2g
volumes:
- opensearch_data:/usr/share/opensearch/data
networks:
- storycove-network
volumes:
opensearch_data:
```
---
## Conclusion
This specification provides a comprehensive roadmap for migrating StoryCove from Typesense to OpenSearch. The phased approach ensures minimal risk while delivering improved reliability and performance, particularly for random story queries.
The parallel implementation strategy allows for thorough validation and provides confidence in the migration while maintaining the ability to rollback if issues arise. Upon successful completion, StoryCove will have a more robust and scalable search infrastructure that better supports its growth and feature requirements.
**Next Steps:**
1. Review and approve this specification
2. Set up development environment with OpenSearch
3. Begin Phase 1 implementation
4. Establish monitoring and success metrics
5. Execute migration according to timeline
---
*Document Version: 1.0*
*Last Updated: 2025-01-17*
*Author: Claude Code Assistant*

PORTABLE_TEXT_SETUP.md (new file):
# Portable Text Editor Setup Instructions
## Current Status
⚠️ **Temporarily Reverted to Original Editor**
Due to npm cache permission issues preventing Docker builds, I've temporarily reverted the imports back to `RichTextEditor`. The Portable Text implementation is complete and ready to activate once the npm issue is resolved.
## Files Ready for Portable Text
- `PortableTextEditor.tsx` - Complete implementation
- `schema.ts` - Portable Text schema
- `conversion.ts` - HTML ↔ Portable Text conversion
- `package.json.with-portabletext` - Updated dependencies
## Docker Build Issue Resolution
The Docker build fails because `npm ci` requires a `package-lock.json`, but npm cache permission issues prevent generating one.
### Solution Steps:
1. **Fix npm permissions:**
```bash
sudo chown -R $(whoami) ~/.npm
```
2. **Switch to Portable Text setup:**
```bash
cd frontend
mv package.json package.json.original
mv package.json.with-portabletext package.json
npm install # This will generate package-lock.json
```
3. **Update component imports** (change RichTextEditor → PortableTextEditor):
```typescript
// In src/app/add-story/page.tsx and src/app/stories/[id]/edit/page.tsx
import PortableTextEditor from '../../components/stories/PortableTextEditor';
// And update the JSX to use <PortableTextEditor ... />
```
4. **Build and test:**
```bash
npm run build
docker-compose build
```
## Implementation Complete
- **Portable Text Schema** - Defines formatting options matching the original editor
- **HTML ↔ Portable Text Conversion** - Seamless conversion between formats
- **Sanitization Integration** - Uses existing sanitization strategy
- **Component Replacement** - PortableTextEditor replaces RichTextEditor
- **Image Processing** - Maintains existing image processing functionality
- **Toolbar** - All formatting buttons from original editor
- **Keyboard Shortcuts** - Ctrl+B, Ctrl+I, Ctrl+Shift+1-6
## Features Maintained
### 1. **Formatting Options**
- Bold, Italic, Underline, Strike, Code
- Headings H1-H6
- Paragraphs and Blockquotes
- All original toolbar buttons
### 2. **Visual & HTML Modes**
- Visual mode: Structured Portable Text editing
- HTML mode: Direct HTML editing (fallback)
- Live preview in HTML mode
### 3. **Image Processing**
- Existing image processing pipeline maintained
- Background image download and conversion
- Processing status indicators
- Warning system
### 4. **Paste Handling**
- Rich text paste from websites
- Image processing during paste
- HTML sanitization
- Structured content conversion
### 5. **Maximization & Resizing**
- Fullscreen editing mode
- Resizable editor height
- Keyboard shortcuts (Escape to exit)
## Benefits of Portable Text
1. **Structured Content** - Content is stored as JSON, not just HTML
2. **Future-Proof** - Easy to export/migrate content
3. **Better Search** - Structured content works better with Typesense
4. **Extensible** - Easy to add custom block types (images, etc.)
5. **Sanitization** - Inherently safer than HTML parsing
## Next Steps
1. Install the npm packages by following the solution steps above
2. Test the editor functionality
3. Verify image processing works correctly
4. Optional: Add custom image block types for enhanced image handling
## File Structure
```
frontend/src/
├── components/stories/
│ ├── PortableTextEditor.tsx # New editor component
│ └── RichTextEditor.tsx # Original (can be removed after testing)
├── lib/portabletext/
│ ├── schema.ts # Portable Text schema and types
│ └── conversion.ts # HTML ↔ Portable Text conversion
└── app/
├── add-story/page.tsx # Updated to use PortableTextEditor
└── stories/[id]/edit/page.tsx # Updated to use PortableTextEditor
```
The implementation is backward compatible and maintains all existing functionality while providing the benefits of structured content editing.

SOLR_LIBRARY_MIGRATION.md (new file):
# Solr Library Separation Migration Guide
This guide explains how to migrate existing StoryCove deployments to support proper library separation in Solr search.
## What Changed
The Solr service has been enhanced to support multi-tenant library separation by:
- Adding a `libraryId` field to all Solr documents
- Filtering all search queries by the current library context
- Ensuring complete data isolation between libraries
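The query-side half of this change amounts to appending a `libraryId` filter query to every search. A minimal sketch of building that filter string safely, where `LibraryFilter` is a hypothetical helper (the `libraryId` field name is the one introduced by this migration):

```java
// Illustrative helper for the library filter described above; with SolrJ
// the result would be passed to SolrQuery.addFilterQuery(...). This is a
// sketch, not StoryCove's actual service code.
public class LibraryFilter {

    // Escape characters that are special in Solr query syntax so an
    // arbitrary library id can be embedded safely in a filter query (fq).
    static String escape(String value) {
        StringBuilder sb = new StringBuilder();
        for (char c : value.toCharArray()) {
            if ("\\+-!():^[]\"{}~*?|&;/ ".indexOf(c) >= 0) {
                sb.append('\\');
            }
            sb.append(c);
        }
        return sb.toString();
    }

    // Applied to every query, this is what enforces isolation: documents
    // from other libraries simply never match.
    public static String buildLibraryFilter(String libraryId) {
        return "libraryId:" + escape(libraryId);
    }
}
```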
## Migration Options
### Option 1: Docker Volume Reset (Recommended for Docker)
**Best for**: Development, staging, and Docker-based deployments where data loss is acceptable.
```bash
# Stop the application
docker-compose down
# Remove only the Solr data volume (preserves database and images)
docker volume rm storycove_solr_data
# Restart - Solr will recreate cores with new schema
docker-compose up -d
# Wait for services to start, then trigger reindex via admin panel
```
**Pros**: Clean, simple, guaranteed to work
**Cons**: Requires downtime, loses existing search index
### Option 2: Schema API Migration (Production Safe)
**Best for**: Production environments where you need to preserve uptime.
**Method A: Automatic (Recommended)**
```bash
# Single endpoint that adds field and migrates data
curl -X POST "http://your-app-host/api/admin/search/solr/migrate-library-schema" \
-H "Authorization: Bearer YOUR_JWT_TOKEN"
```
**Method B: Manual Steps**
```bash
# Step 1: Add libraryId field via app API
curl -X POST "http://your-app-host/api/admin/search/solr/add-library-field" \
-H "Authorization: Bearer YOUR_JWT_TOKEN"
# Step 2: Run migration
curl -X POST "http://your-app-host/api/admin/search/solr/migrate-library-schema" \
-H "Authorization: Bearer YOUR_JWT_TOKEN"
```
**Method C: Direct Solr API (if app API fails)**
```bash
# Add libraryId field to stories core
curl -X POST "http://your-solr-host:8983/solr/storycove_stories/schema" \
-H "Content-Type: application/json" \
-d '{
"add-field": {
"name": "libraryId",
"type": "string",
"indexed": true,
"stored": true,
"required": false
}
}'
# Add libraryId field to authors core
curl -X POST "http://your-solr-host:8983/solr/storycove_authors/schema" \
-H "Content-Type: application/json" \
-d '{
"add-field": {
"name": "libraryId",
"type": "string",
"indexed": true,
"stored": true,
"required": false
}
}'
# Then run the migration
curl -X POST "http://your-app-host/api/admin/search/solr/migrate-library-schema" \
-H "Authorization: Bearer YOUR_JWT_TOKEN"
```
**Pros**: No downtime, preserves service availability, automatic field addition
**Cons**: Requires API access
### Option 3: Application-Level Migration (Recommended for Production)
**Best for**: Production environments with proper admin access.
1. **Deploy the code changes** to your environment
2. **Access the admin panel** of your application
3. **Navigate to search settings**
4. **Use the "Migrate Library Schema" button** or API endpoint:
```
POST /api/admin/search/solr/migrate-library-schema
```
**Pros**: User-friendly, handles all complexity internally
**Cons**: Requires admin access to application
## Step-by-Step Migration Process
### For Docker Deployments
1. **Backup your data** (optional but recommended):
```bash
# Backup database
docker-compose exec postgres pg_dump -U storycove storycove > backup.sql
```
2. **Pull the latest code** with library separation fixes
3. **Choose migration approach**:
- **Quick & Clean**: Use Option 1 (volume reset)
- **Production**: Use Option 2 or 3
4. **Verify migration**:
- Log in with different library passwords
- Perform searches to confirm isolation
- Check that new content gets indexed with library IDs
### For Kubernetes/Production Deployments
1. **Update your deployment** with the new container images
2. **Add the libraryId field** to Solr schema using Option 2
3. **Use the migration endpoint** (Option 3):
```bash
kubectl exec -it deployment/storycove-backend -- \
curl -X POST http://localhost:8080/api/admin/search/solr/migrate-library-schema
```
4. **Monitor logs** for successful migration
## Verification Steps
After migration, verify that library separation is working:
1. **Test with multiple libraries**:
- Log in with Library A password
- Add/search content
- Log in with Library B password
- Confirm Library A content is not visible
2. **Check Solr directly** (if accessible):
```bash
# Should show documents with libraryId field
curl "http://solr:8983/solr/storycove_stories/select?q=*:*&fl=id,title,libraryId&rows=5"
```
3. **Monitor application logs** for any library separation errors
## Troubleshooting
### "unknown field 'libraryId'" Error
**Problem**: `ERROR: [doc=xxx] unknown field 'libraryId'`
**Cause**: The Solr schema doesn't have the libraryId field yet.
**Solutions**:
1. **Use the automated migration** (adds field automatically):
```bash
curl -X POST "http://your-app/api/admin/search/solr/migrate-library-schema"
```
2. **Add field manually first**:
```bash
# Add field via app API
curl -X POST "http://your-app/api/admin/search/solr/add-library-field"
# Then run migration
curl -X POST "http://your-app/api/admin/search/solr/migrate-library-schema"
```
3. **Direct Solr API** (if app API fails):
```bash
# Add to both cores
curl -X POST "http://solr:8983/solr/storycove_stories/schema" \
-H "Content-Type: application/json" \
-d '{"add-field":{"name":"libraryId","type":"string","indexed":true,"stored":true}}'
curl -X POST "http://solr:8983/solr/storycove_authors/schema" \
-H "Content-Type: application/json" \
-d '{"add-field":{"name":"libraryId","type":"string","indexed":true,"stored":true}}'
```
4. **For development**: Use Option 1 (volume reset) for clean restart
### Migration Endpoint Returns Error
Common causes:
- Solr is not available (check connectivity)
- No active library context (ensure user is authenticated)
- Insufficient permissions (check JWT token/authentication)
### Search Results Still Mixed
This indicates incomplete migration:
- Clear all Solr data and reindex completely
- Verify that all documents have libraryId field
- Check that search queries include library filters
## Environment-Specific Notes
### Development
- Use Option 1 (volume reset) for simplicity
- Data loss is acceptable in dev environments
### Staging
- Use Option 2 or 3 to test production migration procedures
- Verify migration process before applying to production
### Production
- **Always backup data first**
- Use Option 2 (Schema API) or Option 3 (Admin endpoint)
- Plan for brief performance impact during reindexing
- Monitor system resources during bulk reindexing
## Performance Considerations
- **Reindexing time**: Depends on data size (typically 1000 docs/second)
- **Memory usage**: May increase during bulk indexing
- **Search performance**: Minimal impact from library filtering
- **Storage**: Slight increase due to libraryId field
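The bulk reindex behind these numbers is typically done in fixed-size slices so memory stays bounded. A sketch of the batching step, assuming a hypothetical `ReindexBatcher` helper; each slice would then be sent to Solr in one add/commit cycle (roughly matching the ~1000 docs/second estimate above):

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative batching for a bulk reindex; this is a sketch, not the
// actual migration code. With SolrJ, each batch would go through
// solrClient.add(core, batchOfDocuments) followed by a commit.
public class ReindexBatcher {

    // Split the full id list into slices of at most batchSize elements,
    // preserving order, so the reindex can stream instead of loading
    // everything at once.
    public static <T> List<List<T>> partition(List<T> items, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < items.size(); i += batchSize) {
            batches.add(items.subList(i, Math.min(i + batchSize, items.size())));
        }
        return batches;
    }
}
```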
## Rollback Plan
If issues occur:
1. **Immediate**: Restart Solr to previous state (if using Option 1)
2. **Schema revert**: Remove libraryId field via Schema API
3. **Code rollback**: Deploy previous version without library separation
4. **Data restore**: Restore from backup if necessary
This migration enables proper multi-tenant isolation while maintaining search performance and functionality.

Dockerfile (add PostgreSQL 15 client tools alongside Maven):

```diff
@@ -2,8 +2,13 @@ FROM openjdk:17-jdk-slim
 WORKDIR /app
-# Install Maven
-RUN apt-get update && apt-get install -y maven && rm -rf /var/lib/apt/lists/*
+# Install Maven and PostgreSQL 15 client tools
+RUN apt-get update && apt-get install -y wget ca-certificates gnupg maven && \
+    wget --quiet -O - https://www.postgresql.org/media/keys/ACCC4CF8.asc | apt-key add - && \
+    echo "deb http://apt.postgresql.org/pub/repos/apt/ bullseye-pgdg main" > /etc/apt/sources.list.d/pgdg.list && \
+    apt-get update && \
+    apt-get install -y postgresql-client-15 && \
+    rm -rf /var/lib/apt/lists/*
 # Copy source code
 COPY . .
```

pom.xml (replace the OpenSearch Java client with SolrJ and its Jetty HTTP client dependencies):

```diff
@@ -84,9 +84,25 @@
         <artifactId>httpclient5</artifactId>
     </dependency>
     <dependency>
-        <groupId>org.opensearch.client</groupId>
-        <artifactId>opensearch-java</artifactId>
-        <version>3.2.0</version>
+        <groupId>org.apache.solr</groupId>
+        <artifactId>solr-solrj</artifactId>
+        <version>9.9.0</version>
+    </dependency>
+    <dependency>
+        <groupId>org.eclipse.jetty</groupId>
+        <artifactId>jetty-client</artifactId>
+    </dependency>
+    <dependency>
+        <groupId>org.eclipse.jetty</groupId>
+        <artifactId>jetty-util</artifactId>
+    </dependency>
+    <dependency>
+        <groupId>org.eclipse.jetty</groupId>
+        <artifactId>jetty-http</artifactId>
+    </dependency>
+    <dependency>
+        <groupId>org.eclipse.jetty</groupId>
+        <artifactId>jetty-io</artifactId>
     </dependency>
     <dependency>
         <groupId>org.apache.httpcomponents.core5</groupId>
```

StoryCoveApplication.java (enable async execution):

```diff
@@ -2,10 +2,12 @@ package com.storycove;
 import org.springframework.boot.SpringApplication;
 import org.springframework.boot.autoconfigure.SpringBootApplication;
+import org.springframework.scheduling.annotation.EnableAsync;
 import org.springframework.scheduling.annotation.EnableScheduling;
 @SpringBootApplication
 @EnableScheduling
+@EnableAsync
 public class StoryCoveApplication {
     public static void main(String[] args) {
```

OpenSearchConfig.java (deleted, 211 lines):
package com.storycove.config;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;
import org.apache.hc.client5.http.auth.AuthScope;
import org.apache.hc.client5.http.auth.UsernamePasswordCredentials;
import org.apache.hc.client5.http.impl.auth.BasicCredentialsProvider;
import org.apache.hc.client5.http.impl.nio.PoolingAsyncClientConnectionManager;
import org.apache.hc.client5.http.impl.nio.PoolingAsyncClientConnectionManagerBuilder;
import org.apache.hc.client5.http.ssl.ClientTlsStrategyBuilder;
import org.apache.hc.core5.http.HttpHost;
import org.apache.hc.core5.util.Timeout;
import org.opensearch.client.json.jackson.JacksonJsonpMapper;
import org.opensearch.client.opensearch.OpenSearchClient;
import org.opensearch.client.transport.OpenSearchTransport;
import org.opensearch.client.transport.httpclient5.ApacheHttpClient5TransportBuilder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;
import java.io.FileInputStream;
import java.security.KeyStore;
import java.security.cert.X509Certificate;
@Configuration
public class OpenSearchConfig {
private static final Logger logger = LoggerFactory.getLogger(OpenSearchConfig.class);
private final OpenSearchProperties properties;
public OpenSearchConfig(@Qualifier("openSearchProperties") OpenSearchProperties properties) {
this.properties = properties;
}
@Bean
public OpenSearchClient openSearchClient() throws Exception {
logger.info("Initializing OpenSearch client for profile: {}", properties.getProfile());
// Create credentials provider
BasicCredentialsProvider credentialsProvider = createCredentialsProvider();
// Create SSL context based on environment
SSLContext sslContext = createSSLContext();
// Create connection manager with pooling
PoolingAsyncClientConnectionManager connectionManager = createConnectionManager(sslContext);
// Create custom ObjectMapper for proper date serialization
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.registerModule(new JavaTimeModule());
objectMapper.disable(com.fasterxml.jackson.databind.SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
// Create the transport with all configurations and custom Jackson mapper
OpenSearchTransport transport = ApacheHttpClient5TransportBuilder
.builder(new HttpHost(properties.getScheme(), properties.getHost(), properties.getPort()))
.setMapper(new JacksonJsonpMapper(objectMapper))
.setHttpClientConfigCallback(httpClientBuilder -> {
// Only set credentials provider if authentication is configured
if (properties.getUsername() != null && !properties.getUsername().isEmpty() &&
properties.getPassword() != null && !properties.getPassword().isEmpty()) {
httpClientBuilder.setDefaultCredentialsProvider(credentialsProvider);
}
httpClientBuilder.setConnectionManager(connectionManager);
// Set timeouts
httpClientBuilder.setDefaultRequestConfig(
org.apache.hc.client5.http.config.RequestConfig.custom()
.setConnectionRequestTimeout(Timeout.ofMilliseconds(properties.getConnection().getTimeout()))
.setResponseTimeout(Timeout.ofMilliseconds(properties.getConnection().getSocketTimeout()))
.build()
);
return httpClientBuilder;
})
.build();
OpenSearchClient client = new OpenSearchClient(transport);
// Test connection
testConnection(client);
return client;
}
private BasicCredentialsProvider createCredentialsProvider() {
BasicCredentialsProvider credentialsProvider = new BasicCredentialsProvider();
// Only set credentials if username and password are provided
if (properties.getUsername() != null && !properties.getUsername().isEmpty() &&
properties.getPassword() != null && !properties.getPassword().isEmpty()) {
credentialsProvider.setCredentials(
new AuthScope(properties.getHost(), properties.getPort()),
new UsernamePasswordCredentials(
properties.getUsername(),
properties.getPassword().toCharArray()
)
);
logger.info("OpenSearch credentials configured for user: {}", properties.getUsername());
} else {
logger.info("OpenSearch running without authentication (no credentials configured)");
}
return credentialsProvider;
}
private SSLContext createSSLContext() throws Exception {
SSLContext sslContext;
if (isProduction() && !properties.getSecurity().isTrustAllCertificates()) {
// Production SSL configuration with proper certificate validation
sslContext = createProductionSSLContext();
} else {
// Development SSL configuration (trust all certificates)
sslContext = createDevelopmentSSLContext();
}
return sslContext;
}
private SSLContext createProductionSSLContext() throws Exception {
logger.info("Configuring production SSL context with certificate validation");
SSLContext sslContext = SSLContext.getInstance("TLS");
// Load custom keystore/truststore if provided
if (properties.getSecurity().getTruststorePath() != null) {
KeyStore trustStore = KeyStore.getInstance("JKS");
try (FileInputStream fis = new FileInputStream(properties.getSecurity().getTruststorePath())) {
trustStore.load(fis, properties.getSecurity().getTruststorePassword().toCharArray());
}
javax.net.ssl.TrustManagerFactory tmf =
javax.net.ssl.TrustManagerFactory.getInstance(javax.net.ssl.TrustManagerFactory.getDefaultAlgorithm());
tmf.init(trustStore);
sslContext.init(null, tmf.getTrustManagers(), null);
} else {
// Use default system SSL context for production
sslContext.init(null, null, null);
}
return sslContext;
}
private SSLContext createDevelopmentSSLContext() throws Exception {
logger.warn("Configuring development SSL context - TRUSTING ALL CERTIFICATES (not for production!)");
SSLContext sslContext = SSLContext.getInstance("TLS");
sslContext.init(null, new TrustManager[] {
new X509TrustManager() {
public X509Certificate[] getAcceptedIssuers() { return null; }
public void checkClientTrusted(X509Certificate[] certs, String authType) {}
public void checkServerTrusted(X509Certificate[] certs, String authType) {}
}
}, null);
return sslContext;
}
private PoolingAsyncClientConnectionManager createConnectionManager(SSLContext sslContext) {
PoolingAsyncClientConnectionManagerBuilder builder = PoolingAsyncClientConnectionManagerBuilder.create();
// Configure TLS strategy
if (properties.getScheme().equals("https")) {
if (isProduction() && properties.getSecurity().isSslVerification()) {
// Production TLS with hostname verification
builder.setTlsStrategy(ClientTlsStrategyBuilder.create()
.setSslContext(sslContext)
.build());
} else {
// Development TLS without hostname verification
builder.setTlsStrategy(ClientTlsStrategyBuilder.create()
.setSslContext(sslContext)
.setHostnameVerifier((hostname, session) -> true)
.build());
}
}
PoolingAsyncClientConnectionManager connectionManager = builder.build();
// Configure connection pool settings
connectionManager.setMaxTotal(properties.getConnection().getMaxConnectionsTotal());
connectionManager.setDefaultMaxPerRoute(properties.getConnection().getMaxConnectionsPerRoute());
return connectionManager;
}
private boolean isProduction() {
return "production".equalsIgnoreCase(properties.getProfile());
}
private void testConnection(OpenSearchClient client) {
try {
var response = client.info();
logger.info("OpenSearch connection successful - Version: {}, Cluster: {}",
response.version().number(),
response.clusterName());
} catch (Exception e) {
logger.warn("OpenSearch connection test failed during initialization: {}", e.getMessage());
logger.debug("OpenSearch connection test full error", e);
// Don't throw exception here - let the client be created and handle failures in service methods
}
}
}

OpenSearchProperties.java (deleted, 164 lines):
package com.storycove.config;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;
@Component
@ConfigurationProperties(prefix = "storycove.opensearch")
public class OpenSearchProperties {
private String host = "localhost";
private int port = 9200;
private String scheme = "https";
private String username = "admin";
private String password;
private String profile = "development";
private Security security = new Security();
private Connection connection = new Connection();
private Indices indices = new Indices();
private Bulk bulk = new Bulk();
private Health health = new Health();
// Getters and setters
public String getHost() { return host; }
public void setHost(String host) { this.host = host; }
public int getPort() { return port; }
public void setPort(int port) { this.port = port; }
public String getScheme() { return scheme; }
public void setScheme(String scheme) { this.scheme = scheme; }
public String getUsername() { return username; }
public void setUsername(String username) { this.username = username; }
public String getPassword() { return password; }
public void setPassword(String password) { this.password = password; }
public String getProfile() { return profile; }
public void setProfile(String profile) { this.profile = profile; }
public Security getSecurity() { return security; }
public void setSecurity(Security security) { this.security = security; }
public Connection getConnection() { return connection; }
public void setConnection(Connection connection) { this.connection = connection; }
public Indices getIndices() { return indices; }
public void setIndices(Indices indices) { this.indices = indices; }
public Bulk getBulk() { return bulk; }
public void setBulk(Bulk bulk) { this.bulk = bulk; }
public Health getHealth() { return health; }
public void setHealth(Health health) { this.health = health; }
public static class Security {
private boolean sslVerification = false;
private boolean trustAllCertificates = true;
private String keystorePath;
private String keystorePassword;
private String truststorePath;
private String truststorePassword;
// Getters and setters
public boolean isSslVerification() { return sslVerification; }
public void setSslVerification(boolean sslVerification) { this.sslVerification = sslVerification; }
public boolean isTrustAllCertificates() { return trustAllCertificates; }
public void setTrustAllCertificates(boolean trustAllCertificates) { this.trustAllCertificates = trustAllCertificates; }
public String getKeystorePath() { return keystorePath; }
public void setKeystorePath(String keystorePath) { this.keystorePath = keystorePath; }
public String getKeystorePassword() { return keystorePassword; }
public void setKeystorePassword(String keystorePassword) { this.keystorePassword = keystorePassword; }
public String getTruststorePath() { return truststorePath; }
public void setTruststorePath(String truststorePath) { this.truststorePath = truststorePath; }
public String getTruststorePassword() { return truststorePassword; }
public void setTruststorePassword(String truststorePassword) { this.truststorePassword = truststorePassword; }
}
public static class Connection {
private int timeout = 30000;
private int socketTimeout = 60000;
private int maxConnectionsPerRoute = 10;
private int maxConnectionsTotal = 30;
private boolean retryOnFailure = true;
private int maxRetries = 3;
// Getters and setters
public int getTimeout() { return timeout; }
public void setTimeout(int timeout) { this.timeout = timeout; }
public int getSocketTimeout() { return socketTimeout; }
public void setSocketTimeout(int socketTimeout) { this.socketTimeout = socketTimeout; }
public int getMaxConnectionsPerRoute() { return maxConnectionsPerRoute; }
public void setMaxConnectionsPerRoute(int maxConnectionsPerRoute) { this.maxConnectionsPerRoute = maxConnectionsPerRoute; }
public int getMaxConnectionsTotal() { return maxConnectionsTotal; }
public void setMaxConnectionsTotal(int maxConnectionsTotal) { this.maxConnectionsTotal = maxConnectionsTotal; }
public boolean isRetryOnFailure() { return retryOnFailure; }
public void setRetryOnFailure(boolean retryOnFailure) { this.retryOnFailure = retryOnFailure; }
public int getMaxRetries() { return maxRetries; }
public void setMaxRetries(int maxRetries) { this.maxRetries = maxRetries; }
}
public static class Indices {
private int defaultShards = 1;
private int defaultReplicas = 0;
private String refreshInterval = "1s";
// Getters and setters
public int getDefaultShards() { return defaultShards; }
public void setDefaultShards(int defaultShards) { this.defaultShards = defaultShards; }
public int getDefaultReplicas() { return defaultReplicas; }
public void setDefaultReplicas(int defaultReplicas) { this.defaultReplicas = defaultReplicas; }
public String getRefreshInterval() { return refreshInterval; }
public void setRefreshInterval(String refreshInterval) { this.refreshInterval = refreshInterval; }
}
public static class Bulk {
private int actions = 1000;
private long size = 5242880; // 5MB
private int timeout = 10000;
private int concurrentRequests = 1;
// Getters and setters
public int getActions() { return actions; }
public void setActions(int actions) { this.actions = actions; }
public long getSize() { return size; }
public void setSize(long size) { this.size = size; }
public int getTimeout() { return timeout; }
public void setTimeout(int timeout) { this.timeout = timeout; }
public int getConcurrentRequests() { return concurrentRequests; }
public void setConcurrentRequests(int concurrentRequests) { this.concurrentRequests = concurrentRequests; }
}
public static class Health {
private int checkInterval = 30000;
private int slowQueryThreshold = 5000;
private boolean enableMetrics = true;
// Getters and setters
public int getCheckInterval() { return checkInterval; }
public void setCheckInterval(int checkInterval) { this.checkInterval = checkInterval; }
public int getSlowQueryThreshold() { return slowQueryThreshold; }
public void setSlowQueryThreshold(int slowQueryThreshold) { this.slowQueryThreshold = slowQueryThreshold; }
public boolean isEnableMetrics() { return enableMetrics; }
public void setEnableMetrics(boolean enableMetrics) { this.enableMetrics = enableMetrics; }
}
}

SolrConfig.java (new file, 57 lines):
package com.storycove.config;
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
@ConditionalOnProperty(
value = "storycove.search.engine",
havingValue = "solr",
matchIfMissing = false
)
public class SolrConfig {
private static final Logger logger = LoggerFactory.getLogger(SolrConfig.class);
private final SolrProperties properties;
public SolrConfig(SolrProperties properties) {
this.properties = properties;
}
@Bean
public SolrClient solrClient() {
logger.info("Initializing Solr client with URL: {}", properties.getUrl());
HttpSolrClient.Builder builder = new HttpSolrClient.Builder(properties.getUrl())
.withConnectionTimeout(properties.getConnection().getTimeout())
.withSocketTimeout(properties.getConnection().getSocketTimeout());
SolrClient client = builder.build();
logger.info("Solr running without authentication");
// Test connection
testConnection(client);
return client;
}
private void testConnection(SolrClient client) {
try {
// Test connection by pinging the server
var response = client.ping();
logger.info("Solr connection successful - Response time: {}ms",
response.getElapsedTime());
} catch (Exception e) {
logger.warn("Solr connection test failed during initialization: {}", e.getMessage());
logger.debug("Solr connection test full error", e);
// Don't throw exception here - let the client be created and handle failures in service methods
}
}
}

SolrProperties.java (new file, 140 lines):
package com.storycove.config;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;
@Component
@ConfigurationProperties(prefix = "storycove.solr")
public class SolrProperties {
private String url = "http://localhost:8983/solr";
private String username;
private String password;
private Cores cores = new Cores();
private Connection connection = new Connection();
private Query query = new Query();
private Commit commit = new Commit();
private Health health = new Health();
// Getters and setters
public String getUrl() { return url; }
public void setUrl(String url) { this.url = url; }
public String getUsername() { return username; }
public void setUsername(String username) { this.username = username; }
public String getPassword() { return password; }
public void setPassword(String password) { this.password = password; }
public Cores getCores() { return cores; }
public void setCores(Cores cores) { this.cores = cores; }
public Connection getConnection() { return connection; }
public void setConnection(Connection connection) { this.connection = connection; }
public Query getQuery() { return query; }
public void setQuery(Query query) { this.query = query; }
public Commit getCommit() { return commit; }
public void setCommit(Commit commit) { this.commit = commit; }
public Health getHealth() { return health; }
public void setHealth(Health health) { this.health = health; }
public static class Cores {
private String stories = "storycove_stories";
private String authors = "storycove_authors";
// Getters and setters
public String getStories() { return stories; }
public void setStories(String stories) { this.stories = stories; }
public String getAuthors() { return authors; }
public void setAuthors(String authors) { this.authors = authors; }
}
public static class Connection {
private int timeout = 30000;
private int socketTimeout = 60000;
private int maxConnectionsPerRoute = 10;
private int maxConnectionsTotal = 30;
private boolean retryOnFailure = true;
private int maxRetries = 3;
// Getters and setters
public int getTimeout() { return timeout; }
public void setTimeout(int timeout) { this.timeout = timeout; }
public int getSocketTimeout() { return socketTimeout; }
public void setSocketTimeout(int socketTimeout) { this.socketTimeout = socketTimeout; }
public int getMaxConnectionsPerRoute() { return maxConnectionsPerRoute; }
public void setMaxConnectionsPerRoute(int maxConnectionsPerRoute) { this.maxConnectionsPerRoute = maxConnectionsPerRoute; }
public int getMaxConnectionsTotal() { return maxConnectionsTotal; }
public void setMaxConnectionsTotal(int maxConnectionsTotal) { this.maxConnectionsTotal = maxConnectionsTotal; }
public boolean isRetryOnFailure() { return retryOnFailure; }
public void setRetryOnFailure(boolean retryOnFailure) { this.retryOnFailure = retryOnFailure; }
public int getMaxRetries() { return maxRetries; }
public void setMaxRetries(int maxRetries) { this.maxRetries = maxRetries; }
}
public static class Query {
private int defaultRows = 10;
private int maxRows = 1000;
private String defaultOperator = "AND";
private boolean highlight = true;
private boolean facets = true;
// Getters and setters
public int getDefaultRows() { return defaultRows; }
public void setDefaultRows(int defaultRows) { this.defaultRows = defaultRows; }
public int getMaxRows() { return maxRows; }
public void setMaxRows(int maxRows) { this.maxRows = maxRows; }
public String getDefaultOperator() { return defaultOperator; }
public void setDefaultOperator(String defaultOperator) { this.defaultOperator = defaultOperator; }
public boolean isHighlight() { return highlight; }
public void setHighlight(boolean highlight) { this.highlight = highlight; }
public boolean isFacets() { return facets; }
public void setFacets(boolean facets) { this.facets = facets; }
}
public static class Commit {
private boolean softCommit = true;
private int commitWithin = 1000;
private boolean waitSearcher = false;
// Getters and setters
public boolean isSoftCommit() { return softCommit; }
public void setSoftCommit(boolean softCommit) { this.softCommit = softCommit; }
public int getCommitWithin() { return commitWithin; }
public void setCommitWithin(int commitWithin) { this.commitWithin = commitWithin; }
public boolean isWaitSearcher() { return waitSearcher; }
public void setWaitSearcher(boolean waitSearcher) { this.waitSearcher = waitSearcher; }
}
public static class Health {
private int checkInterval = 30000;
private int slowQueryThreshold = 5000;
private boolean enableMetrics = true;
// Getters and setters
public int getCheckInterval() { return checkInterval; }
public void setCheckInterval(int checkInterval) { this.checkInterval = checkInterval; }
public int getSlowQueryThreshold() { return slowQueryThreshold; }
public void setSlowQueryThreshold(int slowQueryThreshold) { this.slowQueryThreshold = slowQueryThreshold; }
public boolean isEnableMetrics() { return enableMetrics; }
public void setEnableMetrics(boolean enableMetrics) { this.enableMetrics = enableMetrics; }
}
}
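These nested `@ConfigurationProperties` classes bind from external configuration under the `storycove.solr` prefix (Spring's relaxed binding maps `socket-timeout` to `socketTimeout`, and so on), while `storycove.search.engine=solr` is what activates `SolrConfig` via its `@ConditionalOnProperty`. A minimal `application.yml` sketch — values are the defaults shown above, not taken from the repository's actual config:

```yaml
storycove:
  search:
    engine: solr              # enables SolrConfig (matchIfMissing = false)
  solr:
    url: http://localhost:8983/solr
    cores:
      stories: storycove_stories
      authors: storycove_authors
    connection:
      timeout: 30000          # ms -> Connection.timeout
      socket-timeout: 60000   # ms -> Connection.socketTimeout
    commit:
      soft-commit: true
      commit-within: 1000
```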

View File

@@ -3,7 +3,7 @@ package com.storycove.controller;
import com.storycove.entity.Author;
import com.storycove.entity.Story;
import com.storycove.service.AuthorService;
import com.storycove.service.SolrService;
import com.storycove.service.SearchServiceAdapter;
import com.storycove.service.StoryService;
import org.slf4j.Logger;
@@ -16,7 +16,7 @@ import java.util.List;
import java.util.Map;
/**
* Admin controller for managing Solr operations.
* Provides endpoints for reindexing and index management.
*/
@RestController
@@ -35,7 +35,7 @@ public class AdminSearchController {
private AuthorService authorService;
@Autowired(required = false)
private SolrService solrService;
/**
* Get current search status
@@ -48,7 +48,7 @@ public class AdminSearchController {
return ResponseEntity.ok(Map.of(
"primaryEngine", status.getPrimaryEngine(),
"dualWrite", status.isDualWrite(),
"solrAvailable", status.isSolrAvailable()
));
} catch (Exception e) {
logger.error("Error getting search status", e);
@@ -59,17 +59,17 @@ public class AdminSearchController {
}
/**
* Reindex all data in Solr
*/
@PostMapping("/solr/reindex")
public ResponseEntity<Map<String, Object>> reindexSolr() {
try {
logger.info("Starting Solr full reindex");
if (!searchServiceAdapter.isSearchServiceAvailable()) {
return ResponseEntity.badRequest().body(Map.of(
"success", false,
"error", "Solr is not available or healthy"
));
}
@@ -77,14 +77,14 @@ public class AdminSearchController {
List<Story> allStories = storyService.findAllWithAssociations();
List<Author> allAuthors = authorService.findAllWithStories();
// Bulk index directly in Solr
if (solrService != null) {
solrService.bulkIndexStories(allStories);
solrService.bulkIndexAuthors(allAuthors);
} else {
return ResponseEntity.badRequest().body(Map.of(
"success", false,
"error", "Solr service not available"
));
}
@@ -92,7 +92,7 @@ public class AdminSearchController {
return ResponseEntity.ok(Map.of(
"success", true,
"message", String.format("Reindexed %d stories and %d authors in Solr",
allStories.size(), allAuthors.size()),
"storiesCount", allStories.size(),
"authorsCount", allAuthors.size(),
@@ -100,36 +100,36 @@ public class AdminSearchController {
));
} catch (Exception e) {
logger.error("Error during Solr reindex", e);
return ResponseEntity.internalServerError().body(Map.of(
"success", false,
"error", "Solr reindex failed: " + e.getMessage()
));
}
}
/**
* Recreate Solr indices
*/
@PostMapping("/solr/recreate")
public ResponseEntity<Map<String, Object>> recreateSolrIndices() {
try {
logger.info("Starting Solr indices recreation");
if (!searchServiceAdapter.isSearchServiceAvailable()) {
return ResponseEntity.badRequest().body(Map.of(
"success", false,
"error", "Solr is not available or healthy"
));
}
// Recreate indices
if (solrService != null) {
solrService.recreateIndices();
} else {
return ResponseEntity.badRequest().body(Map.of(
"success", false,
"error", "Solr service not available"
));
}
@@ -138,14 +138,14 @@ public class AdminSearchController {
List<Author> allAuthors = authorService.findAllWithStories();
// Bulk index after recreation
solrService.bulkIndexStories(allStories);
solrService.bulkIndexAuthors(allAuthors);
int totalIndexed = allStories.size() + allAuthors.size();
return ResponseEntity.ok(Map.of(
"success", true,
"message", String.format("Recreated Solr indices and indexed %d stories and %d authors",
allStories.size(), allAuthors.size()),
"storiesCount", allStories.size(),
"authorsCount", allAuthors.size(),
@@ -153,10 +153,156 @@ public class AdminSearchController {
));
} catch (Exception e) {
logger.error("Error during Solr indices recreation", e);
return ResponseEntity.internalServerError().body(Map.of(
"success", false,
"error", "Solr indices recreation failed: " + e.getMessage()
));
}
}
/**
* Add libraryId field to Solr schema via Schema API.
* This is a prerequisite for library-aware indexing.
*/
@PostMapping("/solr/add-library-field")
public ResponseEntity<Map<String, Object>> addLibraryField() {
try {
logger.info("Starting Solr libraryId field addition");
if (!searchServiceAdapter.isSearchServiceAvailable()) {
return ResponseEntity.badRequest().body(Map.of(
"success", false,
"error", "Solr is not available or healthy"
));
}
if (solrService == null) {
return ResponseEntity.badRequest().body(Map.of(
"success", false,
"error", "Solr service not available"
));
}
// Add the libraryId field to the schema
try {
solrService.addLibraryIdField();
logger.info("libraryId field added successfully to schema");
return ResponseEntity.ok(Map.of(
"success", true,
"message", "libraryId field added successfully to both stories and authors cores",
"note", "You can now run the library schema migration"
));
} catch (Exception e) {
logger.error("Failed to add libraryId field to schema", e);
return ResponseEntity.internalServerError().body(Map.of(
"success", false,
"error", "Failed to add libraryId field to schema: " + e.getMessage(),
"details", "Check that Solr is accessible and schema is modifiable"
));
}
} catch (Exception e) {
logger.error("Error during libraryId field addition", e);
return ResponseEntity.internalServerError().body(Map.of(
"success", false,
"error", "libraryId field addition failed: " + e.getMessage()
));
}
}
/**
* Migrate to library-aware Solr schema.
* This endpoint handles the migration from non-library-aware to library-aware indexing.
* It clears existing data and reindexes with library context.
*/
@PostMapping("/solr/migrate-library-schema")
public ResponseEntity<Map<String, Object>> migrateLibrarySchema() {
try {
logger.info("Starting Solr library schema migration");
if (!searchServiceAdapter.isSearchServiceAvailable()) {
return ResponseEntity.badRequest().body(Map.of(
"success", false,
"error", "Solr is not available or healthy"
));
}
if (solrService == null) {
return ResponseEntity.badRequest().body(Map.of(
"success", false,
"error", "Solr service not available"
));
}
logger.info("Adding libraryId field to Solr schema");
// First, add the libraryId field to the schema via Schema API
try {
solrService.addLibraryIdField();
logger.info("libraryId field added successfully to schema");
} catch (Exception e) {
logger.error("Failed to add libraryId field to schema", e);
return ResponseEntity.badRequest().body(Map.of(
"success", false,
"error", "Failed to add libraryId field to schema: " + e.getMessage(),
"details", "The schema must support the libraryId field before migration"
));
}
logger.info("Clearing existing Solr data for library schema migration");
// Clear existing data that doesn't have libraryId
try {
solrService.recreateIndices();
} catch (Exception e) {
logger.warn("Could not recreate indices (expected in production): {}", e.getMessage());
// In production, just clear the data instead
try {
solrService.clearAllDocuments();
logger.info("Cleared all documents from Solr cores");
} catch (Exception clearError) {
logger.error("Failed to clear documents", clearError);
return ResponseEntity.badRequest().body(Map.of(
"success", false,
"error", "Failed to clear existing data: " + clearError.getMessage()
));
}
}
// Get all data and reindex with library context
List<Story> allStories = storyService.findAllWithAssociations();
List<Author> allAuthors = authorService.findAllWithStories();
logger.info("Reindexing {} stories and {} authors with library context",
allStories.size(), allAuthors.size());
// Bulk index everything (will now include libraryId from current library context)
solrService.bulkIndexStories(allStories);
solrService.bulkIndexAuthors(allAuthors);
int totalIndexed = allStories.size() + allAuthors.size();
logger.info("Solr library schema migration completed successfully");
return ResponseEntity.ok(Map.of(
"success", true,
"message", String.format("Library schema migration completed. Reindexed %d stories and %d authors with library context.",
allStories.size(), allAuthors.size()),
"storiesCount", allStories.size(),
"authorsCount", allAuthors.size(),
"totalCount", totalIndexed,
"note", "Ensure libraryId field exists in Solr schema before running this migration"
));
} catch (Exception e) {
logger.error("Error during Solr library schema migration", e);
return ResponseEntity.internalServerError().body(Map.of(
"success", false,
"error", "Library schema migration failed: " + e.getMessage(),
"details", "Make sure the libraryId field has been added to both stories and authors Solr cores"
));
}
}

View File

@@ -291,7 +291,7 @@ public class CollectionController {
// Collections are not indexed in search engine yet
return ResponseEntity.ok(Map.of(
"success", true,
"message", "Collections indexing not yet implemented in Solr",
"count", allCollections.size()
));
} catch (Exception e) {

View File

@@ -3,27 +3,43 @@ package com.storycove.controller;
import com.storycove.dto.HtmlSanitizationConfigDto;
import com.storycove.service.HtmlSanitizationService;
import com.storycove.service.ImageService;
import com.storycove.service.StoryService;
import com.storycove.entity.Story;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.Map;
import java.util.List;
import java.util.HashMap;
import java.util.Optional;
import java.util.UUID;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.Files;
import java.io.IOException;
@RestController
@RequestMapping("/api/config")
public class ConfigController {
private static final Logger logger = LoggerFactory.getLogger(ConfigController.class);
private final HtmlSanitizationService htmlSanitizationService;
private final ImageService imageService;
private final StoryService storyService;
@Value("${app.reading.speed.default:200}")
private int defaultReadingSpeed;
@Autowired
public ConfigController(HtmlSanitizationService htmlSanitizationService, ImageService imageService, StoryService storyService) {
this.htmlSanitizationService = htmlSanitizationService;
this.imageService = imageService;
this.storyService = storyService;
}
/**
@@ -61,27 +77,55 @@ public class ConfigController {
@PostMapping("/cleanup/images/preview")
public ResponseEntity<Map<String, Object>> previewImageCleanup() {
try {
logger.info("Starting image cleanup preview");
ImageService.ContentImageCleanupResult result = imageService.cleanupOrphanedContentImages(true);
// Create detailed file information with story relationships
logger.info("Processing {} orphaned files for detailed information", result.getOrphanedImages().size());
List<Map<String, Object>> orphanedFiles = result.getOrphanedImages().stream()
.map(filePath -> {
try {
return createFileInfo(filePath);
} catch (Exception e) {
logger.error("Error processing file {}: {}", filePath, e.getMessage());
// Return a basic error entry instead of failing completely
Map<String, Object> errorEntry = new HashMap<>();
errorEntry.put("filePath", filePath);
errorEntry.put("fileName", Paths.get(filePath).getFileName().toString());
errorEntry.put("fileSize", 0L);
errorEntry.put("formattedSize", "0 B");
errorEntry.put("storyId", "error");
errorEntry.put("storyTitle", null);
errorEntry.put("storyExists", false);
errorEntry.put("canAccessStory", false);
errorEntry.put("error", e.getMessage());
return errorEntry;
}
})
.toList();
// Use HashMap to avoid Map.of() null value issues
Map<String, Object> response = new HashMap<>();
response.put("success", true);
response.put("orphanedCount", result.getOrphanedImages().size());
response.put("totalSizeBytes", result.getTotalSizeBytes());
response.put("formattedSize", result.getFormattedSize());
response.put("foldersToDelete", result.getFoldersToDelete());
response.put("referencedImagesCount", result.getTotalReferencedImages());
response.put("errors", result.getErrors());
response.put("hasErrors", result.hasErrors());
response.put("dryRun", true);
response.put("orphanedFiles", orphanedFiles);
logger.info("Image cleanup preview completed successfully");
return ResponseEntity.ok(response);
} catch (Exception e) {
logger.error("Failed to preview image cleanup", e);
Map<String, Object> errorResponse = new HashMap<>();
errorResponse.put("success", false);
errorResponse.put("error", "Failed to preview image cleanup: " + (e.getMessage() != null ? e.getMessage() : e.getClass().getSimpleName()));
return ResponseEntity.status(500).body(errorResponse);
}
}
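The switch from `Map.of()` to `HashMap` in the rewritten handler is deliberate: `Map.of()` rejects null values, and fields such as `storyTitle` can legitimately be null. A minimal standalone sketch (class name is illustrative, not part of the repository):

```java
import java.util.HashMap;
import java.util.Map;

public class MapOfNullDemo {
    // Returns true if Map.of rejects the given value with a NullPointerException.
    static boolean mapOfRejects(Object value) {
        try {
            Map.of("storyTitle", value);
            return false;
        } catch (NullPointerException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        // Map.of throws NPE for null values...
        System.out.println(mapOfRejects(null));   // true
        // ...while HashMap happily stores them, which is why the
        // preview response is built with HashMap above.
        Map<String, Object> response = new HashMap<>();
        response.put("storyTitle", null);
        System.out.println(response.containsKey("storyTitle"));   // true
    }
}
```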
@@ -114,4 +158,89 @@ public class ConfigController {
));
}
}
/**
* Create detailed file information for orphaned image including story relationship
*/
private Map<String, Object> createFileInfo(String filePath) {
try {
Path path = Paths.get(filePath);
String fileName = path.getFileName().toString();
long fileSize = Files.exists(path) ? Files.size(path) : 0;
// Extract story UUID from the path (content images are stored in /content/{storyId}/)
String storyId = extractStoryIdFromPath(filePath);
// Look up the story if we have a valid UUID
Story relatedStory = null;
if (storyId != null) {
try {
UUID storyUuid = UUID.fromString(storyId);
relatedStory = storyService.findById(storyUuid);
} catch (Exception e) {
logger.debug("Could not find story with ID {}: {}", storyId, e.getMessage());
}
}
Map<String, Object> fileInfo = new HashMap<>();
fileInfo.put("filePath", filePath);
fileInfo.put("fileName", fileName);
fileInfo.put("fileSize", fileSize);
fileInfo.put("formattedSize", formatBytes(fileSize));
fileInfo.put("storyId", storyId != null ? storyId : "unknown");
fileInfo.put("storyTitle", relatedStory != null ? relatedStory.getTitle() : null);
fileInfo.put("storyExists", relatedStory != null);
fileInfo.put("canAccessStory", relatedStory != null);
return fileInfo;
} catch (Exception e) {
logger.error("Error creating file info for {}: {}", filePath, e.getMessage());
Map<String, Object> errorInfo = new HashMap<>();
errorInfo.put("filePath", filePath);
errorInfo.put("fileName", Paths.get(filePath).getFileName().toString());
errorInfo.put("fileSize", 0L);
errorInfo.put("formattedSize", "0 B");
errorInfo.put("storyId", "error");
errorInfo.put("storyTitle", null);
errorInfo.put("storyExists", false);
errorInfo.put("canAccessStory", false);
errorInfo.put("error", e.getMessage() != null ? e.getMessage() : e.getClass().getSimpleName());
return errorInfo;
}
}
/**
* Extract story ID from content image file path
*/
private String extractStoryIdFromPath(String filePath) {
try {
// Content images are stored in: /path/to/uploads/content/{storyId}/filename.ext
Path path = Paths.get(filePath);
Path parent = path.getParent();
if (parent != null) {
String potentialUuid = parent.getFileName().toString();
// Basic UUID validation (36 characters with dashes in right places)
if (potentialUuid.length() == 36 &&
potentialUuid.charAt(8) == '-' &&
potentialUuid.charAt(13) == '-' &&
potentialUuid.charAt(18) == '-' &&
potentialUuid.charAt(23) == '-') {
return potentialUuid;
}
}
} catch (Exception e) {
// Invalid path or other error
}
return null;
}
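The dash-position heuristic above can be exercised in isolation. A standalone sketch of the same logic (class name and paths are illustrative):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class StoryIdExtractDemo {
    // Mirrors extractStoryIdFromPath: content images live in
    // .../content/{storyId}/filename.ext, so the parent folder name
    // is expected to be a 36-character UUID with dashes at 8/13/18/23.
    static String extractStoryId(String filePath) {
        Path parent = Paths.get(filePath).getParent();
        if (parent == null) return null;
        String candidate = parent.getFileName().toString();
        if (candidate.length() == 36
                && candidate.charAt(8) == '-'
                && candidate.charAt(13) == '-'
                && candidate.charAt(18) == '-'
                && candidate.charAt(23) == '-') {
            return candidate;
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println(extractStoryId(
            "/uploads/content/123e4567-e89b-12d3-a456-426614174000/cover.jpg"));
        // -> 123e4567-e89b-12d3-a456-426614174000
        System.out.println(extractStoryId("/uploads/content/not-a-uuid/cover.jpg"));
        // -> null
    }
}
```

Note this only checks dash positions, not hex digits; `UUID.fromString` in `createFileInfo` does the strict parse afterwards.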
/**
* Format file size in human readable format
*/
private String formatBytes(long bytes) {
if (bytes < 1024) return bytes + " B";
if (bytes < 1024 * 1024) return String.format("%.1f KB", bytes / 1024.0);
if (bytes < 1024 * 1024 * 1024) return String.format("%.1f MB", bytes / (1024.0 * 1024.0));
return String.format("%.1f GB", bytes / (1024.0 * 1024.0 * 1024.0));
}
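`formatBytes` uses binary (1024-based) units. A standalone sketch of the same thresholds (class name is illustrative; note `String.format("%.1f", …)` is locale-sensitive, so the decimal separator may vary):

```java
public class FormatBytesDemo {
    // Same thresholds as ConfigController.formatBytes: B < 1024,
    // KB < 1024^2, MB < 1024^3, otherwise GB.
    static String formatBytes(long bytes) {
        if (bytes < 1024) return bytes + " B";
        if (bytes < 1024 * 1024) return String.format("%.1f KB", bytes / 1024.0);
        if (bytes < 1024L * 1024 * 1024) return String.format("%.1f MB", bytes / (1024.0 * 1024.0));
        return String.format("%.1f GB", bytes / (1024.0 * 1024.0 * 1024.0));
    }

    public static void main(String[] args) {
        System.out.println(formatBytes(512));        // 512 B
        System.out.println(formatBytes(2048));       // 2.0 KB (locale-dependent separator)
        System.out.println(formatBytes(5_242_880)); // 5.0 MB (locale-dependent separator)
    }
}
```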
}

View File

@@ -2,6 +2,8 @@ package com.storycove.controller;
import com.storycove.service.ImageService;
import com.storycove.service.LibraryService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.core.io.FileSystemResource;
import org.springframework.core.io.Resource;
import org.springframework.http.HttpHeaders;
@@ -21,6 +23,7 @@ import java.util.Map;
@RestController
@RequestMapping("/api/files")
public class FileController {
private static final Logger log = LoggerFactory.getLogger(FileController.class);
private final ImageService imageService;
private final LibraryService libraryService;
@@ -32,7 +35,7 @@ public class FileController {
private String getCurrentLibraryId() {
String libraryId = libraryService.getCurrentLibraryId();
log.debug("FileController - Current Library ID: {}", libraryId);
return libraryId != null ? libraryId : "default";
}
@@ -48,7 +51,7 @@ public class FileController {
String imageUrl = "/api/files/images/" + currentLibraryId + "/" + imagePath;
response.put("url", imageUrl);
log.debug("Upload response - path: {}, url: {}", imagePath, imageUrl);
return ResponseEntity.ok(response);
} catch (IllegalArgumentException e) {

View File

@@ -12,9 +12,7 @@ import com.storycove.service.*;
import jakarta.validation.Valid;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
@@ -46,6 +44,8 @@ public class StoryController {
private final ReadingTimeService readingTimeService;
private final EPUBImportService epubImportService;
private final EPUBExportService epubExportService;
private final AsyncImageProcessingService asyncImageProcessingService;
private final ImageProcessingProgressService progressService;
public StoryController(StoryService storyService,
AuthorService authorService,
@@ -56,7 +56,9 @@ public class StoryController {
SearchServiceAdapter searchServiceAdapter,
ReadingTimeService readingTimeService,
EPUBImportService epubImportService,
EPUBExportService epubExportService,
AsyncImageProcessingService asyncImageProcessingService,
ImageProcessingProgressService progressService) {
this.storyService = storyService;
this.authorService = authorService;
this.seriesService = seriesService;
@@ -67,6 +69,8 @@ public class StoryController {
this.readingTimeService = readingTimeService;
this.epubImportService = epubImportService;
this.epubExportService = epubExportService;
this.asyncImageProcessingService = asyncImageProcessingService;
this.progressService = progressService;
}
@GetMapping
@@ -146,6 +150,10 @@ public class StoryController {
updateStoryFromRequest(story, request);
Story savedStory = storyService.createWithTagNames(story, request.getTagNames());
// Process external images in content after saving
savedStory = processExternalImagesIfNeeded(savedStory);
logger.info("Successfully created story: {} (ID: {})", savedStory.getTitle(), savedStory.getId());
return ResponseEntity.status(HttpStatus.CREATED).body(convertToDto(savedStory));
}
@@ -163,6 +171,10 @@ public class StoryController {
}
Story updatedStory = storyService.updateWithTagNames(id, request);
// Process external images in content after saving
updatedStory = processExternalImagesIfNeeded(updatedStory);
logger.info("Successfully updated story: {}", updatedStory.getTitle());
return ResponseEntity.ok(convertToDto(updatedStory));
}
@@ -474,7 +486,9 @@ public class StoryController {
story.setTitle(createReq.getTitle());
story.setSummary(createReq.getSummary());
story.setDescription(createReq.getDescription());
story.setContentHtml(sanitizationService.sanitize(createReq.getContentHtml()));
story.setSourceUrl(createReq.getSourceUrl());
story.setVolume(createReq.getVolume());
@@ -707,6 +721,50 @@ public class StoryController {
return dto;
}
private Story processExternalImagesIfNeeded(Story story) {
try {
if (story.getContentHtml() != null && !story.getContentHtml().trim().isEmpty()) {
logger.debug("Starting async image processing for story: {}", story.getId());
// Start async processing - this returns immediately
asyncImageProcessingService.processStoryImagesAsync(story.getId(), story.getContentHtml());
logger.info("Async image processing started for story: {}", story.getId());
}
} catch (Exception e) {
logger.error("Failed to start async image processing for story {}: {}",
story.getId(), e.getMessage(), e);
// Don't fail the entire operation if image processing fails
}
return story;
}
@GetMapping("/{id}/image-processing-progress")
public ResponseEntity<Map<String, Object>> getImageProcessingProgress(@PathVariable UUID id) {
ImageProcessingProgressService.ImageProcessingProgress progress = progressService.getProgress(id);
if (progress == null) {
return ResponseEntity.ok(Map.of(
"isProcessing", false,
"message", "No active image processing"
));
}
Map<String, Object> response = Map.of(
"isProcessing", !progress.isCompleted(),
"totalImages", progress.getTotalImages(),
"processedImages", progress.getProcessedImages(),
"currentImageUrl", progress.getCurrentImageUrl() != null ? progress.getCurrentImageUrl() : "",
"status", progress.getStatus(),
"progressPercentage", progress.getProgressPercentage(),
"completed", progress.isCompleted(),
"error", progress.getErrorMessage() != null ? progress.getErrorMessage() : ""
);
return ResponseEntity.ok(response);
}
@GetMapping("/check-duplicate")
public ResponseEntity<Map<String, Object>> checkDuplicate(
@RequestParam String title,

View File

@@ -34,6 +34,18 @@ public class SearchResultDto<T> {
this.facets = facets;
}
// Simple constructor for basic search results with facet list
public SearchResultDto(List<T> results, long totalHits, int resultCount, List<FacetCountDto> facetsList) {
this.results = results;
this.totalHits = totalHits;
this.page = 0;
this.perPage = resultCount;
this.query = "";
this.searchTimeMs = 0;
// Convert list to map if needed - for now just set empty map
this.facets = java.util.Collections.emptyMap();
}
// Getters and Setters
public List<T> getResults() {
return results;

View File

@@ -0,0 +1,34 @@
package com.storycove.event;
import org.springframework.context.ApplicationEvent;
import java.util.UUID;
/**
* Event published when a story's content is created or updated
*/
public class StoryContentUpdatedEvent extends ApplicationEvent {
private final UUID storyId;
private final String contentHtml;
private final boolean isNewStory;
public StoryContentUpdatedEvent(Object source, UUID storyId, String contentHtml, boolean isNewStory) {
super(source);
this.storyId = storyId;
this.contentHtml = contentHtml;
this.isNewStory = isNewStory;
}
public UUID getStoryId() {
return storyId;
}
public String getContentHtml() {
return contentHtml;
}
public boolean isNewStory() {
return isNewStory;
}
}

View File

@@ -0,0 +1,122 @@
package com.storycove.service;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;
import java.util.UUID;
import java.util.concurrent.CompletableFuture;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
@Service
public class AsyncImageProcessingService {
private static final Logger logger = LoggerFactory.getLogger(AsyncImageProcessingService.class);
private final ImageService imageService;
private final StoryService storyService;
private final ImageProcessingProgressService progressService;
@Autowired
public AsyncImageProcessingService(ImageService imageService,
StoryService storyService,
ImageProcessingProgressService progressService) {
this.imageService = imageService;
this.storyService = storyService;
this.progressService = progressService;
}
@Async
public CompletableFuture<Void> processStoryImagesAsync(UUID storyId, String contentHtml) {
logger.info("Starting async image processing for story: {}", storyId);
try {
// Count external images first
int externalImageCount = countExternalImages(contentHtml);
if (externalImageCount == 0) {
logger.debug("No external images found for story {}", storyId);
return CompletableFuture.completedFuture(null);
}
// Start progress tracking
ImageProcessingProgressService.ImageProcessingProgress progress =
progressService.startProgress(storyId, externalImageCount);
// Process images with progress updates
ImageService.ContentImageProcessingResult result =
processImagesWithProgress(contentHtml, storyId, progress);
// Update story with processed content if changed
if (!result.getProcessedContent().equals(contentHtml)) {
progressService.updateProgress(storyId, progress.getTotalImages(),
"Saving processed content", "Updating story content");
storyService.updateContentOnly(storyId, result.getProcessedContent());
progressService.completeProgress(storyId,
String.format("Completed: %d images processed", result.getDownloadedImages().size()));
logger.info("Async image processing completed for story {}: {} images processed",
storyId, result.getDownloadedImages().size());
} else {
progressService.completeProgress(storyId, "Completed: No images needed processing");
}
// Clean up progress after a delay to allow frontend to see completion
CompletableFuture.runAsync(() -> {
try {
Thread.sleep(5000); // 5 seconds delay
progressService.removeProgress(storyId);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
});
} catch (Exception e) {
logger.error("Async image processing failed for story {}: {}", storyId, e.getMessage(), e);
progressService.setError(storyId, e.getMessage());
}
return CompletableFuture.completedFuture(null);
}
private int countExternalImages(String contentHtml) {
if (contentHtml == null || contentHtml.trim().isEmpty()) {
return 0;
}
Pattern imgPattern = Pattern.compile("<img[^>]+src=[\"']([^\"']+)[\"'][^>]*>", Pattern.CASE_INSENSITIVE);
Matcher matcher = imgPattern.matcher(contentHtml);
int count = 0;
while (matcher.find()) {
String src = matcher.group(1);
if (isExternalUrl(src)) {
count++;
}
}
return count;
}
private boolean isExternalUrl(String url) {
return url != null &&
(url.startsWith("http://") || url.startsWith("https://")) &&
!url.contains("/api/files/images/");
}
private ImageService.ContentImageProcessingResult processImagesWithProgress(
String contentHtml, UUID storyId, ImageProcessingProgressService.ImageProcessingProgress progress) {
// Use a custom version of processContentImages that provides progress callbacks
return imageService.processContentImagesWithProgress(contentHtml, storyId,
(currentUrl, processedCount, totalCount) -> {
progressService.updateProgress(storyId, processedCount, currentUrl,
String.format("Processing image %d of %d", processedCount + 1, totalCount));
});
}
}
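As a standalone sketch of the technique above: `countExternalImages` pairs the `<img src>` regex with the http(s)-but-not-local heuristic from `isExternalUrl`. The class below reproduces that logic outside Spring so it can be exercised in isolation; the class name `ExternalImageCounter` is made up for illustration and is not part of this diff.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Standalone sketch of countExternalImages/isExternalUrl above: the same
// <img src> regex, counting only http(s) sources that are not already
// served from /api/files/images/.
public class ExternalImageCounter {
    private static final Pattern IMG = Pattern.compile(
            "<img[^>]+src=[\"']([^\"']+)[\"'][^>]*>", Pattern.CASE_INSENSITIVE);

    public static int count(String html) {
        if (html == null || html.trim().isEmpty()) return 0;
        Matcher m = IMG.matcher(html);
        int n = 0;
        while (m.find()) {
            String src = m.group(1);
            boolean external = (src.startsWith("http://") || src.startsWith("https://"))
                    && !src.contains("/api/files/images/");
            if (external) n++;
        }
        return n;
    }

    public static void main(String[] args) {
        String html = "<p><img src=\"https://example.com/a.png\">"
                + "<IMG src='/api/files/images/b.png'></p>";
        // Only the first image counts: the second is already local.
        System.out.println(count(html)); // prints 1
    }
}
```

Note the regex matches both quote styles and is case-insensitive, matching the behavior of the service method.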

View File

@@ -132,7 +132,7 @@ public class AuthorService {
validateAuthorForCreate(author);
Author savedAuthor = authorRepository.save(author);
// Index in Solr
searchServiceAdapter.indexAuthor(savedAuthor);
return savedAuthor;
@@ -150,7 +150,7 @@ public class AuthorService {
updateAuthorFields(existingAuthor, authorUpdates);
Author savedAuthor = authorRepository.save(existingAuthor);
// Update in Solr
searchServiceAdapter.updateAuthor(savedAuthor);
return savedAuthor;
@@ -166,7 +166,7 @@ public class AuthorService {
authorRepository.delete(author);
// Remove from Solr
searchServiceAdapter.deleteAuthor(id);
}
@@ -175,7 +175,7 @@ public class AuthorService {
author.addUrl(url);
Author savedAuthor = authorRepository.save(author);
// Update in Solr
searchServiceAdapter.updateAuthor(savedAuthor);
return savedAuthor;
@@ -186,7 +186,7 @@ public class AuthorService {
author.removeUrl(url);
Author savedAuthor = authorRepository.save(author);
// Update in Solr
searchServiceAdapter.updateAuthor(savedAuthor);
return savedAuthor;
@@ -221,7 +221,7 @@ public class AuthorService {
logger.debug("Saved author rating: {} for author: {}",
refreshedAuthor.getAuthorRating(), refreshedAuthor.getName());
// Update in Solr
searchServiceAdapter.updateAuthor(refreshedAuthor);
return refreshedAuthor;
@@ -265,7 +265,7 @@ public class AuthorService {
author.setAvatarImagePath(avatarPath);
Author savedAuthor = authorRepository.save(author);
// Update in Solr
searchServiceAdapter.updateAuthor(savedAuthor);
return savedAuthor;
@@ -276,7 +276,7 @@ public class AuthorService {
author.setAvatarImagePath(null);
Author savedAuthor = authorRepository.save(author);
// Update in Solr
searchServiceAdapter.updateAuthor(savedAuthor);
return savedAuthor;

View File

@@ -55,8 +55,8 @@ public class CollectionService {
*/
public SearchResultDto<Collection> searchCollections(String query, List<String> tags, boolean includeArchived, int page, int limit) {
// Collections are currently handled at database level, not indexed in search engine
// Return empty result for now as collections search is not implemented in Solr
logger.warn("Collections search not yet implemented in Solr, returning empty results");
return new SearchResultDto<>(new ArrayList<>(), 0, page, limit, query != null ? query : "", 0);
}

View File

@@ -70,6 +70,75 @@ public class DatabaseManagementService implements ApplicationContextAware {
this.applicationContext = applicationContext;
}
// Helper methods to extract database connection details
private String extractDatabaseUrl() {
try (Connection connection = getDataSource().getConnection()) {
return connection.getMetaData().getURL();
} catch (SQLException e) {
throw new RuntimeException("Failed to extract database URL", e);
}
}
private String extractDatabaseHost() {
String url = extractDatabaseUrl();
// Extract host from jdbc:postgresql://host:port/database
if (url.startsWith("jdbc:postgresql://")) {
String hostPort = url.substring("jdbc:postgresql://".length());
if (hostPort.contains("/")) {
hostPort = hostPort.substring(0, hostPort.indexOf("/"));
}
if (hostPort.contains(":")) {
return hostPort.substring(0, hostPort.indexOf(":"));
}
return hostPort;
}
return "localhost"; // fallback
}
private String extractDatabasePort() {
String url = extractDatabaseUrl();
// Extract port from jdbc:postgresql://host:port/database
if (url.startsWith("jdbc:postgresql://")) {
String hostPort = url.substring("jdbc:postgresql://".length());
if (hostPort.contains("/")) {
hostPort = hostPort.substring(0, hostPort.indexOf("/"));
}
if (hostPort.contains(":")) {
return hostPort.substring(hostPort.indexOf(":") + 1);
}
}
return "5432"; // default PostgreSQL port
}
private String extractDatabaseName() {
String url = extractDatabaseUrl();
// Extract database name from jdbc:postgresql://host:port/database
if (url.startsWith("jdbc:postgresql://")) {
String remaining = url.substring("jdbc:postgresql://".length());
if (remaining.contains("/")) {
String dbPart = remaining.substring(remaining.indexOf("/") + 1);
// Remove any query parameters
if (dbPart.contains("?")) {
dbPart = dbPart.substring(0, dbPart.indexOf("?"));
}
return dbPart;
}
}
return "storycove"; // fallback
}
private String extractDatabaseUsername() {
// Get from environment variable or default
return System.getenv("SPRING_DATASOURCE_USERNAME") != null ?
System.getenv("SPRING_DATASOURCE_USERNAME") : "storycove";
}
private String extractDatabasePassword() {
// Get from environment variable or default
return System.getenv("SPRING_DATASOURCE_PASSWORD") != null ?
System.getenv("SPRING_DATASOURCE_PASSWORD") : "password";
}
/**
* Create a comprehensive backup including database and files in ZIP format
*/
@@ -97,6 +166,7 @@ public class DatabaseManagementService implements ApplicationContextAware {
/**
* Restore from complete backup (ZIP format)
*/
@Transactional(timeout = 1800) // 30 minutes timeout for large backup restores
public void restoreFromCompleteBackup(InputStream backupStream) throws IOException, SQLException {
String currentLibraryId = libraryService.getCurrentLibraryId();
System.err.println("Starting complete backup restore for library: " + currentLibraryId);
@@ -171,157 +241,177 @@ public class DatabaseManagementService implements ApplicationContextAware {
}
public Resource createBackup() throws SQLException, IOException {
// Use PostgreSQL's native pg_dump for reliable backup
String dbHost = extractDatabaseHost();
String dbPort = extractDatabasePort();
String dbName = extractDatabaseName();
String dbUser = extractDatabaseUsername();
String dbPassword = extractDatabasePassword();
// Create temporary file for backup
Path tempBackupFile = Files.createTempFile("storycove_backup_", ".sql");
try {
// Build pg_dump command
ProcessBuilder pb = new ProcessBuilder(
"pg_dump",
"--host=" + dbHost,
"--port=" + dbPort,
"--username=" + dbUser,
"--dbname=" + dbName,
"--no-password",
"--verbose",
"--clean",
"--if-exists",
"--create",
"--file=" + tempBackupFile.toString()
);
// Set PGPASSWORD environment variable
Map<String, String> env = pb.environment();
env.put("PGPASSWORD", dbPassword);
System.err.println("Starting PostgreSQL backup using pg_dump...");
Process process = pb.start();
// Capture output
try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getErrorStream()))) {
String line;
while ((line = reader.readLine()) != null) {
System.err.println("pg_dump: " + line);
}
}
int exitCode = process.waitFor();
if (exitCode != 0) {
throw new RuntimeException("pg_dump failed with exit code: " + exitCode);
}
System.err.println("PostgreSQL backup completed successfully");
// Read the backup file into memory
byte[] backupData = Files.readAllBytes(tempBackupFile);
return new ByteArrayResource(backupData);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
throw new RuntimeException("Backup process was interrupted", e);
} finally {
// Clean up temporary file
try {
Files.deleteIfExists(tempBackupFile);
} catch (IOException e) {
System.err.println("Warning: Could not delete temporary backup file: " + e.getMessage());
}
}
}
@Transactional(timeout = 1800) // 30 minutes timeout for large backup restores
public void restoreFromBackup(InputStream backupStream) throws IOException, SQLException {
// Use PostgreSQL's native psql for reliable restore
String dbHost = extractDatabaseHost();
String dbPort = extractDatabasePort();
String dbName = extractDatabaseName();
String dbUser = extractDatabaseUsername();
String dbPassword = extractDatabasePassword();
// Create temporary file for the backup
Path tempBackupFile = Files.createTempFile("storycove_restore_", ".sql");
try {
// Write backup stream to temporary file
System.err.println("Writing backup data to temporary file...");
try (InputStream input = backupStream;
OutputStream output = Files.newOutputStream(tempBackupFile)) {
byte[] buffer = new byte[8192];
int bytesRead;
while ((bytesRead = input.read(buffer)) != -1) {
output.write(buffer, 0, bytesRead);
}
}
System.err.println("Starting PostgreSQL restore using psql...");
// Build psql command to restore the backup
ProcessBuilder pb = new ProcessBuilder(
"psql",
"--host=" + dbHost,
"--port=" + dbPort,
"--username=" + dbUser,
"--dbname=" + dbName,
"--no-password",
"--echo-errors",
"--file=" + tempBackupFile.toString()
);
// Set PGPASSWORD environment variable
Map<String, String> env = pb.environment();
env.put("PGPASSWORD", dbPassword);
Process process = pb.start();
// Capture output
try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getErrorStream()));
BufferedReader outputReader = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
// Read stderr in a separate thread
Thread errorThread = new Thread(() -> {
try {
String line;
while ((line = reader.readLine()) != null) {
System.err.println("psql stderr: " + line);
}
} catch (IOException e) {
System.err.println("Error reading psql stderr: " + e.getMessage());
}
});
errorThread.start();
// Read stdout
String line;
while ((line = outputReader.readLine()) != null) {
System.err.println("psql stdout: " + line);
}
errorThread.join();
}
int exitCode = process.waitFor();
if (exitCode != 0) {
throw new RuntimeException("psql restore failed with exit code: " + exitCode);
}
System.err.println("PostgreSQL restore completed successfully");
// Reindex search after successful restore
try {
String currentLibraryId = libraryService.getCurrentLibraryId();
System.err.println("Starting search reindex after successful restore for library: " + currentLibraryId);
if (currentLibraryId == null) {
System.err.println("ERROR: No current library set during restore - cannot reindex search!");
throw new IllegalStateException("No current library active during restore");
}
// Manually trigger reindexing using the correct database connection
System.err.println("Triggering manual reindex from library-specific database for library: " + currentLibraryId);
reindexStoriesAndAuthorsFromCurrentDatabase();
// Note: Collections collection will be recreated when needed by the service
System.err.println("Search reindex completed successfully for library: " + currentLibraryId);
} catch (Exception e) {
// Log the error but don't fail the restore
System.err.println("Warning: Failed to reindex search after restore: " + e.getMessage());
e.printStackTrace();
}
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
throw new RuntimeException("Restore process was interrupted", e);
} finally {
// Clean up temporary file
try {
Files.deleteIfExists(tempBackupFile);
} catch (IOException e) {
System.err.println("Warning: Could not delete temporary restore file: " + e.getMessage());
}
}
}
@@ -449,7 +539,7 @@ public class DatabaseManagementService implements ApplicationContextAware {
/**
* Clear all data AND files (for complete restore)
*/
@Transactional(timeout = 600) // 10 minutes timeout for clearing large datasets
public int clearAllDataAndFiles() {
// First clear the database
int totalDeleted = clearAllData();
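The new `extractDatabaseHost`/`extractDatabasePort`/`extractDatabaseName` helpers parse a `jdbc:postgresql://host:port/database?params` URL by hand. The same parsing can be sketched more compactly with `java.net.URI` once the `jdbc:` prefix is stripped; the class name `JdbcUrlParser` below is hypothetical and not part of this diff.

```java
import java.net.URI;

// Sketch of the JDBC URL parsing done by the extractDatabase* helpers,
// using java.net.URI instead of manual substring logic. The "jdbc:"
// prefix must be stripped first, since it is not a valid URI scheme part.
public class JdbcUrlParser {
    public static String host(String jdbcUrl) {
        return URI.create(jdbcUrl.substring("jdbc:".length())).getHost();
    }

    public static int port(String jdbcUrl) {
        int p = URI.create(jdbcUrl.substring("jdbc:".length())).getPort();
        return p == -1 ? 5432 : p; // default PostgreSQL port when omitted
    }

    public static String database(String jdbcUrl) {
        // getPath() excludes query parameters; drop the leading slash
        String path = URI.create(jdbcUrl.substring("jdbc:".length())).getPath();
        return path.startsWith("/") ? path.substring(1) : path;
    }

    public static void main(String[] args) {
        String url = "jdbc:postgresql://db:5432/storycove?ssl=false";
        System.out.println(host(url) + " " + port(url) + " " + database(url));
        // prints: db 5432 storycove
    }
}
```

Leaning on `URI` avoids the repeated `indexOf`/`substring` bookkeeping and handles the query-string stripping for free.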

View File

@@ -16,6 +16,8 @@ import nl.siegmann.epublib.epub.EpubReader;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
@@ -30,6 +32,7 @@ import java.util.Optional;
@Service
@Transactional
public class EPUBImportService {
private static final Logger log = LoggerFactory.getLogger(EPUBImportService.class);
private final StoryService storyService;
private final AuthorService authorService;
@@ -87,12 +90,12 @@ public class EPUBImportService {
savedStory = storyService.update(savedStory.getId(), savedStory);
// Log the image processing results
log.debug("EPUB Import - Image processing completed for story {}. Downloaded {} images.",
savedStory.getId(), imageResult.getDownloadedImages().size());
if (imageResult.hasWarnings()) {
log.debug("EPUB Import - Image processing warnings: {}",
String.join(", ", imageResult.getWarnings()));
}
}
} catch (Exception e) {
@@ -282,7 +285,7 @@ public class EPUBImportService {
if (language != null && !language.trim().isEmpty()) {
// Store as metadata in story description if needed
// For now, we'll just log it for potential future use
log.debug("EPUB Language: {}", language);
}
// Extract publisher information
@@ -290,14 +293,14 @@ public class EPUBImportService {
if (publishers != null && !publishers.isEmpty()) {
String publisher = publishers.get(0);
// Could append to description or store separately in future
log.debug("EPUB Publisher: {}", publisher);
}
// Extract publication date
List<nl.siegmann.epublib.domain.Date> dates = metadata.getDates();
if (dates != null && !dates.isEmpty()) {
for (nl.siegmann.epublib.domain.Date date : dates) {
log.debug("EPUB Date ({}): {}", date.getEvent(), date.getValue());
}
}
@@ -305,7 +308,7 @@ public class EPUBImportService {
List<nl.siegmann.epublib.domain.Identifier> identifiers = metadata.getIdentifiers();
if (identifiers != null && !identifiers.isEmpty()) {
for (nl.siegmann.epublib.domain.Identifier identifier : identifiers) {
log.debug("EPUB Identifier ({}): {}", identifier.getScheme(), identifier.getValue());
}
}
}

View File

@@ -137,12 +137,63 @@ public class HtmlSanitizationService {
return config;
}
/**
* Preprocess HTML to extract images from figure tags before sanitization
*/
private String preprocessFigureTags(String html) {
if (html == null || html.trim().isEmpty()) {
return html;
}
try {
org.jsoup.nodes.Document doc = Jsoup.parse(html);
org.jsoup.select.Elements figures = doc.select("figure");
for (org.jsoup.nodes.Element figure : figures) {
// Find img tags within the figure
org.jsoup.select.Elements images = figure.select("img");
if (!images.isEmpty()) {
// Extract the first image and replace the figure with it
org.jsoup.nodes.Element img = images.first();
// Check if there's a figcaption to preserve as alt text
org.jsoup.select.Elements figcaptions = figure.select("figcaption");
if (!figcaptions.isEmpty() && !img.hasAttr("alt")) {
String captionText = figcaptions.first().text();
if (captionText != null && !captionText.trim().isEmpty()) {
img.attr("alt", captionText);
}
}
// Replace the figure element with just the img
figure.replaceWith(img.clone());
logger.debug("Extracted image from figure tag: {}", img.attr("src"));
} else {
// No images in figure, remove it entirely
figure.remove();
logger.debug("Removed figure tag without images");
}
}
return doc.body().html();
} catch (Exception e) {
logger.warn("Failed to preprocess figure tags, returning original HTML: {}", e.getMessage());
return html;
}
}
public String sanitize(String html) {
if (html == null || html.trim().isEmpty()) {
return "";
}
logger.info("Content before sanitization: " + html);
// Preprocess to extract images from figure tags
String preprocessed = preprocessFigureTags(html);
String sanitizedHtml = Jsoup.clean(preprocessed, allowlist.preserveRelativeLinks(true));
logger.info("Content after sanitization: " + sanitizedHtml);
return sanitizedHtml;
}

View File

@@ -0,0 +1,108 @@
package com.storycove.service;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
@Service
public class ImageProcessingProgressService {
private static final Logger logger = LoggerFactory.getLogger(ImageProcessingProgressService.class);
private final Map<UUID, ImageProcessingProgress> progressMap = new ConcurrentHashMap<>();
public static class ImageProcessingProgress {
private final UUID storyId;
private final int totalImages;
private volatile int processedImages;
private volatile String currentImageUrl;
private volatile String status;
private volatile boolean completed;
private volatile String errorMessage;
public ImageProcessingProgress(UUID storyId, int totalImages) {
this.storyId = storyId;
this.totalImages = totalImages;
this.processedImages = 0;
this.status = "Starting";
this.completed = false;
}
// Getters
public UUID getStoryId() { return storyId; }
public int getTotalImages() { return totalImages; }
public int getProcessedImages() { return processedImages; }
public String getCurrentImageUrl() { return currentImageUrl; }
public String getStatus() { return status; }
public boolean isCompleted() { return completed; }
public String getErrorMessage() { return errorMessage; }
public double getProgressPercentage() {
return totalImages > 0 ? (double) processedImages / totalImages * 100 : 100;
}
// Setters
public void setProcessedImages(int processedImages) { this.processedImages = processedImages; }
public void setCurrentImageUrl(String currentImageUrl) { this.currentImageUrl = currentImageUrl; }
public void setStatus(String status) { this.status = status; }
public void setCompleted(boolean completed) { this.completed = completed; }
public void setErrorMessage(String errorMessage) { this.errorMessage = errorMessage; }
public void incrementProcessed() {
this.processedImages++;
}
}
public ImageProcessingProgress startProgress(UUID storyId, int totalImages) {
ImageProcessingProgress progress = new ImageProcessingProgress(storyId, totalImages);
progressMap.put(storyId, progress);
logger.info("Started image processing progress tracking for story {} with {} images", storyId, totalImages);
return progress;
}
public ImageProcessingProgress getProgress(UUID storyId) {
return progressMap.get(storyId);
}
public void updateProgress(UUID storyId, int processedImages, String currentImageUrl, String status) {
ImageProcessingProgress progress = progressMap.get(storyId);
if (progress != null) {
progress.setProcessedImages(processedImages);
progress.setCurrentImageUrl(currentImageUrl);
progress.setStatus(status);
logger.debug("Updated progress for story {}: {}/{} - {}", storyId, processedImages, progress.getTotalImages(), status);
}
}
public void completeProgress(UUID storyId, String finalStatus) {
ImageProcessingProgress progress = progressMap.get(storyId);
if (progress != null) {
progress.setCompleted(true);
progress.setStatus(finalStatus);
logger.info("Completed image processing for story {}: {}", storyId, finalStatus);
}
}
public void setError(UUID storyId, String errorMessage) {
ImageProcessingProgress progress = progressMap.get(storyId);
if (progress != null) {
progress.setErrorMessage(errorMessage);
progress.setStatus("Error: " + errorMessage);
progress.setCompleted(true);
logger.error("Image processing error for story {}: {}", storyId, errorMessage);
}
}
public void removeProgress(UUID storyId) {
progressMap.remove(storyId);
logger.debug("Removed progress tracking for story {}", storyId);
}
public boolean isProcessing(UUID storyId) {
ImageProcessingProgress progress = progressMap.get(storyId);
return progress != null && !progress.isCompleted();
}
}
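The service above keeps one mutable progress record per story in a `ConcurrentHashMap`. A minimal sketch of the same lifecycle and percentage math (`ProgressSketch` and its `int[]` record are illustrative simplifications, not part of the service):

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

// Stand-in for ImageProcessingProgressService: start, increment, compute percentage.
public class ProgressSketch {
    // Each record holds {processedImages, totalImages}.
    static final Map<UUID, int[]> progress = new ConcurrentHashMap<>();

    static void start(UUID id, int total) { progress.put(id, new int[]{0, total}); }

    static void increment(UUID id) { progress.get(id)[0]++; }

    // Mirrors getProgressPercentage(): reports 100% when there is nothing to do.
    static double percentage(UUID id) {
        int[] p = progress.get(id);
        return p[1] > 0 ? (double) p[0] / p[1] * 100 : 100;
    }

    public static void main(String[] args) {
        UUID story = UUID.randomUUID();
        start(story, 4);
        increment(story);
        increment(story);
        System.out.println(percentage(story)); // 50.0
    }
}
```

Keeping the fields `volatile` in the real service matters because the async worker writes while a polling endpoint reads.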

View File

@@ -4,6 +4,8 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.event.EventListener;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;
import org.springframework.web.multipart.MultipartFile;
@@ -21,6 +23,8 @@ import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import com.storycove.event.StoryContentUpdatedEvent;
@Service
public class ImageService {
@@ -43,6 +47,12 @@ public class ImageService {
@Autowired
private StoryService storyService;
@Autowired
private AuthorService authorService;
@Autowired
private CollectionService collectionService;
private String getUploadDir() {
String libraryPath = libraryService.getCurrentImagePath();
return baseUploadDir + libraryPath;
@@ -248,14 +258,14 @@ public class ImageService {
* Process HTML content and download all referenced images, replacing URLs with local paths
*/
public ContentImageProcessingResult processContentImages(String htmlContent, UUID storyId) {
logger.debug("Processing content images for story: {}, content length: {}", storyId,
htmlContent != null ? htmlContent.length() : 0);
List<String> warnings = new ArrayList<>();
List<String> downloadedImages = new ArrayList<>();
if (htmlContent == null || htmlContent.trim().isEmpty()) {
logger.debug("No content to process for story: {}", storyId);
return new ContentImageProcessingResult(htmlContent, warnings, downloadedImages);
}
@@ -273,18 +283,18 @@ public class ImageService {
String imageUrl = matcher.group(1);
imageCount++;
logger.debug("Found image #{}: {} in tag: {}", imageCount, imageUrl, fullImgTag);
try {
// Skip if it's already a local path or data URL
if (imageUrl.startsWith("/") || imageUrl.startsWith("data:")) {
logger.debug("Skipping local/data URL: {}", imageUrl);
matcher.appendReplacement(processedContent, Matcher.quoteReplacement(fullImgTag));
continue;
}
externalImageCount++;
logger.debug("Processing external image #{}: {}", externalImageCount, imageUrl);
// Download and store the image
String localPath = downloadImageFromUrl(imageUrl, storyId);
@@ -292,7 +302,7 @@ public class ImageService {
// Generate local URL
String localUrl = getLocalImageUrl(storyId, localPath);
logger.debug("Downloaded image: {} -> {}", imageUrl, localUrl);
// Replace the src attribute with the local path - handle both single and double quotes
String newImgTag = fullImgTag
@@ -305,7 +315,7 @@ public class ImageService {
newImgTag = fullImgTag.replaceAll("src\\s*=\\s*[\"']?" + Pattern.quote(imageUrl) + "[\"']?", "src=\"" + localUrl + "\"");
}
logger.debug("Replaced img tag: {} -> {}", fullImgTag, newImgTag);
matcher.appendReplacement(processedContent, Matcher.quoteReplacement(newImgTag));
} catch (Exception e) {
@@ -324,6 +334,101 @@ public class ImageService {
return new ContentImageProcessingResult(processedContent.toString(), warnings, downloadedImages);
}
/**
* Functional interface for progress callbacks during image processing
*/
@FunctionalInterface
public interface ImageProcessingProgressCallback {
void onProgress(String currentImageUrl, int processedCount, int totalCount);
}
/**
* Process content images with progress callbacks for async processing
*/
public ContentImageProcessingResult processContentImagesWithProgress(String htmlContent, UUID storyId, ImageProcessingProgressCallback progressCallback) {
logger.debug("Processing content images with progress for story: {}, content length: {}", storyId,
htmlContent != null ? htmlContent.length() : 0);
List<String> warnings = new ArrayList<>();
List<String> downloadedImages = new ArrayList<>();
if (htmlContent == null || htmlContent.trim().isEmpty()) {
logger.debug("No content to process for story: {}", storyId);
return new ContentImageProcessingResult(htmlContent, warnings, downloadedImages);
}
// Find all img tags with src attributes
Pattern imgPattern = Pattern.compile("<img[^>]+src=[\"']([^\"']+)[\"'][^>]*>", Pattern.CASE_INSENSITIVE);
Matcher matcher = imgPattern.matcher(htmlContent);
// First pass: count external images
List<String> externalImages = new ArrayList<>();
Matcher countMatcher = imgPattern.matcher(htmlContent);
while (countMatcher.find()) {
String imageUrl = countMatcher.group(1);
if (!imageUrl.startsWith("/") && !imageUrl.startsWith("data:")) {
externalImages.add(imageUrl);
}
}
int totalExternalImages = externalImages.size();
int processedCount = 0;
StringBuffer processedContent = new StringBuffer();
matcher.reset(); // Reset the matcher for processing
while (matcher.find()) {
String fullImgTag = matcher.group(0);
String imageUrl = matcher.group(1);
logger.debug("Found image: {} in tag: {}", imageUrl, fullImgTag);
try {
// Skip if it's already a local path or data URL
if (imageUrl.startsWith("/") || imageUrl.startsWith("data:")) {
logger.debug("Skipping local/data URL: {}", imageUrl);
matcher.appendReplacement(processedContent, Matcher.quoteReplacement(fullImgTag));
continue;
}
// Call progress callback
if (progressCallback != null) {
progressCallback.onProgress(imageUrl, processedCount, totalExternalImages);
}
logger.debug("Processing external image #{}: {}", processedCount + 1, imageUrl);
// Download and store the image
String localPath = downloadImageFromUrl(imageUrl, storyId);
downloadedImages.add(localPath);
// Generate local URL
String localUrl = getLocalImageUrl(storyId, localPath);
logger.debug("Downloaded image: {} -> {}", imageUrl, localUrl);
// Replace the src attribute with the local path
String newImgTag = fullImgTag
.replaceFirst("src=\"" + Pattern.quote(imageUrl) + "\"", "src=\"" + localUrl + "\"")
.replaceFirst("src='" + Pattern.quote(imageUrl) + "'", "src='" + localUrl + "'");
matcher.appendReplacement(processedContent, Matcher.quoteReplacement(newImgTag));
processedCount++;
} catch (Exception e) {
logger.warn("Failed to download image: {} - Error: {}", imageUrl, e.getMessage());
warnings.add("Failed to download image: " + imageUrl + " - " + e.getMessage());
matcher.appendReplacement(processedContent, Matcher.quoteReplacement(fullImgTag));
}
}
matcher.appendTail(processedContent);
logger.info("Processed {} external images for story: {} (Total: {}, Downloaded: {}, Warnings: {})",
processedCount, storyId, totalExternalImages, downloadedImages.size(), warnings.size());
return new ContentImageProcessingResult(processedContent.toString(), warnings, downloadedImages);
}
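The method above makes two regex passes: one to count external images for progress reporting, one to rewrite the tags. The counting pass can be sketched on its own (`ImgScanSketch` is an illustrative helper, not a method of ImageService):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ImgScanSketch {
    // Same img/src pattern as used above.
    static final Pattern IMG =
            Pattern.compile("<img[^>]+src=[\"']([^\"']+)[\"'][^>]*>", Pattern.CASE_INSENSITIVE);

    // Count only srcs that are neither local paths nor data URLs.
    static int countExternal(String html) {
        Matcher m = IMG.matcher(html);
        int n = 0;
        while (m.find()) {
            String src = m.group(1);
            if (!src.startsWith("/") && !src.startsWith("data:")) n++;
        }
        return n;
    }

    public static void main(String[] args) {
        String html = "<img src=\"https://example.com/a.jpg\">"
                + "<img src='/api/files/images/x/b.png'>"
                + "<img src=\"data:image/png;base64,AAAA\">";
        System.out.println(countExternal(html)); // 1
    }
}
```

Pre-counting is what lets the progress callback report a meaningful `totalCount` before the first download starts.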
/**
* Download an image from a URL and store it locally
*/
@@ -388,7 +493,7 @@ public class ImageService {
return "/api/files/images/default/" + imagePath;
}
String localUrl = "/api/files/images/" + currentLibraryId + "/" + imagePath;
logger.debug("Generated local image URL: {} for story: {}", localUrl, storyId);
return localUrl;
}
@@ -437,25 +542,26 @@ public class ImageService {
int foldersToDelete = 0;
// Step 1: Collect all image references from all story content
logger.debug("Scanning all story content for image references...");
referencedImages = collectAllImageReferences();
logger.debug("Found {} unique image references in story content", referencedImages.size());
try {
// Step 2: Scan the content images directory
Path contentImagesDir = Paths.get(getUploadDir(), ImageType.CONTENT.getDirectory());
if (!Files.exists(contentImagesDir)) {
logger.debug("Content images directory does not exist: {}", contentImagesDir);
return new ContentImageCleanupResult(orphanedImages, 0, 0, referencedImages.size(), errors, dryRun);
}
logger.debug("Scanning content images directory: {}", contentImagesDir);
// Walk through all story directories
Files.walk(contentImagesDir, 2)
.filter(Files::isDirectory)
.filter(path -> !path.equals(contentImagesDir)) // Skip the root content directory
.filter(path -> !isSynologySystemPath(path)) // Skip Synology system directories
.forEach(storyDir -> {
try {
String storyId = storyDir.getFileName().toString();
@@ -465,11 +571,13 @@ public class ImageService {
boolean storyExists = storyService.findByIdOptional(UUID.fromString(storyId)).isPresent();
if (!storyExists) {
logger.debug("Found orphaned story directory (story deleted): {}", storyId);
// Mark entire directory for deletion
try {
Files.walk(storyDir)
.filter(Files::isRegularFile)
.filter(path -> !isSynologySystemPath(path)) // Skip Synology system files
.filter(path -> isValidImageFile(path)) // Only process actual image files
.forEach(file -> {
try {
long size = Files.size(file);
@@ -489,13 +597,18 @@ public class ImageService {
try {
Files.walk(storyDir)
.filter(Files::isRegularFile)
.filter(path -> !isSynologySystemPath(path)) // Skip Synology system files
.filter(path -> isValidImageFile(path)) // Only process actual image files
.forEach(imageFile -> {
try {
String filename = imageFile.getFileName().toString();
// Only consider it orphaned if it's not in our referenced filenames
if (!referencedImages.contains(filename)) {
logger.debug("Found orphaned image: {}", filename);
orphanedImages.add(imageFile.toString());
} else {
logger.debug("Image file is referenced, keeping: {}", filename);
}
} catch (Exception e) {
errors.add("Error checking image file " + imageFile + ": " + e.getMessage());
@@ -535,7 +648,7 @@ public class ImageService {
// Step 3: Delete orphaned files if not dry run
if (!dryRun && !orphanedImages.isEmpty()) {
logger.debug("Deleting {} orphaned images...", orphanedImages.size());
Set<Path> directoriesToCheck = new HashSet<>();
@@ -557,7 +670,7 @@ public class ImageService {
try {
if (Files.exists(dir) && isDirEmpty(dir)) {
Files.delete(dir);
logger.debug("Deleted empty story directory: {}", dir);
}
} catch (IOException e) {
errors.add("Failed to delete empty directory " + dir + ": " + e.getMessage());
@@ -577,10 +690,10 @@ public class ImageService {
}
/**
* Collect all image filenames referenced in content (UUID-based filenames only)
*/
private Set<String> collectAllImageReferences() {
Set<String> referencedFilenames = new HashSet<>();
try {
// Get all stories
@@ -590,27 +703,70 @@ public class ImageService {
Pattern imagePattern = Pattern.compile("src=[\"']([^\"']*(?:content/[^\"']*\\.(jpg|jpeg|png)))[\"']", Pattern.CASE_INSENSITIVE);
for (com.storycove.entity.Story story : allStories) {
// Add story cover image filename if present
if (story.getCoverPath() != null && !story.getCoverPath().trim().isEmpty()) {
String filename = extractFilename(story.getCoverPath());
if (filename != null) {
referencedFilenames.add(filename);
logger.debug("Found cover image filename in story {}: {}", story.getId(), filename);
}
}
// Add author avatar image filename if present
if (story.getAuthor() != null && story.getAuthor().getAvatarImagePath() != null && !story.getAuthor().getAvatarImagePath().trim().isEmpty()) {
String filename = extractFilename(story.getAuthor().getAvatarImagePath());
if (filename != null) {
referencedFilenames.add(filename);
logger.debug("Found avatar image filename for author {}: {}", story.getAuthor().getId(), filename);
}
}
// Add content images from HTML
if (story.getContentHtml() != null) {
Matcher matcher = imagePattern.matcher(story.getContentHtml());
while (matcher.find()) {
String imageSrc = matcher.group(1);
// Extract just the filename from the URL
String filename = extractFilename(imageSrc);
if (filename != null && isUuidBasedFilename(filename)) {
referencedFilenames.add(filename);
logger.debug("Found content image filename in story {}: {}", story.getId(), filename);
}
}
}
}
// Also get all authors separately to catch avatars for authors without stories
List<com.storycove.entity.Author> allAuthors = authorService.findAll();
for (com.storycove.entity.Author author : allAuthors) {
if (author.getAvatarImagePath() != null && !author.getAvatarImagePath().trim().isEmpty()) {
String filename = extractFilename(author.getAvatarImagePath());
if (filename != null) {
referencedFilenames.add(filename);
logger.debug("Found standalone avatar image filename for author {}: {}", author.getId(), filename);
}
}
}
// Also get all collections to catch cover images
List<com.storycove.entity.Collection> allCollections = collectionService.findAllWithTags();
for (com.storycove.entity.Collection collection : allCollections) {
if (collection.getCoverImagePath() != null && !collection.getCoverImagePath().trim().isEmpty()) {
String filename = extractFilename(collection.getCoverImagePath());
if (filename != null) {
referencedFilenames.add(filename);
logger.debug("Found collection cover image filename for collection {}: {}", collection.getId(), filename);
}
}
}
} catch (Exception e) {
logger.error("Error collecting image references from stories", e);
}
return referencedFilenames;
}
/**
@@ -629,6 +785,64 @@ public class ImageService {
return null;
}
/**
* Convert absolute file path to relative path from upload directory
*/
private String convertAbsolutePathToRelative(String absolutePath) {
try {
if (absolutePath == null || absolutePath.trim().isEmpty()) {
return null;
}
Path absPath = Paths.get(absolutePath);
Path uploadDirPath = Paths.get(getUploadDir());
// If the path is already relative to upload dir, return as-is
if (!absPath.isAbsolute()) {
return absolutePath.replace('\\', '/');
}
// Try to make it relative to the upload directory
if (absPath.startsWith(uploadDirPath)) {
Path relativePath = uploadDirPath.relativize(absPath);
return relativePath.toString().replace('\\', '/');
}
// If it's not under upload directory, check if it's library-specific path
String libraryPath = libraryService.getCurrentImagePath();
Path baseUploadPath = Paths.get(baseUploadDir);
if (absPath.startsWith(baseUploadPath)) {
Path relativePath = baseUploadPath.relativize(absPath);
String relativeStr = relativePath.toString().replace('\\', '/');
// Remove library prefix if present to make it library-agnostic for comparison
if (relativeStr.startsWith(libraryPath.substring(1))) { // Remove leading slash from library path
return relativeStr.substring(libraryPath.length() - 1); // Keep the leading slash
}
return relativeStr;
}
// Fallback: just use the filename portion if it's in the right structure
String fileName = absPath.getFileName().toString();
if (fileName.matches(".*\\.(jpg|jpeg|png)$")) {
// Try to preserve directory structure if it looks like covers/ or avatars/
Path parent = absPath.getParent();
if (parent != null) {
String parentName = parent.getFileName().toString();
if (parentName.equals("covers") || parentName.equals("avatars")) {
return parentName + "/" + fileName;
}
}
return fileName;
}
} catch (Exception e) {
logger.debug("Failed to convert absolute path to relative: {}", absolutePath, e);
}
return null;
}
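The core of the path normalization above is `java.nio.file` prefix-checking and relativization. A minimal illustration under assumed directory names (`RelativePathSketch` and the `/data/uploads` base are hypothetical):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class RelativePathSketch {
    // Make an absolute path relative to a base upload directory, using '/' separators.
    static String relativize(String base, String absolute) {
        Path basePath = Paths.get(base);
        Path absPath = Paths.get(absolute);
        if (absPath.startsWith(basePath)) {
            return basePath.relativize(absPath).toString().replace('\\', '/');
        }
        return null; // not under the upload directory
    }

    public static void main(String[] args) {
        System.out.println(relativize("/data/uploads", "/data/uploads/covers/a.jpg")); // covers/a.jpg
    }
}
```

`Path.startsWith` compares whole path segments, so `/data/uploads2/a.jpg` does not falsely match a `/data/uploads` base the way a plain `String.startsWith` would.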
/**
* Get relative image path from absolute file path
*/
@@ -741,4 +955,115 @@ public class ImageService {
return String.format("%.1f GB", totalSizeBytes / (1024.0 * 1024.0 * 1024.0));
}
}
/**
* Check if a path is a Synology system path that should be ignored
*/
private boolean isSynologySystemPath(Path path) {
String pathStr = path.toString();
String fileName = path.getFileName().toString();
// Skip Synology metadata directories and files
return pathStr.contains("@eaDir") ||
fileName.startsWith("@") ||
fileName.contains("@SynoEAStream") ||
fileName.startsWith(".") ||
fileName.equals("Thumbs.db") ||
fileName.equals(".DS_Store");
}
/**
* Check if a file is a valid image file (not a system/metadata file)
*/
private boolean isValidImageFile(Path path) {
if (isSynologySystemPath(path)) {
return false;
}
String fileName = path.getFileName().toString().toLowerCase();
return fileName.endsWith(".jpg") ||
fileName.endsWith(".jpeg") ||
fileName.endsWith(".png") ||
fileName.endsWith(".gif") ||
fileName.endsWith(".webp");
}
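The two filters above can be restated over plain strings; this condensed sketch (`SystemPathFilterSketch` is hypothetical, the real methods operate on `java.nio.file.Path`) mirrors the same checks:

```java
public class SystemPathFilterSketch {
    // Same checks as isSynologySystemPath, applied to a path string and its last segment.
    static boolean isSystemPath(String pathStr) {
        String fileName = pathStr.substring(pathStr.lastIndexOf('/') + 1);
        return pathStr.contains("@eaDir")
                || fileName.startsWith("@")
                || fileName.contains("@SynoEAStream")
                || fileName.startsWith(".")       // also covers .DS_Store
                || fileName.equals("Thumbs.db");
    }

    public static void main(String[] args) {
        System.out.println(isSystemPath("/images/content/@eaDir/a.jpg")); // true
        System.out.println(isSystemPath("/images/content/story1/a.jpg")); // false
    }
}
```

Filtering these paths out of `Files.walk` is what stops Synology thumbnail sidecars from being counted as orphaned images and deleted.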
/**
* Extract filename from a path or URL
*/
private String extractFilename(String pathOrUrl) {
if (pathOrUrl == null || pathOrUrl.trim().isEmpty()) {
return null;
}
try {
// Remove query parameters if present
if (pathOrUrl.contains("?")) {
pathOrUrl = pathOrUrl.substring(0, pathOrUrl.indexOf("?"));
}
// Get the last part after slash
String filename = pathOrUrl.substring(pathOrUrl.lastIndexOf("/") + 1);
// Remove any special Synology suffixes
filename = filename.replace("@SynoEAStream", "");
return filename.trim().isEmpty() ? null : filename;
} catch (Exception e) {
logger.debug("Failed to extract filename from: {}", pathOrUrl);
return null;
}
}
/**
* Check if a filename follows UUID pattern (indicates it's our generated file)
*/
private boolean isUuidBasedFilename(String filename) {
if (filename == null || filename.trim().isEmpty()) {
return false;
}
// Remove extension
String nameWithoutExt = filename;
int lastDot = filename.lastIndexOf(".");
if (lastDot > 0) {
nameWithoutExt = filename.substring(0, lastDot);
}
// Check if it matches UUID pattern (8-4-4-4-12 hex characters)
return nameWithoutExt.matches("[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}");
}
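The UUID check above strips the extension and matches the canonical 8-4-4-4-12 hex layout. A standalone sketch of the same logic (`UuidFilenameSketch` is illustrative):

```java
public class UuidFilenameSketch {
    // Strip the extension, then test the 8-4-4-4-12 hex pattern used above.
    static boolean isUuidBasedFilename(String filename) {
        if (filename == null || filename.trim().isEmpty()) {
            return false;
        }
        int dot = filename.lastIndexOf('.');
        String name = dot > 0 ? filename.substring(0, dot) : filename;
        return name.matches("[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}");
    }

    public static void main(String[] args) {
        System.out.println(isUuidBasedFilename("3f2504e0-4f89-41d3-9a0c-0305e82c3301.png")); // true
        System.out.println(isUuidBasedFilename("cover.png"));                                 // false
    }
}
```

Restricting cleanup to UUID-named files is a safety net: only files the application itself generated can ever be treated as orphans.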
/**
* Event listener for story content updates - processes external images asynchronously
*/
@EventListener
@Async
public void handleStoryContentUpdated(StoryContentUpdatedEvent event) {
logger.info("Processing images for {} story {} after content update",
event.isNewStory() ? "new" : "updated", event.getStoryId());
try {
ContentImageProcessingResult result = processContentImages(event.getContentHtml(), event.getStoryId());
// If content was changed, we need to update the story (but this could cause circular events)
// Instead, let's just log the results for now and let the controller handle updates if needed
if (result.hasWarnings()) {
logger.warn("Image processing warnings for story {}: {}", event.getStoryId(), result.getWarnings());
}
if (!result.getDownloadedImages().isEmpty()) {
logger.info("Downloaded {} external images for story {}: {}",
result.getDownloadedImages().size(), event.getStoryId(), result.getDownloadedImages());
}
// TODO: If content was changed, we might need a way to update the story without triggering another event
if (!result.getProcessedContent().equals(event.getContentHtml())) {
logger.info("Story {} content was processed and external images were replaced with local URLs", event.getStoryId());
// For now, just log that processing occurred - the original content processing already handles updates
}
} catch (Exception e) {
logger.error("Failed to process images for story {}: {}", event.getStoryId(), e.getMessage(), e);
}
}
}

View File

@@ -115,7 +115,7 @@ public class LibraryService implements ApplicationContextAware {
/** /**
* Switch to library after authentication with forced reindexing * Switch to library after authentication with forced reindexing
* This ensures OpenSearch is always up-to-date after login * This ensures Solr is always up-to-date after login
*/ */
public synchronized void switchToLibraryAfterAuthentication(String libraryId) throws Exception { public synchronized void switchToLibraryAfterAuthentication(String libraryId) throws Exception {
logger.info("Switching to library after authentication: {} (forcing reindex)", libraryId); logger.info("Switching to library after authentication: {} (forcing reindex)", libraryId);
@@ -144,9 +144,9 @@ public class LibraryService implements ApplicationContextAware {
String previousLibraryId = currentLibraryId; String previousLibraryId = currentLibraryId;
if (libraryId.equals(currentLibraryId) && forceReindex) { if (libraryId.equals(currentLibraryId) && forceReindex) {
logger.info("Forcing reindex for current library: {} ({})", library.getName(), libraryId); logger.debug("Forcing reindex for current library: {} ({})", library.getName(), libraryId);
} else { } else {
logger.info("Switching to library: {} ({})", library.getName(), libraryId); logger.debug("Switching to library: {} ({})", library.getName(), libraryId);
} }
// Close current resources // Close current resources
@@ -154,15 +154,15 @@ public class LibraryService implements ApplicationContextAware {
// Set new active library (datasource routing handled by SmartRoutingDataSource) // Set new active library (datasource routing handled by SmartRoutingDataSource)
currentLibraryId = libraryId; currentLibraryId = libraryId;
// OpenSearch indexes are global - no per-library initialization needed // Solr indexes are global - no per-library initialization needed
logger.info("Library switched to OpenSearch mode for library: {}", libraryId); logger.debug("Library switched to Solr mode for library: {}", libraryId);
logger.info("Successfully switched to library: {}", library.getName()); logger.info("Successfully switched to library: {}", library.getName());
// Perform complete reindex AFTER library switch is fully complete // Perform complete reindex AFTER library switch is fully complete
// This ensures database routing is properly established // This ensures database routing is properly established
if (forceReindex || !libraryId.equals(previousLibraryId)) { if (forceReindex || !libraryId.equals(previousLibraryId)) {
logger.info("Starting post-switch OpenSearch reindex for library: {}", libraryId); logger.debug("Starting post-switch Solr reindex for library: {}", libraryId);
// Run reindex asynchronously to avoid blocking authentication response // Run reindex asynchronously to avoid blocking authentication response
// and allow time for database routing to fully stabilize // and allow time for database routing to fully stabilize
@@ -171,7 +171,7 @@ public class LibraryService implements ApplicationContextAware {
try { try {
// Give routing time to stabilize // Give routing time to stabilize
Thread.sleep(500); Thread.sleep(500);
logger.info("Starting async OpenSearch reindex for library: {}", finalLibraryId); logger.debug("Starting async Solr reindex for library: {}", finalLibraryId);
SearchServiceAdapter searchService = applicationContext.getBean(SearchServiceAdapter.class); SearchServiceAdapter searchService = applicationContext.getBean(SearchServiceAdapter.class);
// Get all stories and authors for reindexing // Get all stories and authors for reindexing
@@ -184,12 +184,12 @@ public class LibraryService implements ApplicationContextAware {
searchService.bulkIndexStories(allStories);
searchService.bulkIndexAuthors(allAuthors);
-logger.info("Completed async OpenSearch reindexing for library: {} ({} stories, {} authors)",
+logger.info("Completed async Solr reindexing for library: {} ({} stories, {} authors)",
finalLibraryId, allStories.size(), allAuthors.size());
} catch (Exception e) {
-logger.warn("Failed to async reindex OpenSearch for library {}: {}", finalLibraryId, e.getMessage());
+logger.warn("Failed to async reindex Solr for library {}: {}", finalLibraryId, e.getMessage());
}
-}, "OpenSearchReindex-" + libraryId).start();
+}, "SolrReindex-" + libraryId).start();
}
}
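The fire-and-forget reindex pattern in this hunk (spawn a named thread, sleep briefly so database routing can settle, bulk reindex, and only log on failure) can be sketched in isolation. This is a simplified illustration, not the project's code: `bulkIndex` stands in for `SearchServiceAdapter.bulkIndexStories`, and the latch exists only so the sketch can observe completion.

```java
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

/** Minimal sketch of the post-switch async reindex thread. */
public class ReindexSketch {
    static AtomicInteger indexed = new AtomicInteger();

    // Stand-in for SearchServiceAdapter.bulkIndexStories(...)
    static void bulkIndex(List<String> stories) {
        indexed.addAndGet(stories.size());
    }

    static Thread startAsyncReindex(String libraryId, List<String> stories, CountDownLatch done) {
        Thread t = new Thread(() -> {
            try {
                Thread.sleep(50); // give routing time to stabilize (500 ms in the real code)
                bulkIndex(stories);
            } catch (Exception e) {
                System.err.println("reindex failed: " + e.getMessage()); // real code logs a warning
            } finally {
                done.countDown();
            }
        }, "SolrReindex-" + libraryId); // named thread, as in the diff above
        t.start();
        return t;
    }

    public static void main(String[] args) throws Exception {
        CountDownLatch done = new CountDownLatch(1);
        startAsyncReindex("lib-1", List.of("a", "b", "c"), done);
        // The real caller returns immediately; we wait here only to observe the result.
        done.await(5, TimeUnit.SECONDS);
        System.out.println(indexed.get());
    }
}
```

The key design point is that the authentication response is never blocked on reindexing; failures surface only in the logs.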
@@ -342,10 +342,10 @@ public class LibraryService implements ApplicationContextAware {
library.setInitialized((Boolean) data.getOrDefault("initialized", false));
libraries.put(id, library);
-logger.info("Loaded library: {} ({})", library.getName(), id);
+logger.debug("Loaded library: {} ({})", library.getName(), id);
}
} else {
-logger.info("No libraries configuration file found, will create default");
+logger.debug("No libraries configuration file found, will create default");
}
} catch (IOException e) {
logger.error("Failed to load libraries configuration", e);
@@ -411,7 +411,7 @@ public class LibraryService implements ApplicationContextAware {
String json = objectMapper.writerWithDefaultPrettyPrinter().writeValueAsString(config);
Files.writeString(Paths.get(LIBRARIES_CONFIG_PATH), json);
-logger.info("Saved libraries configuration");
+logger.debug("Saved libraries configuration");
} catch (IOException e) {
logger.error("Failed to save libraries configuration", e);
}
@@ -419,7 +419,7 @@ public class LibraryService implements ApplicationContextAware {
private DataSource createDataSource(String dbName) {
String url = baseDbUrl.replaceAll("/[^/]*$", "/" + dbName);
-logger.info("Creating DataSource for: {}", url);
+logger.debug("Creating DataSource for: {}", url);
// First, ensure the database exists
ensureDatabaseExists(dbName);
@@ -459,7 +459,7 @@ public class LibraryService implements ApplicationContextAware {
preparedStatement.setString(1, dbName);
try (var resultSet = preparedStatement.executeQuery()) {
if (resultSet.next()) {
-logger.info("Database {} already exists", dbName);
+logger.debug("Database {} already exists", dbName);
return; // Database exists, nothing to do
}
}
@@ -488,7 +488,7 @@ public class LibraryService implements ApplicationContextAware {
}
private void initializeNewDatabaseSchema(String dbName) {
-logger.info("Initializing schema for new database: {}", dbName);
+logger.debug("Initializing schema for new database: {}", dbName);
// Create a temporary DataSource for the new database to initialize schema
String newDbUrl = baseDbUrl.replaceAll("/[^/]*$", "/" + dbName);
@@ -505,7 +505,7 @@ public class LibraryService implements ApplicationContextAware {
// Use Hibernate to create the schema
// This mimics what Spring Boot does during startup
createSchemaUsingHibernate(tempDataSource);
-logger.info("Schema initialized for database: {}", dbName);
+logger.debug("Schema initialized for database: {}", dbName);
} catch (Exception e) {
logger.error("Failed to initialize schema for database {}: {}", dbName, e.getMessage());
@@ -520,15 +520,15 @@ public class LibraryService implements ApplicationContextAware {
}
try {
-logger.info("Initializing resources for new library: {}", library.getName());
+logger.debug("Initializing resources for new library: {}", library.getName());
// 1. Create image directory structure
initializeImageDirectories(library);
-// 2. OpenSearch indexes are global and managed automatically
+// 2. Solr indexes are global and managed automatically
-// No per-library initialization needed for OpenSearch
+// No per-library initialization needed for Solr
-logger.info("Successfully initialized resources for library: {}", library.getName());
+logger.debug("Successfully initialized resources for library: {}", library.getName());
} catch (Exception e) {
logger.error("Failed to initialize resources for library {}: {}", libraryId, e.getMessage());
@@ -544,16 +544,16 @@ public class LibraryService implements ApplicationContextAware {
if (!java.nio.file.Files.exists(libraryImagePath)) {
java.nio.file.Files.createDirectories(libraryImagePath);
-logger.info("Created image directory: {}", imagePath);
+logger.debug("Created image directory: {}", imagePath);
// Create subdirectories for different image types
java.nio.file.Files.createDirectories(libraryImagePath.resolve("stories"));
java.nio.file.Files.createDirectories(libraryImagePath.resolve("authors"));
java.nio.file.Files.createDirectories(libraryImagePath.resolve("collections"));
-logger.info("Created image subdirectories for library: {}", library.getId());
+logger.debug("Created image subdirectories for library: {}", library.getId());
} else {
-logger.info("Image directory already exists: {}", imagePath);
+logger.debug("Image directory already exists: {}", imagePath);
}
} catch (Exception e) {
@@ -749,7 +749,7 @@ public class LibraryService implements ApplicationContextAware {
statement.executeUpdate(sql);
}
-logger.info("Successfully created all database tables and constraints");
+logger.debug("Successfully created all database tables and constraints");
} catch (SQLException e) {
logger.error("Failed to create database schema", e);
@@ -760,7 +760,7 @@ public class LibraryService implements ApplicationContextAware {
private void closeCurrentResources() {
// No need to close datasource - SmartRoutingDataSource handles this
-// OpenSearch service is managed by Spring - no explicit cleanup needed
+// Solr service is managed by Spring - no explicit cleanup needed
// Don't clear currentLibraryId here - only when explicitly switching
}


@@ -1,133 +0,0 @@
package com.storycove.service;
import com.storycove.config.OpenSearchProperties;
import org.opensearch.client.opensearch.OpenSearchClient;
import org.opensearch.client.opensearch.cluster.HealthRequest;
import org.opensearch.client.opensearch.cluster.HealthResponse;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.actuate.health.Health;
import org.springframework.boot.actuate.health.HealthIndicator;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Service;
import java.time.LocalDateTime;
import java.util.concurrent.atomic.AtomicReference;
@Service
@ConditionalOnProperty(name = "storycove.search.engine", havingValue = "opensearch")
public class OpenSearchHealthService implements HealthIndicator {
private static final Logger logger = LoggerFactory.getLogger(OpenSearchHealthService.class);
private final OpenSearchClient openSearchClient;
private final OpenSearchProperties properties;
private final AtomicReference<Health> lastKnownHealth = new AtomicReference<>(Health.unknown().build());
private LocalDateTime lastCheckTime = LocalDateTime.now();
@Autowired
public OpenSearchHealthService(OpenSearchClient openSearchClient, OpenSearchProperties properties) {
this.openSearchClient = openSearchClient;
this.properties = properties;
}
@Override
public Health health() {
return lastKnownHealth.get();
}
@Scheduled(fixedDelayString = "#{@openSearchProperties.health.checkInterval}")
public void performHealthCheck() {
try {
HealthResponse clusterHealth = openSearchClient.cluster().health(
HealthRequest.of(h -> h.timeout(t -> t.time("10s")))
);
Health.Builder healthBuilder = Health.up()
.withDetail("cluster_name", clusterHealth.clusterName())
.withDetail("status", clusterHealth.status().jsonValue())
.withDetail("number_of_nodes", clusterHealth.numberOfNodes())
.withDetail("number_of_data_nodes", clusterHealth.numberOfDataNodes())
.withDetail("active_primary_shards", clusterHealth.activePrimaryShards())
.withDetail("active_shards", clusterHealth.activeShards())
.withDetail("relocating_shards", clusterHealth.relocatingShards())
.withDetail("initializing_shards", clusterHealth.initializingShards())
.withDetail("unassigned_shards", clusterHealth.unassignedShards())
.withDetail("last_check", LocalDateTime.now());
// Check if cluster status is concerning
switch (clusterHealth.status()) {
case Red:
healthBuilder = Health.down()
.withDetail("reason", "Cluster status is RED - some primary shards are unassigned");
break;
case Yellow:
if (isProduction()) {
healthBuilder = Health.down()
.withDetail("reason", "Cluster status is YELLOW - some replica shards are unassigned (critical in production)");
} else {
// Yellow is acceptable in development (single node clusters)
healthBuilder.withDetail("warning", "Cluster status is YELLOW - acceptable for development");
}
break;
case Green:
// All good
break;
}
lastKnownHealth.set(healthBuilder.build());
lastCheckTime = LocalDateTime.now();
if (properties.getHealth().isEnableMetrics()) {
logMetrics(clusterHealth);
}
} catch (Exception e) {
logger.error("OpenSearch health check failed", e);
Health unhealthyStatus = Health.down()
.withDetail("error", e.getMessage())
.withDetail("last_successful_check", lastCheckTime)
.withDetail("current_time", LocalDateTime.now())
.build();
lastKnownHealth.set(unhealthyStatus);
}
}
private void logMetrics(HealthResponse clusterHealth) {
logger.info("OpenSearch Cluster Metrics - Status: {}, Nodes: {}, Active Shards: {}, Unassigned: {}",
clusterHealth.status().jsonValue(),
clusterHealth.numberOfNodes(),
clusterHealth.activeShards(),
clusterHealth.unassignedShards());
}
private boolean isProduction() {
return "production".equalsIgnoreCase(properties.getProfile());
}
/**
* Manual health check for immediate status
*/
public boolean isClusterHealthy() {
Health currentHealth = lastKnownHealth.get();
return currentHealth.getStatus() == org.springframework.boot.actuate.health.Status.UP;
}
/**
* Get detailed cluster information
*/
public String getClusterInfo() {
try {
var info = openSearchClient.info();
return String.format("OpenSearch %s (Cluster: %s, Lucene: %s)",
info.version().number(),
info.clusterName(),
info.version().luceneVersion());
} catch (Exception e) {
return "Unable to retrieve cluster information: " + e.getMessage();
}
}
}


@@ -16,7 +16,7 @@ import java.util.UUID;
/**
 * Service adapter that provides a unified interface for search operations.
 *
- * This adapter directly delegates to OpenSearchService.
+ * This adapter directly delegates to SolrService.
 */
@Service
public class SearchServiceAdapter {
@@ -24,7 +24,7 @@ public class SearchServiceAdapter {
private static final Logger logger = LoggerFactory.getLogger(SearchServiceAdapter.class);
@Autowired
-private OpenSearchService openSearchService;
+private SolrService solrService;
// ===============================
// SEARCH OPERATIONS
@@ -46,11 +46,20 @@ public class SearchServiceAdapter {
String sourceDomain, String seriesFilter,
Integer minTagCount, Boolean popularOnly,
Boolean hiddenGemsOnly) {
-return openSearchService.searchStories(query, tags, author, series, minWordCount, maxWordCount,
-minRating, isRead, isFavorite, sortBy, sortOrder, page, size, facetBy,
-createdAfter, createdBefore, lastReadAfter, lastReadBefore, unratedOnly, readingStatus,
-hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter, minTagCount, popularOnly,
-hiddenGemsOnly);
+logger.info("SearchServiceAdapter: delegating search to SolrService");
+try {
+SearchResultDto<StorySearchDto> result = solrService.searchStories(query, tags, author, series, minWordCount, maxWordCount,
+minRating, isRead, isFavorite, sortBy, sortOrder, page, size, facetBy,
+createdAfter, createdBefore, lastReadAfter, lastReadBefore, unratedOnly, readingStatus,
+hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter, minTagCount, popularOnly,
+hiddenGemsOnly);
+logger.info("SearchServiceAdapter: received result with {} stories and {} facets",
+result.getResults().size(), result.getFacets().size());
+return result;
+} catch (Exception e) {
+logger.error("SearchServiceAdapter: error during search", e);
+throw e;
+}
}
/**
@@ -60,7 +69,7 @@ public class SearchServiceAdapter {
String series, Integer minWordCount, Integer maxWordCount,
Float minRating, Boolean isRead, Boolean isFavorite,
Long seed) {
-return openSearchService.getRandomStories(count, tags, author, series, minWordCount, maxWordCount,
+return solrService.getRandomStories(count, tags, author, series, minWordCount, maxWordCount,
minRating, isRead, isFavorite, seed);
}
@@ -69,7 +78,7 @@ public class SearchServiceAdapter {
 */
public void recreateIndices() {
try {
-openSearchService.recreateIndices();
+solrService.recreateIndices();
} catch (Exception e) {
logger.error("Failed to recreate search indices", e);
throw new RuntimeException("Failed to recreate search indices", e);
@@ -93,21 +102,21 @@ public class SearchServiceAdapter {
 * Get random story ID with unified interface
 */
public String getRandomStoryId(Long seed) {
-return openSearchService.getRandomStoryId(seed);
+return solrService.getRandomStoryId(seed);
}
/**
 * Search authors with unified interface
 */
public List<AuthorSearchDto> searchAuthors(String query, int limit) {
-return openSearchService.searchAuthors(query, limit);
+return solrService.searchAuthors(query, limit);
}
/**
 * Get tag suggestions with unified interface
 */
public List<String> getTagSuggestions(String query, int limit) {
-return openSearchService.getTagSuggestions(query, limit);
+return solrService.getTagSuggestions(query, limit);
}
// ===============================
@@ -115,88 +124,88 @@ public class SearchServiceAdapter {
// ===============================
/**
- * Index a story in OpenSearch
+ * Index a story in Solr
 */
public void indexStory(Story story) {
try {
-openSearchService.indexStory(story);
+solrService.indexStory(story);
} catch (Exception e) {
logger.error("Failed to index story {}", story.getId(), e);
}
}
/**
- * Update a story in OpenSearch
+ * Update a story in Solr
 */
public void updateStory(Story story) {
try {
-openSearchService.updateStory(story);
+solrService.updateStory(story);
} catch (Exception e) {
logger.error("Failed to update story {}", story.getId(), e);
}
}
/**
- * Delete a story from OpenSearch
+ * Delete a story from Solr
 */
public void deleteStory(UUID storyId) {
try {
-openSearchService.deleteStory(storyId);
+solrService.deleteStory(storyId);
} catch (Exception e) {
logger.error("Failed to delete story {}", storyId, e);
}
}
/**
- * Index an author in OpenSearch
+ * Index an author in Solr
 */
public void indexAuthor(Author author) {
try {
-openSearchService.indexAuthor(author);
+solrService.indexAuthor(author);
} catch (Exception e) {
logger.error("Failed to index author {}", author.getId(), e);
}
}
/**
- * Update an author in OpenSearch
+ * Update an author in Solr
 */
public void updateAuthor(Author author) {
try {
-openSearchService.updateAuthor(author);
+solrService.updateAuthor(author);
} catch (Exception e) {
logger.error("Failed to update author {}", author.getId(), e);
}
}
/**
- * Delete an author from OpenSearch
+ * Delete an author from Solr
 */
public void deleteAuthor(UUID authorId) {
try {
-openSearchService.deleteAuthor(authorId);
+solrService.deleteAuthor(authorId);
} catch (Exception e) {
logger.error("Failed to delete author {}", authorId, e);
}
}
/**
- * Bulk index stories in OpenSearch
+ * Bulk index stories in Solr
 */
public void bulkIndexStories(List<Story> stories) {
try {
-openSearchService.bulkIndexStories(stories);
+solrService.bulkIndexStories(stories);
} catch (Exception e) {
logger.error("Failed to bulk index {} stories", stories.size(), e);
}
}
/**
- * Bulk index authors in OpenSearch
+ * Bulk index authors in Solr
 */
public void bulkIndexAuthors(List<Author> authors) {
try {
-openSearchService.bulkIndexAuthors(authors);
+solrService.bulkIndexAuthors(authors);
} catch (Exception e) {
logger.error("Failed to bulk index {} authors", authors.size(), e);
}
@@ -210,14 +219,14 @@ public class SearchServiceAdapter {
 * Check if search service is available and healthy
 */
public boolean isSearchServiceAvailable() {
-return openSearchService.testConnection();
+return solrService.testConnection();
}
/**
 * Get current search engine name
 */
public String getCurrentSearchEngine() {
-return "opensearch";
+return "solr";
}
/**
@@ -228,10 +237,10 @@ public class SearchServiceAdapter {
}
/**
- * Check if we can switch to OpenSearch
+ * Check if we can switch to Solr
 */
-public boolean canSwitchToOpenSearch() {
+public boolean canSwitchToSolr() {
-return true; // Already using OpenSearch
+return true; // Already using Solr
}
/**
@@ -246,10 +255,10 @@ public class SearchServiceAdapter {
 */
public SearchStatus getSearchStatus() {
return new SearchStatus(
-"opensearch",
+"solr",
false, // no dual-write
false, // no typesense
-openSearchService.testConnection()
+solrService.testConnection()
);
}
@@ -260,19 +269,19 @@ public class SearchServiceAdapter {
private final String primaryEngine;
private final boolean dualWrite;
private final boolean typesenseAvailable;
-private final boolean openSearchAvailable;
+private final boolean solrAvailable;
public SearchStatus(String primaryEngine, boolean dualWrite,
-boolean typesenseAvailable, boolean openSearchAvailable) {
+boolean typesenseAvailable, boolean solrAvailable) {
this.primaryEngine = primaryEngine;
this.dualWrite = dualWrite;
this.typesenseAvailable = typesenseAvailable;
-this.openSearchAvailable = openSearchAvailable;
+this.solrAvailable = solrAvailable;
}
public String getPrimaryEngine() { return primaryEngine; }
public boolean isDualWrite() { return dualWrite; }
public boolean isTypesenseAvailable() { return typesenseAvailable; }
-public boolean isOpenSearchAvailable() { return openSearchAvailable; }
+public boolean isSolrAvailable() { return solrAvailable; }
}
}
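Notice the adapter's error policy in the diff above: index, update, and delete calls are wrapped in try/catch and never rethrow, so a search-engine outage cannot fail the database write that triggered them. A minimal sketch of that policy, with illustrative names rather than the real `SearchServiceAdapter` API:

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of the "swallow indexing errors" policy: writes to the search
 *  engine must never propagate failures to the caller. */
public class AdapterSketch {
    interface SearchBackend { void indexStory(String id); }

    /** Backend that always fails, simulating an unreachable Solr. */
    static class FailingBackend implements SearchBackend {
        public void indexStory(String id) { throw new RuntimeException("solr down"); }
    }

    static final List<String> errors = new ArrayList<>();

    // Mirrors the shape of SearchServiceAdapter.indexStory: delegate, log, never rethrow
    static void indexStory(SearchBackend backend, String id) {
        try {
            backend.indexStory(id);
        } catch (Exception e) {
            errors.add("Failed to index story " + id + ": " + e.getMessage());
        }
    }

    public static void main(String[] args) {
        indexStory(new FailingBackend(), "42"); // does not throw
        System.out.println(errors.size());
    }
}
```

The trade-off is eventual consistency: a failed index call leaves the search index stale until the next bulk reindex, which is exactly what the post-switch reindex in LibraryService compensates for.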

File diff suppressed because it is too large


@@ -422,6 +422,18 @@ public class StoryService {
return updatedStory;
}
public Story updateContentOnly(UUID id, String contentHtml) {
Story existingStory = findById(id);
existingStory.setContentHtml(contentHtml);
Story updatedStory = storyRepository.save(existingStory);
// Update in search engine since content changed
searchServiceAdapter.updateStory(updatedStory);
return updatedStory;
}
public void delete(UUID id) {
Story story = findById(id);


@@ -4,6 +4,11 @@ spring:
username: ${SPRING_DATASOURCE_USERNAME:storycove}
password: ${SPRING_DATASOURCE_PASSWORD:password}
driver-class-name: org.postgresql.Driver
hikari:
connection-timeout: 60000 # 60 seconds
idle-timeout: 300000 # 5 minutes
max-lifetime: 1800000 # 30 minutes
maximum-pool-size: 20
jpa:
hibernate:
@@ -16,8 +21,8 @@ spring:
servlet:
multipart:
-max-file-size: 256MB # Increased for backup restore
+max-file-size: 600MB # Increased for large backup restore (425MB+)
-max-request-size: 260MB # Slightly higher to account for form data
+max-request-size: 610MB # Slightly higher to account for form data
jackson:
serialization:
@@ -27,6 +32,8 @@ spring:
server:
port: 8080
tomcat:
max-http-request-size: 650MB # Tomcat HTTP request size limit (separate from multipart)
storycove:
app:
@@ -39,54 +46,46 @@ storycove:
auth:
password: ${APP_PASSWORD} # REQUIRED: No default password for security
search:
-engine: opensearch # OpenSearch is the only search engine
+engine: solr # Apache Solr search engine
-opensearch:
+solr:
# Connection settings
-host: ${OPENSEARCH_HOST:localhost}
+url: ${SOLR_URL:http://solr:8983/solr}
-port: ${OPENSEARCH_PORT:9200}
+username: ${SOLR_USERNAME:}
-scheme: ${OPENSEARCH_SCHEME:http}
+password: ${SOLR_PASSWORD:}
-username: ${OPENSEARCH_USERNAME:}
-password: ${OPENSEARCH_PASSWORD:} # Empty when security is disabled
-# Environment-specific configuration
+# Core configuration
-profile: ${SPRING_PROFILES_ACTIVE:development} # development, staging, production
+cores:
+stories: ${SOLR_STORIES_CORE:storycove_stories}
+authors: ${SOLR_AUTHORS_CORE:storycove_authors}
-# Security settings
+# Connection settings
-security:
-ssl-verification: ${OPENSEARCH_SSL_VERIFICATION:false}
-trust-all-certificates: ${OPENSEARCH_TRUST_ALL_CERTS:true}
-keystore-path: ${OPENSEARCH_KEYSTORE_PATH:}
-keystore-password: ${OPENSEARCH_KEYSTORE_PASSWORD:}
-truststore-path: ${OPENSEARCH_TRUSTSTORE_PATH:}
-truststore-password: ${OPENSEARCH_TRUSTSTORE_PASSWORD:}
-# Connection pool settings
connection:
-timeout: ${OPENSEARCH_CONNECTION_TIMEOUT:30000} # 30 seconds
+timeout: ${SOLR_CONNECTION_TIMEOUT:30000} # 30 seconds
-socket-timeout: ${OPENSEARCH_SOCKET_TIMEOUT:60000} # 60 seconds
+socket-timeout: ${SOLR_SOCKET_TIMEOUT:60000} # 60 seconds
-max-connections-per-route: ${OPENSEARCH_MAX_CONN_PER_ROUTE:10}
+max-connections-per-route: ${SOLR_MAX_CONN_PER_ROUTE:10}
-max-connections-total: ${OPENSEARCH_MAX_CONN_TOTAL:30}
+max-connections-total: ${SOLR_MAX_CONN_TOTAL:30}
-retry-on-failure: ${OPENSEARCH_RETRY_ON_FAILURE:true}
+retry-on-failure: ${SOLR_RETRY_ON_FAILURE:true}
-max-retries: ${OPENSEARCH_MAX_RETRIES:3}
+max-retries: ${SOLR_MAX_RETRIES:3}
-# Index settings
+# Query settings
-indices:
+query:
-default-shards: ${OPENSEARCH_DEFAULT_SHARDS:1}
+default-rows: ${SOLR_DEFAULT_ROWS:10}
-default-replicas: ${OPENSEARCH_DEFAULT_REPLICAS:0}
+max-rows: ${SOLR_MAX_ROWS:1000}
-refresh-interval: ${OPENSEARCH_REFRESH_INTERVAL:1s}
+default-operator: ${SOLR_DEFAULT_OPERATOR:AND}
+highlight: ${SOLR_ENABLE_HIGHLIGHT:true}
+facets: ${SOLR_ENABLE_FACETS:true}
-# Bulk operations
+# Commit settings
-bulk:
+commit:
-actions: ${OPENSEARCH_BULK_ACTIONS:1000}
+soft-commit: ${SOLR_SOFT_COMMIT:true}
-size: ${OPENSEARCH_BULK_SIZE:5242880} # 5MB
+commit-within: ${SOLR_COMMIT_WITHIN:1000} # 1 second
-timeout: ${OPENSEARCH_BULK_TIMEOUT:10000} # 10 seconds
+wait-searcher: ${SOLR_WAIT_SEARCHER:false}
-concurrent-requests: ${OPENSEARCH_BULK_CONCURRENT:1}
# Health and monitoring
health:
-check-interval: ${OPENSEARCH_HEALTH_CHECK_INTERVAL:30000} # 30 seconds
+check-interval: ${SOLR_HEALTH_CHECK_INTERVAL:30000} # 30 seconds
-slow-query-threshold: ${OPENSEARCH_SLOW_QUERY_THRESHOLD:5000} # 5 seconds
+slow-query-threshold: ${SOLR_SLOW_QUERY_THRESHOLD:5000} # 5 seconds
-enable-metrics: ${OPENSEARCH_ENABLE_METRICS:true}
+enable-metrics: ${SOLR_ENABLE_METRICS:true}
images:
storage-path: ${IMAGE_STORAGE_PATH:/app/images}
@@ -100,8 +99,8 @@ management:
show-details: when-authorized
show-components: always
health:
-opensearch:
+solr:
-enabled: ${OPENSEARCH_HEALTH_ENABLED:true}
+enabled: ${SOLR_HEALTH_ENABLED:true}
logging:
level:
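A Spring Boot config block like the `storycove.solr` tree above is typically consumed through a typed properties class via relaxed binding (`socket-timeout` binding to `socketTimeout`, and so on). The sketch below illustrates that shape only; the class and field names are assumptions, not the project's actual properties class, and the `@ConfigurationProperties` annotation is omitted to keep it dependency-free.

```java
import java.util.Map;

/** Hedged sketch of a typed binding target for the storycove.solr block. */
public class SolrPropertiesSketch {
    private String url = "http://solr:8983/solr"; // SOLR_URL default from the config above
    private Map<String, String> cores = Map.of(
            "stories", "storycove_stories",
            "authors", "storycove_authors");
    private final Connection connection = new Connection();

    public static class Connection {
        private int timeout = 30_000;       // SOLR_CONNECTION_TIMEOUT default
        private int socketTimeout = 60_000; // SOLR_SOCKET_TIMEOUT default
        private int maxRetries = 3;         // SOLR_MAX_RETRIES default
        public int getTimeout() { return timeout; }
        public void setTimeout(int t) { timeout = t; }
        public int getSocketTimeout() { return socketTimeout; }
        public void setSocketTimeout(int t) { socketTimeout = t; }
        public int getMaxRetries() { return maxRetries; }
        public void setMaxRetries(int r) { maxRetries = r; }
    }

    public String getUrl() { return url; }
    public void setUrl(String u) { url = u; }
    public Map<String, String> getCores() { return cores; }
    public void setCores(Map<String, String> c) { cores = c; }
    public Connection getConnection() { return connection; }

    public static void main(String[] args) {
        SolrPropertiesSketch p = new SolrPropertiesSketch();
        System.out.println(p.getCores().get("stories"));
    }
}
```

Keeping defaults in one place like this means the YAML only needs to override what differs per environment.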


@@ -1,178 +0,0 @@
# OpenSearch Configuration - Best Practices Implementation
## Overview
This directory contains a production-ready OpenSearch configuration following industry best practices for security, scalability, and maintainability.
## Architecture
### 📁 Directory Structure
```
opensearch/
├── config/
│ ├── opensearch-development.yml # Development-specific settings
│ └── opensearch-production.yml # Production-specific settings
├── mappings/
│ ├── stories-mapping.json # Story index mapping
│ ├── authors-mapping.json # Author index mapping
│ └── collections-mapping.json # Collection index mapping
├── templates/
│ ├── stories-template.json # Index template for stories_*
│ └── index-lifecycle-policy.json # ILM policy for index management
└── README.md # This file
```
## ✅ Best Practices Implemented
### 🔒 **Security**
- **Environment-Aware SSL Configuration**
- Production: Full certificate validation with custom truststore support
- Development: Optional certificate validation for local development
- **Proper Authentication**: Basic auth with secure credential management
- **Connection Security**: TLS 1.3 support with hostname verification
### 🏗️ **Configuration Management**
- **Externalized Configuration**: JSON/YAML files instead of hardcoded values
- **Environment-Specific Settings**: Different configs for dev/staging/prod
- **Type-Safe Properties**: Strongly-typed configuration classes
- **Validation**: Configuration validation at startup
### 📈 **Scalability & Performance**
- **Connection Pooling**: Configurable connection pool with timeout management
- **Environment-Aware Sharding**:
- Development: 1 shard, 0 replicas (single node)
- Production: 3 shards, 1 replica (high availability)
- **Bulk Operations**: Optimized bulk indexing with configurable batch sizes
- **Index Templates**: Automatic application of settings to new indexes
### 🔄 **Index Lifecycle Management**
- **Automated Index Rollover**: Based on size, document count, and age
- **Hot-Warm-Cold Architecture**: Optimized storage costs
- **Retention Policies**: Automatic cleanup of old data
- **Force Merge**: Optimization in warm phase
### 📊 **Monitoring & Observability**
- **Health Checks**: Automatic cluster health monitoring
- **Spring Boot Actuator**: Health endpoints for monitoring systems
- **Metrics Collection**: Configurable performance metrics
- **Slow Query Detection**: Configurable thresholds for query performance
### 🛡️ **Error Handling & Resilience**
- **Connection Retry Logic**: Automatic retry with backoff
- **Circuit Breaker Pattern**: Fail-fast for unhealthy clusters
- **Graceful Degradation**: Continue serving requests when OpenSearch is unavailable
- **Detailed Error Logging**: Comprehensive error tracking
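The "retry with backoff" item above can be made concrete with a small sketch. This is an illustration of the technique the README names, not the actual client code; method names and delays are assumptions.

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Supplier;

/** Hedged sketch of automatic retry with exponential backoff. */
public class RetrySketch {
    static <T> T withRetries(Supplier<T> op, int maxRetries, long baseDelayMs) throws InterruptedException {
        RuntimeException last = null;
        for (int attempt = 0; attempt <= maxRetries; attempt++) {
            try {
                return op.get();
            } catch (RuntimeException e) {
                last = e; // remember the failure and back off before the next attempt
                if (attempt < maxRetries) {
                    Thread.sleep(baseDelayMs << attempt); // exponential: base, 2x, 4x, ...
                }
            }
        }
        throw last; // all attempts exhausted
    }

    public static void main(String[] args) throws InterruptedException {
        AtomicInteger calls = new AtomicInteger();
        // Fails twice, then succeeds, like a briefly unreachable cluster
        String result = withRetries(() -> {
            if (calls.incrementAndGet() < 3) throw new RuntimeException("connection refused");
            return "green";
        }, 3, 10);
        System.out.println(result + " after " + calls.get() + " calls");
    }
}
```

A production version would usually add jitter and a retry budget so many clients do not retry in lockstep against a recovering cluster.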
## 🚀 Usage
### Development Environment
```yaml
# application-development.yml
storycove:
opensearch:
profile: development
security:
ssl-verification: false
trust-all-certificates: true
indices:
default-shards: 1
default-replicas: 0
```
### Production Environment
```yaml
# application-production.yml
storycove:
opensearch:
profile: production
security:
ssl-verification: true
trust-all-certificates: false
truststore-path: /etc/ssl/opensearch-truststore.jks
indices:
default-shards: 3
default-replicas: 1
```
## 📋 Environment Variables
### Required
- `OPENSEARCH_PASSWORD`: Admin password for OpenSearch cluster
### Optional (with sensible defaults)
- `OPENSEARCH_HOST`: Cluster hostname (default: localhost)
- `OPENSEARCH_PORT`: Cluster port (default: 9200)
- `OPENSEARCH_USERNAME`: Admin username (default: admin)
- `OPENSEARCH_SSL_VERIFICATION`: Enable SSL verification (default: false for dev)
- `OPENSEARCH_MAX_CONN_TOTAL`: Max connections (default: 30 for dev, 200 for prod)
## 🎯 Index Templates
Index templates automatically apply configuration to new indices:
```json
{
  "index_patterns": ["stories_*"],
  "template": {
    "settings": {
      "number_of_shards": "#{ENV_SPECIFIC}",
      "analysis": {
        "analyzer": {
          "story_analyzer": {
            "type": "standard",
            "stopwords": "_english_"
          }
        }
      }
    }
  }
}
```
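The `#{ENV_SPECIFIC}` placeholder is filled in per profile before the template is submitted to the cluster's `_index_template` endpoint. A hedged sketch of that substitution — the helper name, template name, and request shape are illustrative assumptions, not the application's actual code:

```typescript
// Illustrative: resolve the environment-specific settings and build the
// request that would be PUT to /_index_template/stories_template.
// Shard/replica counts follow the dev/prod profiles documented above.
function buildStoriesTemplate(profile: "development" | "production") {
  const shards = profile === "production" ? 3 : 1;
  const replicas = profile === "production" ? 1 : 0;
  return {
    method: "PUT",
    path: "/_index_template/stories_template", // hypothetical template name
    body: {
      index_patterns: ["stories_*"],
      template: {
        settings: {
          number_of_shards: shards,
          number_of_replicas: replicas,
        },
      },
    },
  };
}
```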
## 🔍 Health Monitoring
Access health information:
- **Application Health**: `/actuator/health`
- **OpenSearch Specific**: `/actuator/health/opensearch`
- **Detailed Metrics**: Available when `enable-metrics: true`
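Actuator reports each component's status as `"UP"`, `"DOWN"`, and so on. A minimal sketch of consuming that payload from a monitoring script — the interface mirrors Actuator's JSON shape, but the check itself is illustrative:

```typescript
// Shape of the relevant slice of /actuator/health output.
interface ActuatorHealth {
  status: string;
  components?: { opensearch?: { status: string } };
}

// Illustrative check: treat anything other than "UP" on the opensearch
// component (including a missing component) as unhealthy.
function openSearchHealthy(health: ActuatorHealth): boolean {
  return health.components?.opensearch?.status === "UP";
}
```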
## 🔄 Deployment Strategy
Recommended deployment approach:
1. **Development**: Test OpenSearch configuration locally
2. **Staging**: Validate performance and accuracy in staging environment
3. **Production**: Deploy with proper monitoring and backup procedures
## 🛠️ Troubleshooting
### Common Issues
1. **SSL Certificate Errors**
- Development: Set `trust-all-certificates: true`
- Production: Provide valid truststore path
2. **Connection Timeouts**
- Increase `connection.timeout` values
- Check network connectivity and firewall rules
3. **Index Creation Failures**
- Verify cluster health with `/actuator/health/opensearch`
- Check OpenSearch logs for detailed error messages
4. **Performance Issues**
- Monitor slow queries with configurable thresholds
- Adjust bulk operation settings
- Review shard allocation and replica settings
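Thresholds and timeouts throughout these configs are duration strings (`"5s"`, `"30s"`, `"7d"`, `"1y"`). A sketch of how such values map to milliseconds, assuming whole-number values and a fixed 365-day year:

```typescript
// Illustrative parser for the duration strings used in these configs.
// Assumes whole-number values; "1y" is approximated as 365 days.
function parseDurationMs(value: string): number {
  const match = /^(\d+)(ms|s|m|h|d|y)$/.exec(value);
  if (!match) throw new Error(`Unrecognized duration: ${value}`);
  const units: Record<string, number> = {
    ms: 1,
    s: 1000,
    m: 60_000,
    h: 3_600_000,
    d: 86_400_000,
    y: 31_536_000_000,
  };
  return Number(match[1]) * units[match[2]];
}
```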
## 🔮 Future Enhancements
- **Multi-Cluster Support**: Connect to multiple OpenSearch clusters
- **Advanced Security**: Integration with OpenSearch Security plugin
- **Custom Analyzers**: Domain-specific text analysis
- **Index Aliases**: Zero-downtime index updates
- **Machine Learning**: Integration with OpenSearch ML features
---
This configuration provides a solid foundation that scales from development to enterprise production environments while maintaining security, performance, and operational excellence.

View File

@@ -1,32 +0,0 @@
# OpenSearch Development Configuration
opensearch:
  cluster:
    name: "storycove-dev"
    initial_master_nodes: ["opensearch-node"]

  # Development settings - single node, minimal resources
  indices:
    default_settings:
      number_of_shards: 1
      number_of_replicas: 0
      refresh_interval: "1s"

  # Security settings for development
  security:
    ssl_verification: false
    trust_all_certificates: true

  # Connection settings
  connection:
    timeout: "30s"
    socket_timeout: "60s"
    max_connections_per_route: 10
    max_connections_total: 30

  # Index management
  index_management:
    auto_create_templates: true
    template_patterns:
      stories: "stories_*"
      authors: "authors_*"
      collections: "collections_*"

View File

@@ -1,60 +0,0 @@
# OpenSearch Production Configuration
opensearch:
  cluster:
    name: "storycove-prod"

  # Production settings - multi-shard, with replicas
  indices:
    default_settings:
      number_of_shards: 3
      number_of_replicas: 1
      refresh_interval: "30s"
      max_result_window: 50000

    # Index lifecycle policies
    lifecycle:
      hot_phase_duration: "7d"
      warm_phase_duration: "30d"
      cold_phase_duration: "90d"
      delete_after: "1y"

  # Security settings for production
  security:
    ssl_verification: true
    trust_all_certificates: false
    certificate_verification: true
    tls_version: "TLSv1.3"

  # Connection settings
  connection:
    timeout: "10s"
    socket_timeout: "30s"
    max_connections_per_route: 50
    max_connections_total: 200
    retry_on_failure: true
    max_retries: 3
    retry_delay: "1s"

  # Performance tuning
  performance:
    bulk_actions: 1000
    bulk_size: "5MB"
    bulk_timeout: "10s"
    concurrent_requests: 4

  # Monitoring and observability
  monitoring:
    health_check_interval: "30s"
    slow_query_threshold: "5s"
    enable_metrics: true

  # Index management
  index_management:
    auto_create_templates: true
    template_patterns:
      stories: "stories_*"
      authors: "authors_*"
      collections: "collections_*"
    retention_policy:
      enabled: true
      default_retention: "1y"

View File

@@ -1,79 +0,0 @@
{
"settings": {
"number_of_shards": 1,
"number_of_replicas": 0,
"analysis": {
"analyzer": {
"name_analyzer": {
"type": "standard",
"stopwords": "_english_"
},
"autocomplete_analyzer": {
"type": "custom",
"tokenizer": "standard",
"filter": ["lowercase", "edge_ngram"]
}
},
"filter": {
"edge_ngram": {
"type": "edge_ngram",
"min_gram": 2,
"max_gram": 20
}
}
}
},
"mappings": {
"properties": {
"id": {
"type": "keyword"
},
"name": {
"type": "text",
"analyzer": "name_analyzer",
"fields": {
"autocomplete": {
"type": "text",
"analyzer": "autocomplete_analyzer"
},
"keyword": {
"type": "keyword"
}
}
},
"bio": {
"type": "text",
"analyzer": "name_analyzer"
},
"urls": {
"type": "keyword"
},
"imageUrl": {
"type": "keyword"
},
"storyCount": {
"type": "integer"
},
"averageRating": {
"type": "float"
},
"totalWordCount": {
"type": "long"
},
"totalReadingTime": {
"type": "integer"
},
"createdAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"updatedAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"libraryId": {
"type": "keyword"
}
}
}
}

View File

@@ -1,73 +0,0 @@
{
"settings": {
"number_of_shards": 1,
"number_of_replicas": 0,
"analysis": {
"analyzer": {
"collection_analyzer": {
"type": "standard",
"stopwords": "_english_"
},
"autocomplete_analyzer": {
"type": "custom",
"tokenizer": "standard",
"filter": ["lowercase", "edge_ngram"]
}
},
"filter": {
"edge_ngram": {
"type": "edge_ngram",
"min_gram": 2,
"max_gram": 20
}
}
}
},
"mappings": {
"properties": {
"id": {
"type": "keyword"
},
"name": {
"type": "text",
"analyzer": "collection_analyzer",
"fields": {
"autocomplete": {
"type": "text",
"analyzer": "autocomplete_analyzer"
},
"keyword": {
"type": "keyword"
}
}
},
"description": {
"type": "text",
"analyzer": "collection_analyzer"
},
"storyCount": {
"type": "integer"
},
"totalWordCount": {
"type": "long"
},
"averageRating": {
"type": "float"
},
"isPublic": {
"type": "boolean"
},
"createdAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"updatedAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"libraryId": {
"type": "keyword"
}
}
}
}

View File

@@ -1,120 +0,0 @@
{
"settings": {
"number_of_shards": 1,
"number_of_replicas": 0,
"analysis": {
"analyzer": {
"story_analyzer": {
"type": "standard",
"stopwords": "_english_"
},
"autocomplete_analyzer": {
"type": "custom",
"tokenizer": "standard",
"filter": ["lowercase", "edge_ngram"]
}
},
"filter": {
"edge_ngram": {
"type": "edge_ngram",
"min_gram": 2,
"max_gram": 20
}
}
}
},
"mappings": {
"properties": {
"id": {
"type": "keyword"
},
"title": {
"type": "text",
"analyzer": "story_analyzer",
"fields": {
"autocomplete": {
"type": "text",
"analyzer": "autocomplete_analyzer"
},
"keyword": {
"type": "keyword"
}
}
},
"content": {
"type": "text",
"analyzer": "story_analyzer"
},
"summary": {
"type": "text",
"analyzer": "story_analyzer"
},
"authorNames": {
"type": "text",
"analyzer": "story_analyzer",
"fields": {
"keyword": {
"type": "keyword"
}
}
},
"authorIds": {
"type": "keyword"
},
"tagNames": {
"type": "keyword"
},
"seriesTitle": {
"type": "text",
"analyzer": "story_analyzer",
"fields": {
"keyword": {
"type": "keyword"
}
}
},
"seriesId": {
"type": "keyword"
},
"wordCount": {
"type": "integer"
},
"rating": {
"type": "float"
},
"readingTime": {
"type": "integer"
},
"language": {
"type": "keyword"
},
"status": {
"type": "keyword"
},
"createdAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"updatedAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"publishedAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"isRead": {
"type": "boolean"
},
"isFavorite": {
"type": "boolean"
},
"readingProgress": {
"type": "float"
},
"libraryId": {
"type": "keyword"
}
}
}
}

View File

@@ -1,77 +0,0 @@
{
"policy": {
"description": "StoryCove index lifecycle policy",
"default_state": "hot",
"states": [
{
"name": "hot",
"actions": [
{
"rollover": {
"min_size": "50gb",
"min_doc_count": 1000000,
"min_age": "7d"
}
}
],
"transitions": [
{
"state_name": "warm",
"conditions": {
"min_age": "7d"
}
}
]
},
{
"name": "warm",
"actions": [
{
"replica_count": {
"number_of_replicas": 0
}
},
{
"force_merge": {
"max_num_segments": 1
}
}
],
"transitions": [
{
"state_name": "cold",
"conditions": {
"min_age": "30d"
}
}
]
},
{
"name": "cold",
"actions": [],
"transitions": [
{
"state_name": "delete",
"conditions": {
"min_age": "365d"
}
}
]
},
{
"name": "delete",
"actions": [
{
"delete": {}
}
]
}
],
"ism_template": [
{
"index_patterns": ["stories_*", "authors_*", "collections_*"],
"priority": 100
}
]
}
}

View File

@@ -1,124 +0,0 @@
{
"index_patterns": ["stories_*"],
"priority": 1,
"template": {
"settings": {
"number_of_shards": 1,
"number_of_replicas": 0,
"analysis": {
"analyzer": {
"story_analyzer": {
"type": "standard",
"stopwords": "_english_"
},
"autocomplete_analyzer": {
"type": "custom",
"tokenizer": "standard",
"filter": ["lowercase", "edge_ngram"]
}
},
"filter": {
"edge_ngram": {
"type": "edge_ngram",
"min_gram": 2,
"max_gram": 20
}
}
}
},
"mappings": {
"properties": {
"id": {
"type": "keyword"
},
"title": {
"type": "text",
"analyzer": "story_analyzer",
"fields": {
"autocomplete": {
"type": "text",
"analyzer": "autocomplete_analyzer"
},
"keyword": {
"type": "keyword"
}
}
},
"content": {
"type": "text",
"analyzer": "story_analyzer"
},
"summary": {
"type": "text",
"analyzer": "story_analyzer"
},
"authorNames": {
"type": "text",
"analyzer": "story_analyzer",
"fields": {
"keyword": {
"type": "keyword"
}
}
},
"authorIds": {
"type": "keyword"
},
"tagNames": {
"type": "keyword"
},
"seriesTitle": {
"type": "text",
"analyzer": "story_analyzer",
"fields": {
"keyword": {
"type": "keyword"
}
}
},
"seriesId": {
"type": "keyword"
},
"wordCount": {
"type": "integer"
},
"rating": {
"type": "float"
},
"readingTime": {
"type": "integer"
},
"language": {
"type": "keyword"
},
"status": {
"type": "keyword"
},
"createdAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"updatedAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"publishedAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"isRead": {
"type": "boolean"
},
"isFavorite": {
"type": "boolean"
},
"readingProgress": {
"type": "float"
},
"libraryId": {
"type": "keyword"
}
}
}
}
}

View File

@@ -19,11 +19,14 @@ storycove:
   auth:
     password: test-password
   search:
-    engine: opensearch
+    engine: solr
-  opensearch:
+  solr:
     host: localhost
-    port: 9200
+    port: 8983
     scheme: http
+    cores:
+      stories: storycove_stories
+      authors: storycove_authors
   images:
     storage-path: /tmp/test-images

View File

@@ -34,10 +34,10 @@ services:
       - SPRING_DATASOURCE_USERNAME=storycove
       - SPRING_DATASOURCE_PASSWORD=${DB_PASSWORD}
       - JWT_SECRET=${JWT_SECRET}
-      - OPENSEARCH_HOST=opensearch
+      - SOLR_HOST=solr
-      - OPENSEARCH_PORT=9200
+      - SOLR_PORT=8983
-      - OPENSEARCH_SCHEME=http
+      - SOLR_SCHEME=http
-      - SEARCH_ENGINE=${SEARCH_ENGINE:-opensearch}
+      - SEARCH_ENGINE=${SEARCH_ENGINE:-solr}
       - IMAGE_STORAGE_PATH=/app/images
       - APP_PASSWORD=${APP_PASSWORD}
       - STORYCOVE_CORS_ALLOWED_ORIGINS=${STORYCOVE_CORS_ALLOWED_ORIGINS:-http://localhost:3000,http://localhost:6925}
@@ -45,8 +45,10 @@ services:
       - images_data:/app/images
       - library_config:/app/config
     depends_on:
-      - postgres
-      - opensearch
+      postgres:
+        condition: service_started
+      solr:
+        condition: service_started
     networks:
       - storycove-network
@@ -65,45 +67,38 @@ services:
       - storycove-network

-  opensearch:
-    image: opensearchproject/opensearch:3.2.0
-    # No port mapping - only accessible within the Docker network
+  solr:
+    build:
+      context: .
+      dockerfile: solr.Dockerfile
+    ports:
+      - "8983:8983" # Expose Solr Admin UI for development
     environment:
-      - cluster.name=storycove-opensearch
-      - node.name=opensearch-node
-      - discovery.type=single-node
-      - bootstrap.memory_lock=false
-      - "OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx512m"
-      - "DISABLE_INSTALL_DEMO_CONFIG=true"
-      - "DISABLE_SECURITY_PLUGIN=true"
+      - SOLR_HEAP=512m
+      - SOLR_JAVA_MEM=-Xms256m -Xmx512m
-    ulimits:
-      memlock:
-        soft: -1
-        hard: -1
-      nofile:
-        soft: 65536
-        hard: 65536
     volumes:
-      - opensearch_data:/usr/share/opensearch/data
+      - solr_data:/var/solr
-    deploy:
-      resources:
-        limits:
-          memory: 1G
-        reservations:
-          memory: 512M
-    stop_grace_period: 30s
+    healthcheck:
+      test: ["CMD-SHELL", "curl -f http://localhost:8983/solr/admin/ping || exit 1"]
+      interval: 30s
+      timeout: 10s
+      retries: 5
+      start_period: 60s
     networks:
       - storycove-network
     restart: unless-stopped

-  opensearch-dashboards:
-    image: opensearchproject/opensearch-dashboards:3.2.0
-    ports:
-      - "5601:5601" # Expose OpenSearch Dashboard
-    environment:
-      - OPENSEARCH_HOSTS=http://opensearch:9200
-      - "DISABLE_SECURITY_DASHBOARDS_PLUGIN=true"
-    depends_on:
-      - opensearch
-    networks:
-      - storycove-network

 volumes:
   postgres_data:
-  opensearch_data:
+  solr_data:
   images_data:
   library_config:
@@ -122,7 +117,7 @@ configs:
     }
     server {
       listen 80;
-      client_max_body_size 256M;
+      client_max_body_size 600M;
       location / {
         proxy_pass http://frontend;
         proxy_http_version 1.1;
@@ -140,9 +135,13 @@ configs:
         proxy_set_header X-Real-IP $$remote_addr;
         proxy_set_header X-Forwarded-For $$proxy_add_x_forwarded_for;
         proxy_set_header X-Forwarded-Proto $$scheme;
-        proxy_connect_timeout 60s;
+        proxy_connect_timeout 900s;
-        proxy_send_timeout 60s;
+        proxy_send_timeout 900s;
-        proxy_read_timeout 60s;
+        proxy_read_timeout 900s;
+        # Large upload settings
+        client_max_body_size 600M;
+        proxy_request_buffering off;
+        proxy_max_temp_file_size 0;
       }
       location /images/ {
         alias /app/images/;

View File

@@ -9,7 +9,7 @@ RUN apk add --no-cache dumb-init
 COPY package*.json ./

 # Install dependencies with optimized settings
-RUN npm ci --prefer-offline --no-audit --frozen-lockfile
+RUN npm install --prefer-offline --no-audit --legacy-peer-deps

 # Build stage
 FROM node:18-alpine AS builder
@@ -20,12 +20,23 @@ COPY --from=deps /app/node_modules ./node_modules
 COPY . .

 # Set Node.js memory limit for build
-ENV NODE_OPTIONS="--max-old-space-size=1024"
+ENV NODE_OPTIONS="--max-old-space-size=2048"
 ENV NEXT_TELEMETRY_DISABLED=1

-# Build the application
+# List files to ensure everything is copied correctly
+RUN ls -la
+# Force clean build - remove any cached build artifacts
+RUN rm -rf .next || true
+# Build the application with verbose logging
 RUN npm run build
+# Verify the build output exists
+RUN ls -la .next/ || (echo ".next directory not found!" && exit 1)
+RUN ls -la .next/standalone/ || (echo ".next/standalone directory not found!" && cat build.log && exit 1)
+RUN ls -la .next/static/ || (echo ".next/static directory not found!" && exit 1)

 # Production stage
 FROM node:18-alpine AS runner
 WORKDIR /app

View File

@@ -2,4 +2,4 @@
 /// <reference types="next/image-types/global" />

 // NOTE: This file should not be edited
-// see https://nextjs.org/docs/basic-features/typescript for more information.
+// see https://nextjs.org/docs/app/building-your-application/configuring/typescript for more information.

View File

@@ -2,6 +2,7 @@
 const nextConfig = {
   // Enable standalone output for optimized Docker builds
   output: 'standalone',
+  // Note: Body size limits are handled by nginx and backend, not Next.js frontend
   // Removed Next.js rewrites since nginx handles all API routing
   webpack: (config, { isServer }) => {
     // Exclude cheerio and its dependencies from client-side bundling

File diff suppressed because it is too large

View File

@@ -10,18 +10,23 @@
"type-check": "tsc --noEmit" "type-check": "tsc --noEmit"
}, },
"dependencies": { "dependencies": {
"@heroicons/react": "^2.2.0", "@heroicons/react": "^2.2.0",
"autoprefixer": "^10.4.16", "autoprefixer": "^10.4.16",
"axios": "^1.11.0", "axios": "^1.7.7",
"cheerio": "^1.0.0-rc.12", "cheerio": "^1.0.0-rc.12",
"dompurify": "^3.2.6", "dompurify": "^3.2.6",
"next": "14.0.0", "next": "^14.2.32",
"postcss": "^8.4.31", "postcss": "^8.4.31",
"react": "^18", "react": "^18",
"react-dom": "^18", "react-dom": "^18",
"react-dropzone": "^14.2.3", "react-dropzone": "^14.2.3",
"server-only": "^0.0.1", "rxjs": "^7.8.1",
"tailwindcss": "^3.3.0" "server-only": "^0.0.1",
"slate": "^0.118.1",
"slate-react": "^0.117.4",
"slate-history": "^0.113.1",
"slate-dom": "^0.117.0",
"tailwindcss": "^3.3.0"
}, },
"devDependencies": { "devDependencies": {
"@types/dompurify": "^3.0.5", "@types/dompurify": "^3.0.5",

View File

@@ -0,0 +1,37 @@
{
"name": "storycove-frontend",
"version": "0.1.0",
"private": true,
"scripts": {
"dev": "next dev",
"build": "next build",
"start": "next start",
"lint": "next lint",
"type-check": "tsc --noEmit"
},
"dependencies": {
"@heroicons/react": "^2.2.0",
"@portabletext/react": "4.0.3",
"@portabletext/types": "2.0.14",
"autoprefixer": "^10.4.16",
"axios": "^1.11.0",
"cheerio": "^1.0.0-rc.12",
"dompurify": "^3.2.6",
"next": "14.0.0",
"postcss": "^8.4.31",
"react": "^18",
"react-dom": "^18",
"react-dropzone": "^14.2.3",
"server-only": "^0.0.1",
"tailwindcss": "^3.3.0"
},
"devDependencies": {
"@types/dompurify": "^3.0.5",
"@types/node": "^20",
"@types/react": "^18",
"@types/react-dom": "^18",
"eslint": "^8",
"eslint-config-next": "14.0.0",
"typescript": "^5"
}
}

View File

@@ -0,0 +1,550 @@
'use client';
import { useState, useEffect } from 'react';
import { useRouter, useSearchParams } from 'next/navigation';
import { useAuth } from '../../contexts/AuthContext';
import { Input, Textarea } from '../../components/ui/Input';
import Button from '../../components/ui/Button';
import TagInput from '../../components/stories/TagInput';
import SlateEditor from '../../components/stories/SlateEditor';
import ImageUpload from '../../components/ui/ImageUpload';
import AuthorSelector from '../../components/stories/AuthorSelector';
import SeriesSelector from '../../components/stories/SeriesSelector';
import { storyApi, authorApi } from '../../lib/api';
export default function AddStoryContent() {
const [formData, setFormData] = useState({
title: '',
summary: '',
authorName: '',
authorId: undefined as string | undefined,
contentHtml: '',
sourceUrl: '',
tags: [] as string[],
seriesName: '',
seriesId: undefined as string | undefined,
volume: '',
});
const [coverImage, setCoverImage] = useState<File | null>(null);
const [loading, setLoading] = useState(false);
const [processingImages, setProcessingImages] = useState(false);
const [errors, setErrors] = useState<Record<string, string>>({});
const [duplicateWarning, setDuplicateWarning] = useState<{
show: boolean;
count: number;
duplicates: Array<{
id: string;
title: string;
authorName: string;
createdAt: string;
}>;
}>({ show: false, count: 0, duplicates: [] });
const [checkingDuplicates, setCheckingDuplicates] = useState(false);
const router = useRouter();
const searchParams = useSearchParams();
const { isAuthenticated } = useAuth();
// Handle URL parameters
useEffect(() => {
const authorId = searchParams.get('authorId');
const from = searchParams.get('from');
// Pre-fill author if authorId is provided in URL
if (authorId) {
const loadAuthor = async () => {
try {
const author = await authorApi.getAuthor(authorId);
setFormData(prev => ({
...prev,
authorName: author.name,
authorId: author.id
}));
} catch (error) {
console.error('Failed to load author:', error);
}
};
loadAuthor();
}
// Handle URL import data
if (from === 'url-import') {
const title = searchParams.get('title') || '';
const summary = searchParams.get('summary') || '';
const author = searchParams.get('author') || '';
const sourceUrl = searchParams.get('sourceUrl') || '';
const tagsParam = searchParams.get('tags');
const content = searchParams.get('content') || '';
let tags: string[] = [];
try {
tags = tagsParam ? JSON.parse(tagsParam) : [];
} catch (error) {
console.error('Failed to parse tags:', error);
tags = [];
}
setFormData(prev => ({
...prev,
title,
summary,
authorName: author,
authorId: undefined, // Reset author ID when importing from URL
contentHtml: content,
sourceUrl,
tags
}));
// Show success message
setErrors({ success: 'Story data imported successfully! Review and edit as needed before saving.' });
}
}, [searchParams]);
// Load pending story data from bulk combine operation
useEffect(() => {
const fromBulkCombine = searchParams.get('from') === 'bulk-combine';
if (fromBulkCombine) {
const pendingStoryData = localStorage.getItem('pendingStory');
if (pendingStoryData) {
try {
const storyData = JSON.parse(pendingStoryData);
setFormData(prev => ({
...prev,
title: storyData.title || '',
authorName: storyData.author || '',
authorId: undefined, // Reset author ID for bulk combined stories
contentHtml: storyData.content || '',
sourceUrl: storyData.sourceUrl || '',
summary: storyData.summary || '',
tags: storyData.tags || []
}));
// Clear the pending data
localStorage.removeItem('pendingStory');
} catch (error) {
console.error('Failed to load pending story data:', error);
}
}
}
}, [searchParams]);
// Check for duplicates when title and author are both present
useEffect(() => {
const checkDuplicates = async () => {
const title = formData.title.trim();
const authorName = formData.authorName.trim();
// Don't check if user isn't authenticated or if title/author are empty
if (!isAuthenticated || !title || !authorName) {
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
return;
}
// Debounce the check to avoid too many API calls
const timeoutId = setTimeout(async () => {
try {
setCheckingDuplicates(true);
const result = await storyApi.checkDuplicate(title, authorName);
if (result.hasDuplicates) {
setDuplicateWarning({
show: true,
count: result.count,
duplicates: result.duplicates
});
} else {
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
}
} catch (error) {
console.error('Failed to check for duplicates:', error);
// Clear any existing duplicate warnings on error
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
// Don't show error to user as this is just a helpful warning
// Authentication errors will be handled by the API interceptor
} finally {
setCheckingDuplicates(false);
}
}, 500); // 500ms debounce
return () => clearTimeout(timeoutId);
};
checkDuplicates();
}, [formData.title, formData.authorName, isAuthenticated]);
const handleInputChange = (field: string) => (
e: React.ChangeEvent<HTMLInputElement | HTMLTextAreaElement>
) => {
setFormData(prev => ({
...prev,
[field]: e.target.value
}));
// Clear error when user starts typing
if (errors[field]) {
setErrors(prev => ({ ...prev, [field]: '' }));
}
};
const handleContentChange = (html: string) => {
setFormData(prev => ({ ...prev, contentHtml: html }));
if (errors.contentHtml) {
setErrors(prev => ({ ...prev, contentHtml: '' }));
}
};
const handleTagsChange = (tags: string[]) => {
setFormData(prev => ({ ...prev, tags }));
};
const handleAuthorChange = (authorName: string, authorId?: string) => {
setFormData(prev => ({
...prev,
authorName,
authorId: authorId // This will be undefined if creating new author, which clears the existing ID
}));
// Clear error when user changes author
if (errors.authorName) {
setErrors(prev => ({ ...prev, authorName: '' }));
}
};
const handleSeriesChange = (seriesName: string, seriesId?: string) => {
setFormData(prev => ({
...prev,
seriesName,
seriesId: seriesId // This will be undefined if creating new series, which clears the existing ID
}));
// Clear error when user changes series
if (errors.seriesName) {
setErrors(prev => ({ ...prev, seriesName: '' }));
}
};
const validateForm = () => {
const newErrors: Record<string, string> = {};
if (!formData.title.trim()) {
newErrors.title = 'Title is required';
}
if (!formData.authorName.trim()) {
newErrors.authorName = 'Author name is required';
}
if (!formData.contentHtml.trim()) {
newErrors.contentHtml = 'Story content is required';
}
if (formData.seriesName && !formData.volume) {
newErrors.volume = 'Volume number is required when series is specified';
}
if (formData.volume && !formData.seriesName.trim()) {
newErrors.seriesName = 'Series name is required when volume is specified';
}
setErrors(newErrors);
return Object.keys(newErrors).length === 0;
};
// Helper function to detect external images in HTML content
const hasExternalImages = (htmlContent: string): boolean => {
if (!htmlContent) return false;
// Create a temporary DOM element to parse HTML
const tempDiv = document.createElement('div');
tempDiv.innerHTML = htmlContent;
const images = tempDiv.querySelectorAll('img');
for (let i = 0; i < images.length; i++) {
const img = images[i];
const src = img.getAttribute('src');
if (src && (src.startsWith('http://') || src.startsWith('https://'))) {
return true;
}
}
return false;
};
const handleSubmit = async (e: React.FormEvent) => {
e.preventDefault();
if (!validateForm()) {
return;
}
setLoading(true);
try {
// First, create the story with JSON data
const storyData = {
title: formData.title,
summary: formData.summary || undefined,
contentHtml: formData.contentHtml,
sourceUrl: formData.sourceUrl || undefined,
volume: formData.seriesName ? parseInt(formData.volume) : undefined,
// Send seriesId if we have it (existing series), otherwise send seriesName (new series)
...(formData.seriesId ? { seriesId: formData.seriesId } : { seriesName: formData.seriesName || undefined }),
// Send authorId if we have it (existing author), otherwise send authorName (new author)
...(formData.authorId ? { authorId: formData.authorId } : { authorName: formData.authorName }),
tagNames: formData.tags.length > 0 ? formData.tags : undefined,
};
const story = await storyApi.createStory(storyData);
// Process images if there are external images in the content
if (hasExternalImages(formData.contentHtml)) {
try {
setProcessingImages(true);
const imageResult = await storyApi.processContentImages(story.id, formData.contentHtml);
// If images were processed and content was updated, save the updated content
if (imageResult.processedContent !== formData.contentHtml) {
await storyApi.updateStory(story.id, {
title: formData.title,
summary: formData.summary || undefined,
contentHtml: imageResult.processedContent,
sourceUrl: formData.sourceUrl || undefined,
volume: formData.seriesName ? parseInt(formData.volume) : undefined,
...(formData.seriesId ? { seriesId: formData.seriesId } : { seriesName: formData.seriesName || undefined }),
...(formData.authorId ? { authorId: formData.authorId } : { authorName: formData.authorName }),
tagNames: formData.tags.length > 0 ? formData.tags : undefined,
});
// Show success message with image processing info
if (imageResult.downloadedImages.length > 0) {
console.log(`Successfully processed ${imageResult.downloadedImages.length} images`);
}
if (imageResult.warnings && imageResult.warnings.length > 0) {
console.warn('Image processing warnings:', imageResult.warnings);
}
}
} catch (imageError) {
console.error('Failed to process images:', imageError);
// Don't fail the entire operation if image processing fails
// The story was created successfully, just without processed images
} finally {
setProcessingImages(false);
}
}
// If there's a cover image, upload it separately
if (coverImage) {
await storyApi.uploadCover(story.id, coverImage);
}
router.push(`/stories/${story.id}/detail`);
} catch (error: any) {
console.error('Failed to create story:', error);
const errorMessage = error.response?.data?.message || 'Failed to create story';
setErrors({ submit: errorMessage });
} finally {
setLoading(false);
}
};
return (
<>
{/* Success Message */}
{errors.success && (
<div className="p-4 bg-green-50 dark:bg-green-900/20 border border-green-200 dark:border-green-800 rounded-lg mb-6">
<p className="text-green-800 dark:text-green-200">{errors.success}</p>
</div>
)}
<form onSubmit={handleSubmit} className="space-y-6">
{/* Title */}
<Input
label="Title *"
value={formData.title}
onChange={handleInputChange('title')}
placeholder="Enter the story title"
error={errors.title}
required
/>
{/* Author Selector */}
<AuthorSelector
label="Author *"
value={formData.authorName}
onChange={handleAuthorChange}
placeholder="Select or enter author name"
error={errors.authorName}
required
/>
{/* Duplicate Warning */}
{duplicateWarning.show && (
<div className="p-4 bg-yellow-50 dark:bg-yellow-900/20 border border-yellow-200 dark:border-yellow-800 rounded-lg">
<div className="flex items-start gap-3">
<div className="text-yellow-600 dark:text-yellow-400 mt-0.5">
</div>
<div>
<h4 className="font-medium text-yellow-800 dark:text-yellow-200">
Potential Duplicate Detected
</h4>
<p className="text-sm text-yellow-700 dark:text-yellow-300 mt-1">
Found {duplicateWarning.count} existing {duplicateWarning.count === 1 ? 'story' : 'stories'} with the same title and author:
</p>
<ul className="mt-2 space-y-1">
{duplicateWarning.duplicates.map((duplicate, index) => (
<li key={duplicate.id} className="text-sm text-yellow-700 dark:text-yellow-300">
<span className="font-medium">{duplicate.title}</span> by {duplicate.authorName}
<span className="text-xs ml-2">
(added {new Date(duplicate.createdAt).toLocaleDateString()})
</span>
</li>
))}
</ul>
<p className="text-xs text-yellow-600 dark:text-yellow-400 mt-2">
You can still create this story if it's different from the existing ones.
</p>
</div>
</div>
</div>
)}
{/* Checking indicator */}
{checkingDuplicates && (
<div className="flex items-center gap-2 text-sm theme-text">
<div className="animate-spin w-4 h-4 border-2 border-theme-accent border-t-transparent rounded-full"></div>
Checking for duplicates...
</div>
)}
{/* Summary */}
<div>
<label className="block text-sm font-medium theme-header mb-2">
Summary
</label>
<Textarea
value={formData.summary}
onChange={handleInputChange('summary')}
placeholder="Brief summary or description of the story..."
rows={3}
/>
<p className="text-sm theme-text mt-1">
Optional summary that will be displayed on the story detail page
</p>
</div>
{/* Cover Image Upload */}
<div>
<label className="block text-sm font-medium theme-header mb-2">
Cover Image
</label>
<ImageUpload
onImageSelect={setCoverImage}
accept="image/jpeg,image/png"
maxSizeMB={5}
aspectRatio="3:4"
placeholder="Drop a cover image here or click to select"
/>
</div>
{/* Content */}
<div>
<label className="block text-sm font-medium theme-header mb-2">
Story Content *
</label>
<SlateEditor
value={formData.contentHtml}
onChange={handleContentChange}
placeholder="Write or paste your story content here..."
error={errors.contentHtml}
enableImageProcessing={false}
/>
<p className="text-sm theme-text mt-2">
💡 <strong>Tip:</strong> If you paste content with images, they'll be automatically downloaded and stored locally when you save the story.
</p>
</div>
{/* Tags */}
<div>
<label className="block text-sm font-medium theme-header mb-2">
Tags
</label>
<TagInput
tags={formData.tags}
onChange={handleTagsChange}
placeholder="Add tags to categorize your story..."
/>
</div>
{/* Series and Volume */}
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
<SeriesSelector
label="Series (optional)"
value={formData.seriesName}
onChange={handleSeriesChange}
placeholder="Select or enter series name if part of a series"
error={errors.seriesName}
authorId={formData.authorId}
/>
<Input
label="Volume/Part (optional)"
type="number"
min="1"
value={formData.volume}
onChange={handleInputChange('volume')}
placeholder="Enter volume/part number"
error={errors.volume}
/>
</div>
{/* Source URL */}
<Input
label="Source URL (optional)"
type="url"
value={formData.sourceUrl}
onChange={handleInputChange('sourceUrl')}
placeholder="https://example.com/original-story-url"
/>
{/* Image Processing Indicator */}
{processingImages && (
<div className="p-4 bg-blue-50 dark:bg-blue-900/20 border border-blue-200 dark:border-blue-800 rounded-lg">
<div className="flex items-center gap-3">
<div className="animate-spin w-4 h-4 border-2 border-blue-500 border-t-transparent rounded-full"></div>
<p className="text-blue-800 dark:text-blue-200">
Processing and downloading images...
</p>
</div>
</div>
)}
{/* Submit Error */}
{errors.submit && (
<div className="p-4 bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-lg">
<p className="text-red-800 dark:text-red-200">{errors.submit}</p>
</div>
)}
{/* Actions */}
<div className="flex justify-end gap-4 pt-6">
<Button
type="button"
variant="ghost"
onClick={() => router.back()}
disabled={loading}
>
Cancel
</Button>
<Button
type="submit"
loading={loading}
disabled={!formData.title || !formData.authorName || !formData.contentHtml}
>
{processingImages ? 'Processing Images...' : 'Add Story'}
</Button>
</div>
</form>
</>
);
}
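The tip above promises that pasted external images are downloaded and stored locally on save; the submit path only triggers that processing when it actually finds an external image in the content. As a rough standalone sketch of that detection — hypothetical, since it uses a regex where the page's own helper parses the HTML through a temporary DOM element — it might look like:

```typescript
// Hypothetical regex-based sketch of external-image detection. The real
// helper walks <img> elements via the DOM; this approximation also runs
// outside the browser, at the cost of matching attributes like data-src.
function hasExternalImages(html: string): boolean {
  if (!html) return false;
  // Match an <img ...> tag whose src attribute points at an http(s) URL.
  const externalImg = /<img\b[^>]*\bsrc\s*=\s*["']?https?:\/\//i;
  return externalImg.test(html);
}
```

Relative or root-relative `src` values (already-local images) deliberately do not match, so saving an unmodified story skips the image-processing round trip.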

View File

@@ -1,554 +1,23 @@
'use client';
import { useState, useEffect } from 'react';
import { Suspense } from 'react';
import { useRouter, useSearchParams } from 'next/navigation';
import { useAuth } from '../../contexts/AuthContext';
import ImportLayout from '../../components/layout/ImportLayout';
import { Input, Textarea } from '../../components/ui/Input';
import LoadingSpinner from '../../components/ui/LoadingSpinner';
import Button from '../../components/ui/Button';
import AddStoryContent from './AddStoryContent';
import TagInput from '../../components/stories/TagInput';
import RichTextEditor from '../../components/stories/RichTextEditor';
import ImageUpload from '../../components/ui/ImageUpload';
import AuthorSelector from '../../components/stories/AuthorSelector';
import SeriesSelector from '../../components/stories/SeriesSelector';
import { storyApi, authorApi } from '../../lib/api';
export default function AddStoryPage() {
const [formData, setFormData] = useState({
title: '',
summary: '',
authorName: '',
authorId: undefined as string | undefined,
contentHtml: '',
sourceUrl: '',
tags: [] as string[],
seriesName: '',
seriesId: undefined as string | undefined,
volume: '',
});
const [coverImage, setCoverImage] = useState<File | null>(null);
const [loading, setLoading] = useState(false);
const [processingImages, setProcessingImages] = useState(false);
const [errors, setErrors] = useState<Record<string, string>>({});
const [duplicateWarning, setDuplicateWarning] = useState<{
show: boolean;
count: number;
duplicates: Array<{
id: string;
title: string;
authorName: string;
createdAt: string;
}>;
}>({ show: false, count: 0, duplicates: [] });
const [checkingDuplicates, setCheckingDuplicates] = useState(false);
const router = useRouter();
const searchParams = useSearchParams();
const { isAuthenticated } = useAuth();
// Handle URL parameters
useEffect(() => {
const authorId = searchParams.get('authorId');
const from = searchParams.get('from');
// Pre-fill author if authorId is provided in URL
if (authorId) {
const loadAuthor = async () => {
try {
const author = await authorApi.getAuthor(authorId);
setFormData(prev => ({
...prev,
authorName: author.name,
authorId: author.id
}));
} catch (error) {
console.error('Failed to load author:', error);
}
};
loadAuthor();
}
// Handle URL import data
if (from === 'url-import') {
const title = searchParams.get('title') || '';
const summary = searchParams.get('summary') || '';
const author = searchParams.get('author') || '';
const sourceUrl = searchParams.get('sourceUrl') || '';
const tagsParam = searchParams.get('tags');
const content = searchParams.get('content') || '';
let tags: string[] = [];
try {
tags = tagsParam ? JSON.parse(tagsParam) : [];
} catch (error) {
console.error('Failed to parse tags:', error);
tags = [];
}
setFormData(prev => ({
...prev,
title,
summary,
authorName: author,
authorId: undefined, // Reset author ID when importing from URL
contentHtml: content,
sourceUrl,
tags
}));
// Show success message
setErrors({ success: 'Story data imported successfully! Review and edit as needed before saving.' });
}
}, [searchParams]);
// Load pending story data from bulk combine operation
useEffect(() => {
const fromBulkCombine = searchParams.get('from') === 'bulk-combine';
if (fromBulkCombine) {
const pendingStoryData = localStorage.getItem('pendingStory');
if (pendingStoryData) {
try {
const storyData = JSON.parse(pendingStoryData);
setFormData(prev => ({
...prev,
title: storyData.title || '',
authorName: storyData.author || '',
authorId: undefined, // Reset author ID for bulk combined stories
contentHtml: storyData.content || '',
sourceUrl: storyData.sourceUrl || '',
summary: storyData.summary || '',
tags: storyData.tags || []
}));
// Clear the pending data
localStorage.removeItem('pendingStory');
} catch (error) {
console.error('Failed to load pending story data:', error);
}
}
}
}, [searchParams]);
// Check for duplicates when title and author are both present
useEffect(() => {
const checkDuplicates = async () => {
const title = formData.title.trim();
const authorName = formData.authorName.trim();
// Don't check if user isn't authenticated or if title/author are empty
if (!isAuthenticated || !title || !authorName) {
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
return;
}
// Debounce the check to avoid too many API calls
const timeoutId = setTimeout(async () => {
try {
setCheckingDuplicates(true);
const result = await storyApi.checkDuplicate(title, authorName);
if (result.hasDuplicates) {
setDuplicateWarning({
show: true,
count: result.count,
duplicates: result.duplicates
});
} else {
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
}
} catch (error) {
console.error('Failed to check for duplicates:', error);
// Clear any existing duplicate warnings on error
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
// Don't show error to user as this is just a helpful warning
// Authentication errors will be handled by the API interceptor
} finally {
setCheckingDuplicates(false);
}
}, 500); // 500ms debounce
return () => clearTimeout(timeoutId);
};
checkDuplicates();
}, [formData.title, formData.authorName, isAuthenticated]);
const handleInputChange = (field: string) => (
e: React.ChangeEvent<HTMLInputElement | HTMLTextAreaElement>
) => {
setFormData(prev => ({
...prev,
[field]: e.target.value
}));
// Clear error when user starts typing
if (errors[field]) {
setErrors(prev => ({ ...prev, [field]: '' }));
}
};
const handleContentChange = (html: string) => {
setFormData(prev => ({ ...prev, contentHtml: html }));
if (errors.contentHtml) {
setErrors(prev => ({ ...prev, contentHtml: '' }));
}
};
const handleTagsChange = (tags: string[]) => {
setFormData(prev => ({ ...prev, tags }));
};
const handleAuthorChange = (authorName: string, authorId?: string) => {
setFormData(prev => ({
...prev,
authorName,
authorId: authorId // This will be undefined if creating new author, which clears the existing ID
}));
// Clear error when user changes author
if (errors.authorName) {
setErrors(prev => ({ ...prev, authorName: '' }));
}
};
const handleSeriesChange = (seriesName: string, seriesId?: string) => {
setFormData(prev => ({
...prev,
seriesName,
seriesId: seriesId // This will be undefined if creating new series, which clears the existing ID
}));
// Clear error when user changes series
if (errors.seriesName) {
setErrors(prev => ({ ...prev, seriesName: '' }));
}
};
const validateForm = () => {
const newErrors: Record<string, string> = {};
if (!formData.title.trim()) {
newErrors.title = 'Title is required';
}
if (!formData.authorName.trim()) {
newErrors.authorName = 'Author name is required';
}
if (!formData.contentHtml.trim()) {
newErrors.contentHtml = 'Story content is required';
}
if (formData.seriesName && !formData.volume) {
newErrors.volume = 'Volume number is required when series is specified';
}
if (formData.volume && !formData.seriesName.trim()) {
newErrors.seriesName = 'Series name is required when volume is specified';
}
setErrors(newErrors);
return Object.keys(newErrors).length === 0;
};
// Helper function to detect external images in HTML content
const hasExternalImages = (htmlContent: string): boolean => {
if (!htmlContent) return false;
// Create a temporary DOM element to parse HTML
const tempDiv = document.createElement('div');
tempDiv.innerHTML = htmlContent;
const images = tempDiv.querySelectorAll('img');
for (let i = 0; i < images.length; i++) {
const img = images[i];
const src = img.getAttribute('src');
if (src && (src.startsWith('http://') || src.startsWith('https://'))) {
return true;
}
}
return false;
};
const handleSubmit = async (e: React.FormEvent) => {
e.preventDefault();
if (!validateForm()) {
return;
}
setLoading(true);
try {
// First, create the story with JSON data
const storyData = {
title: formData.title,
summary: formData.summary || undefined,
contentHtml: formData.contentHtml,
sourceUrl: formData.sourceUrl || undefined,
volume: formData.seriesName ? parseInt(formData.volume) : undefined,
// Send seriesId if we have it (existing series), otherwise send seriesName (new series)
...(formData.seriesId ? { seriesId: formData.seriesId } : { seriesName: formData.seriesName || undefined }),
// Send authorId if we have it (existing author), otherwise send authorName (new author)
...(formData.authorId ? { authorId: formData.authorId } : { authorName: formData.authorName }),
tagNames: formData.tags.length > 0 ? formData.tags : undefined,
};
const story = await storyApi.createStory(storyData);
// Process images if there are external images in the content
if (hasExternalImages(formData.contentHtml)) {
try {
setProcessingImages(true);
const imageResult = await storyApi.processContentImages(story.id, formData.contentHtml);
// If images were processed and content was updated, save the updated content
if (imageResult.processedContent !== formData.contentHtml) {
await storyApi.updateStory(story.id, {
title: formData.title,
summary: formData.summary || undefined,
contentHtml: imageResult.processedContent,
sourceUrl: formData.sourceUrl || undefined,
volume: formData.seriesName ? parseInt(formData.volume) : undefined,
...(formData.seriesId ? { seriesId: formData.seriesId } : { seriesName: formData.seriesName || undefined }),
...(formData.authorId ? { authorId: formData.authorId } : { authorName: formData.authorName }),
tagNames: formData.tags.length > 0 ? formData.tags : undefined,
});
// Show success message with image processing info
if (imageResult.downloadedImages.length > 0) {
console.log(`Successfully processed ${imageResult.downloadedImages.length} images`);
}
if (imageResult.warnings && imageResult.warnings.length > 0) {
console.warn('Image processing warnings:', imageResult.warnings);
}
}
} catch (imageError) {
console.error('Failed to process images:', imageError);
// Don't fail the entire operation if image processing fails
// The story was created successfully, just without processed images
} finally {
setProcessingImages(false);
}
}
// If there's a cover image, upload it separately
if (coverImage) {
await storyApi.uploadCover(story.id, coverImage);
}
router.push(`/stories/${story.id}/detail`);
} catch (error: any) {
console.error('Failed to create story:', error);
const errorMessage = error.response?.data?.message || 'Failed to create story';
setErrors({ submit: errorMessage });
} finally {
setLoading(false);
}
};
return (
<ImportLayout
title="Add New Story"
description="Add a story to your personal collection"
>
<Suspense fallback={
<div className="flex items-center justify-center py-20">
<LoadingSpinner size="lg" />
</div>
}>
<AddStoryContent />
</Suspense>
{/* Success Message */}
{errors.success && (
<div className="p-4 bg-green-50 dark:bg-green-900/20 border border-green-200 dark:border-green-800 rounded-lg mb-6">
<p className="text-green-800 dark:text-green-200">{errors.success}</p>
</div>
)}
<form onSubmit={handleSubmit} className="space-y-6">
{/* Title */}
<Input
label="Title *"
value={formData.title}
onChange={handleInputChange('title')}
placeholder="Enter the story title"
error={errors.title}
required
/>
{/* Author Selector */}
<AuthorSelector
label="Author *"
value={formData.authorName}
onChange={handleAuthorChange}
placeholder="Select or enter author name"
error={errors.authorName}
required
/>
{/* Duplicate Warning */}
{duplicateWarning.show && (
<div className="p-4 bg-yellow-50 dark:bg-yellow-900/20 border border-yellow-200 dark:border-yellow-800 rounded-lg">
<div className="flex items-start gap-3">
<div className="text-yellow-600 dark:text-yellow-400 mt-0.5">
</div>
<div>
<h4 className="font-medium text-yellow-800 dark:text-yellow-200">
Potential Duplicate Detected
</h4>
<p className="text-sm text-yellow-700 dark:text-yellow-300 mt-1">
Found {duplicateWarning.count} existing {duplicateWarning.count === 1 ? 'story' : 'stories'} with the same title and author:
</p>
<ul className="mt-2 space-y-1">
{duplicateWarning.duplicates.map((duplicate, index) => (
<li key={duplicate.id} className="text-sm text-yellow-700 dark:text-yellow-300">
<span className="font-medium">{duplicate.title}</span> by {duplicate.authorName}
<span className="text-xs ml-2">
(added {new Date(duplicate.createdAt).toLocaleDateString()})
</span>
</li>
))}
</ul>
<p className="text-xs text-yellow-600 dark:text-yellow-400 mt-2">
You can still create this story if it's different from the existing ones.
</p>
</div>
</div>
</div>
)}
{/* Checking indicator */}
{checkingDuplicates && (
<div className="flex items-center gap-2 text-sm theme-text">
<div className="animate-spin w-4 h-4 border-2 border-theme-accent border-t-transparent rounded-full"></div>
Checking for duplicates...
</div>
)}
{/* Summary */}
<div>
<label className="block text-sm font-medium theme-header mb-2">
Summary
</label>
<Textarea
value={formData.summary}
onChange={handleInputChange('summary')}
placeholder="Brief summary or description of the story..."
rows={3}
/>
<p className="text-sm theme-text mt-1">
Optional summary that will be displayed on the story detail page
</p>
</div>
{/* Cover Image Upload */}
<div>
<label className="block text-sm font-medium theme-header mb-2">
Cover Image
</label>
<ImageUpload
onImageSelect={setCoverImage}
accept="image/jpeg,image/png"
maxSizeMB={5}
aspectRatio="3:4"
placeholder="Drop a cover image here or click to select"
/>
</div>
{/* Content */}
<div>
<label className="block text-sm font-medium theme-header mb-2">
Story Content *
</label>
<RichTextEditor
value={formData.contentHtml}
onChange={handleContentChange}
placeholder="Write or paste your story content here..."
error={errors.contentHtml}
enableImageProcessing={false}
/>
<p className="text-sm theme-text mt-2">
💡 <strong>Tip:</strong> If you paste content with images, they'll be automatically downloaded and stored locally when you save the story.
</p>
</div>
{/* Tags */}
<div>
<label className="block text-sm font-medium theme-header mb-2">
Tags
</label>
<TagInput
tags={formData.tags}
onChange={handleTagsChange}
placeholder="Add tags to categorize your story..."
/>
</div>
{/* Series and Volume */}
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
<SeriesSelector
label="Series (optional)"
value={formData.seriesName}
onChange={handleSeriesChange}
placeholder="Select or enter series name if part of a series"
error={errors.seriesName}
authorId={formData.authorId}
/>
<Input
label="Volume/Part (optional)"
type="number"
min="1"
value={formData.volume}
onChange={handleInputChange('volume')}
placeholder="Enter volume/part number"
error={errors.volume}
/>
</div>
{/* Source URL */}
<Input
label="Source URL (optional)"
type="url"
value={formData.sourceUrl}
onChange={handleInputChange('sourceUrl')}
placeholder="https://example.com/original-story-url"
/>
{/* Image Processing Indicator */}
{processingImages && (
<div className="p-4 bg-blue-50 dark:bg-blue-900/20 border border-blue-200 dark:border-blue-800 rounded-lg">
<div className="flex items-center gap-3">
<div className="animate-spin w-4 h-4 border-2 border-blue-500 border-t-transparent rounded-full"></div>
<p className="text-blue-800 dark:text-blue-200">
Processing and downloading images...
</p>
</div>
</div>
)}
{/* Submit Error */}
{errors.submit && (
<div className="p-4 bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-lg">
<p className="text-red-800 dark:text-red-200">{errors.submit}</p>
</div>
)}
{/* Actions */}
<div className="flex justify-end gap-4 pt-6">
<Button
type="button"
variant="ghost"
onClick={() => router.back()}
disabled={loading}
>
Cancel
</Button>
<Button
type="submit"
loading={loading}
disabled={!formData.title || !formData.authorName || !formData.contentHtml}
>
{processingImages ? 'Processing Images...' : 'Add Story'}
</Button>
</div>
</form>
</ImportLayout>
);
}

View File

@@ -35,7 +35,10 @@ export default function AuthorsPage() {
} else {
setSearchLoading(true);
}
const searchResults = await authorApi.getAuthors({
// Use Solr search for all queries (including empty search)
const searchResults = await authorApi.searchAuthors({
query: searchQuery || '*', // Use '*' for all authors when no search query
page: currentPage,
size: ITEMS_PER_PAGE,
sortBy: sortBy,
@@ -44,21 +47,19 @@ export default function AuthorsPage() {
if (currentPage === 0) {
// First page - replace all results
setAuthors(searchResults.content || []);
setAuthors(searchResults.results || []);
setFilteredAuthors(searchResults.content || []);
setFilteredAuthors(searchResults.results || []);
} else {
// Subsequent pages - append results
setAuthors(prev => [...prev, ...(searchResults.content || [])]);
setAuthors(prev => [...prev, ...(searchResults.results || [])]);
setFilteredAuthors(prev => [...prev, ...(searchResults.content || [])]);
setFilteredAuthors(prev => [...prev, ...(searchResults.results || [])]);
}
setTotalHits(searchResults.totalElements || 0);
setTotalHits(searchResults.totalHits || 0);
setHasMore(searchResults.content.length === ITEMS_PER_PAGE && (currentPage + 1) * ITEMS_PER_PAGE < (searchResults.totalElements || 0));
setHasMore((searchResults.results || []).length === ITEMS_PER_PAGE && (currentPage + 1) * ITEMS_PER_PAGE < (searchResults.totalHits || 0));
} catch (error) {
console.error('Failed to load authors:', error);
console.error('Failed to search authors:', error);
// Error handling for API failures
console.error('Failed to load authors:', error);
} finally {
setLoading(false);
setSearchLoading(false);
@@ -84,17 +85,7 @@ export default function AuthorsPage() {
}
};
// Client-side filtering for search query when using regular API
// No longer needed - Solr search handles filtering directly
useEffect(() => {
if (searchQuery) {
const filtered = authors.filter(author =>
author.name.toLowerCase().includes(searchQuery.toLowerCase())
);
setFilteredAuthors(filtered);
} else {
setFilteredAuthors(authors);
}
}, [authors, searchQuery]);
// Note: We no longer have individual story ratings in the author list
// Average rating would need to be calculated on backend if needed
@@ -117,9 +108,8 @@ export default function AuthorsPage() {
<div>
<h1 className="text-3xl font-bold theme-header">Authors</h1>
<p className="theme-text mt-1">
{searchQuery ? `${filteredAuthors.length} of ${authors.length}` : filteredAuthors.length} {(searchQuery ? authors.length : filteredAuthors.length) === 1 ? 'author' : 'authors'}
{searchQuery ? `${totalHits} authors found` : `${totalHits} authors in your library`}
{searchQuery ? ` found` : ` in your library`}
{hasMore && ` (showing first ${filteredAuthors.length})`}
{!searchQuery && hasMore && ` (showing first ${filteredAuthors.length})`}
</p>
</div>
@@ -226,7 +216,7 @@ export default function AuthorsPage() {
className="px-8 py-3"
loading={loading}
>
{loading ? 'Loading...' : `Load More Authors (${totalHits - authors.length} remaining)`}
{loading ? 'Loading...' : `Load More Authors (${totalHits - filteredAuthors.length} remaining)`}
</Button>
</div>
)}
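The corrected `hasMore` line in the diff above combines two conditions: the last fetch returned a full page, and the total hit count extends past where the next page would start. Extracted as a small sketch (assuming the 0-based page numbering and `ITEMS_PER_PAGE` of the surrounding code):

```typescript
// Sketch of the has-more check above: only offer "Load More" when the last
// page came back full AND the next page's starting offset is still inside
// the total hit count. Page numbers are 0-based.
const ITEMS_PER_PAGE = 20;

function hasMorePages(fetchedCount: number, currentPage: number, totalHits: number): boolean {
  return fetchedCount === ITEMS_PER_PAGE &&
    (currentPage + 1) * ITEMS_PER_PAGE < totalHits;
}
```

Requiring a full page guards against a backend that reports a stale `totalHits` while returning fewer results than expected.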

View File

@@ -139,6 +139,15 @@
@apply max-w-full h-auto mx-auto my-6 rounded-lg shadow-sm;
max-height: 80vh; /* Prevent images from being too tall */
display: block;
/* Optimize for performance and prevent reloading */
will-change: auto;
transform: translateZ(0); /* Force hardware acceleration */
backface-visibility: hidden;
image-rendering: optimizeQuality;
/* Prevent layout shifts that might trigger reloads */
box-sizing: border-box;
/* Ensure stable dimensions */
min-height: 1px;
}
.reading-content img[align="left"] {

View File

@@ -0,0 +1,341 @@
'use client';
import { useState, useEffect } from 'react';
import { useRouter, useSearchParams } from 'next/navigation';
import { searchApi, storyApi, tagApi } from '../../lib/api';
import { Story, Tag, FacetCount, AdvancedFilters } from '../../types/api';
import { Input } from '../../components/ui/Input';
import Button from '../../components/ui/Button';
import StoryMultiSelect from '../../components/stories/StoryMultiSelect';
import TagFilter from '../../components/stories/TagFilter';
import LoadingSpinner from '../../components/ui/LoadingSpinner';
import SidebarLayout from '../../components/library/SidebarLayout';
import ToolbarLayout from '../../components/library/ToolbarLayout';
import MinimalLayout from '../../components/library/MinimalLayout';
import { useLibraryLayout } from '../../hooks/useLibraryLayout';
type ViewMode = 'grid' | 'list';
type SortOption = 'createdAt' | 'title' | 'authorName' | 'rating' | 'wordCount' | 'lastReadAt';
export default function LibraryContent() {
const router = useRouter();
const searchParams = useSearchParams();
const { layout } = useLibraryLayout();
const [stories, setStories] = useState<Story[]>([]);
const [tags, setTags] = useState<Tag[]>([]);
const [loading, setLoading] = useState(false);
const [searchLoading, setSearchLoading] = useState(false);
const [randomLoading, setRandomLoading] = useState(false);
const [searchQuery, setSearchQuery] = useState('');
const [selectedTags, setSelectedTags] = useState<string[]>([]);
const [viewMode, setViewMode] = useState<ViewMode>('list');
const [sortOption, setSortOption] = useState<SortOption>('lastReadAt');
const [sortDirection, setSortDirection] = useState<'asc' | 'desc'>('desc');
const [page, setPage] = useState(0);
const [totalPages, setTotalPages] = useState(1);
const [totalElements, setTotalElements] = useState(0);
const [refreshTrigger, setRefreshTrigger] = useState(0);
const [urlParamsProcessed, setUrlParamsProcessed] = useState(false);
const [advancedFilters, setAdvancedFilters] = useState<AdvancedFilters>({});
// Initialize filters from URL parameters
useEffect(() => {
const tagsParam = searchParams.get('tags');
if (tagsParam) {
console.log('URL tag filter detected:', tagsParam);
// Use functional updates to ensure all state changes happen together
setSelectedTags([tagsParam]);
setPage(0); // Reset to first page when applying URL filter
}
setUrlParamsProcessed(true);
}, [searchParams]);
// Convert facet counts to Tag objects for the UI, enriched with full tag data
const [fullTags, setFullTags] = useState<Tag[]>([]);
// Fetch full tag data for enrichment
useEffect(() => {
const fetchFullTags = async () => {
try {
const result = await tagApi.getTags({ size: 1000 }); // Get all tags
setFullTags(result.content || []);
} catch (error) {
console.error('Failed to fetch full tag data:', error);
setFullTags([]);
}
};
fetchFullTags();
}, []);
const convertFacetsToTags = (facets?: Record<string, FacetCount[]>): Tag[] => {
if (!facets || !facets.tagNames_facet) {
return [];
}
return facets.tagNames_facet.map(facet => {
// Find the full tag data by name
const fullTag = fullTags.find(tag => tag.name.toLowerCase() === facet.value.toLowerCase());
return {
id: fullTag?.id || facet.value, // Use actual ID if available, fallback to name
name: facet.value,
storyCount: facet.count,
// Include color and other metadata from the full tag data
color: fullTag?.color,
description: fullTag?.description,
aliasCount: fullTag?.aliasCount,
createdAt: fullTag?.createdAt,
aliases: fullTag?.aliases
};
});
};
// Enrich existing tags when fullTags are loaded
useEffect(() => {
if (fullTags.length > 0) {
// Use functional update to get the current tags state
setTags(currentTags => {
if (currentTags.length > 0) {
// Check if tags already have color data to avoid infinite loops
const hasColors = currentTags.some(tag => tag.color);
if (!hasColors) {
// Re-enrich existing tags with color data
return currentTags.map(tag => {
const fullTag = fullTags.find(ft => ft.name.toLowerCase() === tag.name.toLowerCase());
return {
...tag,
color: fullTag?.color,
description: fullTag?.description,
aliasCount: fullTag?.aliasCount,
createdAt: fullTag?.createdAt,
aliases: fullTag?.aliases,
id: fullTag?.id || tag.id
};
});
}
}
return currentTags; // Return unchanged if no enrichment needed
});
}
}, [fullTags]); // Only run when fullTags change
// Debounce search to avoid too many API calls
useEffect(() => {
// Don't run search until URL parameters have been processed
if (!urlParamsProcessed) return;
const debounceTimer = setTimeout(() => {
const performSearch = async () => {
try {
// Use searchLoading for background search, loading only for initial load
const isInitialLoad = stories.length === 0 && !searchQuery;
if (isInitialLoad) {
setLoading(true);
} else {
setSearchLoading(true);
}
// Always use search API for consistency - use '*' for match-all when no query
const apiParams = {
query: searchQuery.trim() || '*',
page: page, // Use 0-based pagination consistently
size: 20,
tags: selectedTags.length > 0 ? selectedTags : undefined,
sortBy: sortOption,
sortDir: sortDirection,
facetBy: ['tagNames'], // Request tag facets for the filter UI
// Advanced filters
...advancedFilters
};
console.log('Performing search with params:', apiParams);
const result = await searchApi.search(apiParams);
const currentStories = result?.results || [];
setStories(currentStories);
setTotalPages(Math.ceil((result?.totalHits || 0) / 20));
setTotalElements(result?.totalHits || 0);
// Update tags from facets - these represent all matching stories, not just current page
const resultTags = convertFacetsToTags(result?.facets);
setTags(resultTags);
} catch (error) {
console.error('Failed to load stories:', error);
setStories([]);
setTags([]);
} finally {
setLoading(false);
setSearchLoading(false);
}
};
performSearch();
}, searchQuery ? 500 : 0); // Debounce search queries, but load immediately for filters/pagination
return () => clearTimeout(debounceTimer);
}, [searchQuery, selectedTags, sortOption, sortDirection, page, refreshTrigger, urlParamsProcessed, advancedFilters]);
const handleSearchChange = (e: React.ChangeEvent<HTMLInputElement>) => {
setSearchQuery(e.target.value);
setPage(0);
};
const handleStoryUpdate = () => {
setRefreshTrigger(prev => prev + 1);
};
const handleRandomStory = async () => {
if (totalElements === 0) return;
try {
setRandomLoading(true);
const randomStory = await storyApi.getRandomStory({
searchQuery: searchQuery || undefined,
tags: selectedTags.length > 0 ? selectedTags : undefined,
...advancedFilters
});
if (randomStory) {
router.push(`/stories/${randomStory.id}`);
} else {
alert('No stories available. Please add some stories first.');
}
} catch (error) {
console.error('Failed to get random story:', error);
alert('Failed to get a random story. Please try again.');
} finally {
setRandomLoading(false);
}
};
const clearFilters = () => {
setSearchQuery('');
setSelectedTags([]);
setAdvancedFilters({});
setPage(0);
setRefreshTrigger(prev => prev + 1);
};
const handleTagToggle = (tagName: string) => {
setSelectedTags(prev =>
prev.includes(tagName)
? prev.filter(t => t !== tagName)
: [...prev, tagName]
);
setPage(0);
setRefreshTrigger(prev => prev + 1);
};
const handleSortDirectionToggle = () => {
setSortDirection(prev => prev === 'asc' ? 'desc' : 'asc');
};
const handleAdvancedFiltersChange = (filters: AdvancedFilters) => {
setAdvancedFilters(filters);
setPage(0);
setRefreshTrigger(prev => prev + 1);
};
if (loading) {
return (
<div className="flex items-center justify-center py-20">
<LoadingSpinner size="lg" />
</div>
);
}
const handleSortChange = (option: string) => {
setSortOption(option as SortOption);
};
const layoutProps = {
stories,
tags,
totalElements,
searchQuery,
selectedTags,
viewMode,
sortOption,
sortDirection,
advancedFilters,
onSearchChange: handleSearchChange,
onTagToggle: handleTagToggle,
onViewModeChange: setViewMode,
onSortChange: handleSortChange,
onSortDirectionToggle: handleSortDirectionToggle,
onAdvancedFiltersChange: handleAdvancedFiltersChange,
onRandomStory: handleRandomStory,
onClearFilters: clearFilters,
};
const renderContent = () => {
if (stories.length === 0 && !loading) {
return (
<div className="text-center py-12 theme-card theme-shadow rounded-lg">
<p className="theme-text text-lg mb-4">
{searchQuery || selectedTags.length > 0 || Object.values(advancedFilters).some(v => v !== undefined && v !== '' && v !== 'all' && v !== false)
? 'No stories match your search criteria.'
: 'Your library is empty.'
}
</p>
{searchQuery || selectedTags.length > 0 || Object.values(advancedFilters).some(v => v !== undefined && v !== '' && v !== 'all' && v !== false) ? (
<Button variant="ghost" onClick={clearFilters}>
Clear Filters
</Button>
) : (
<Button href="/add-story">
Add Your First Story
</Button>
)}
</div>
);
}
return (
<>
<StoryMultiSelect
stories={stories}
viewMode={viewMode}
onUpdate={handleStoryUpdate}
allowMultiSelect={true}
/>
{/* Pagination */}
{totalPages > 1 && (
<div className="flex justify-center gap-2 mt-8">
<Button
variant="ghost"
onClick={() => setPage(page - 1)}
disabled={page === 0}
>
Previous
</Button>
<span className="flex items-center px-4 py-2 theme-text">
Page {page + 1} of {totalPages}
</span>
<Button
variant="ghost"
onClick={() => setPage(page + 1)}
disabled={page >= totalPages - 1}
>
Next
</Button>
</div>
)}
</>
);
};
const LayoutComponent = layout === 'sidebar' ? SidebarLayout :
layout === 'toolbar' ? ToolbarLayout :
MinimalLayout;
return (
<LayoutComponent {...layoutProps}>
{renderContent()}
</LayoutComponent>
);
}
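`convertFacetsToTags` in LibraryContent above turns Solr facet counts into tag objects for the filter UI, enriching each one with stored tag metadata via a case-insensitive name match. A trimmed standalone version — with the types reduced to just the fields this sketch uses, rather than the app's full `Tag` and `FacetCount` types:

```typescript
// Trimmed sketch of the facet-to-tag enrichment in LibraryContent above.
// Field sets are reduced for illustration; the real Tag type carries more.
interface FacetCount { value: string; count: number }
interface TagLike { id: string; name: string; color?: string; storyCount?: number }

function convertFacetsToTags(
  facets: Record<string, FacetCount[]> | undefined,
  fullTags: TagLike[],
): TagLike[] {
  const tagFacets = facets?.tagNames_facet;
  if (!tagFacets) return [];
  return tagFacets.map(facet => {
    // Case-insensitive match against stored tags, as in the component.
    const full = fullTags.find(t => t.name.toLowerCase() === facet.value.toLowerCase());
    return {
      id: full?.id ?? facet.value, // fall back to the facet value as an id
      name: facet.value,
      storyCount: facet.count,
      color: full?.color,
    };
  });
}
```

The fallback id means a facet for a tag that is not (yet) in the tag store still renders, just without color or description.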

View File

@@ -1,346 +1,20 @@
'use client';
import { useState, useEffect } from 'react';
import { Suspense } from 'react';
import { useRouter, useSearchParams } from 'next/navigation';
import { searchApi, storyApi, tagApi } from '../../lib/api';
import { Story, Tag, FacetCount, AdvancedFilters } from '../../types/api';
import AppLayout from '../../components/layout/AppLayout'; import AppLayout from '../../components/layout/AppLayout';
import { Input } from '../../components/ui/Input';
import Button from '../../components/ui/Button';
import StoryMultiSelect from '../../components/stories/StoryMultiSelect';
import TagFilter from '../../components/stories/TagFilter';
import LoadingSpinner from '../../components/ui/LoadingSpinner'; import LoadingSpinner from '../../components/ui/LoadingSpinner';
import SidebarLayout from '../../components/library/SidebarLayout'; import LibraryContent from './LibraryContent';
import ToolbarLayout from '../../components/library/ToolbarLayout';
import MinimalLayout from '../../components/library/MinimalLayout';
import { useLibraryLayout } from '../../hooks/useLibraryLayout';
type ViewMode = 'grid' | 'list';
type SortOption = 'createdAt' | 'title' | 'authorName' | 'rating' | 'wordCount' | 'lastRead';
export default function LibraryPage() { export default function LibraryPage() {
const router = useRouter(); return (
const searchParams = useSearchParams(); <AppLayout>
const { layout } = useLibraryLayout(); <Suspense fallback={
const [stories, setStories] = useState<Story[]>([]);
const [tags, setTags] = useState<Tag[]>([]);
const [loading, setLoading] = useState(false);
const [searchLoading, setSearchLoading] = useState(false);
const [randomLoading, setRandomLoading] = useState(false);
const [searchQuery, setSearchQuery] = useState('');
const [selectedTags, setSelectedTags] = useState<string[]>([]);
const [viewMode, setViewMode] = useState<ViewMode>('list');
const [sortOption, setSortOption] = useState<SortOption>('lastRead');
const [sortDirection, setSortDirection] = useState<'asc' | 'desc'>('desc');
const [page, setPage] = useState(0);
const [totalPages, setTotalPages] = useState(1);
const [totalElements, setTotalElements] = useState(0);
const [refreshTrigger, setRefreshTrigger] = useState(0);
const [urlParamsProcessed, setUrlParamsProcessed] = useState(false);
const [advancedFilters, setAdvancedFilters] = useState<AdvancedFilters>({});
// Initialize filters from URL parameters
useEffect(() => {
const tagsParam = searchParams.get('tags');
if (tagsParam) {
console.log('URL tag filter detected:', tagsParam);
// Use functional updates to ensure all state changes happen together
setSelectedTags([tagsParam]);
setPage(0); // Reset to first page when applying URL filter
}
setUrlParamsProcessed(true);
}, [searchParams]);
// Convert facet counts to Tag objects for the UI, enriched with full tag data
const [fullTags, setFullTags] = useState<Tag[]>([]);
// Fetch full tag data for enrichment
useEffect(() => {
const fetchFullTags = async () => {
try {
const result = await tagApi.getTags({ size: 1000 }); // Get all tags
setFullTags(result.content || []);
} catch (error) {
console.error('Failed to fetch full tag data:', error);
setFullTags([]);
}
};
fetchFullTags();
}, []);
const convertFacetsToTags = (facets?: Record<string, FacetCount[]>): Tag[] => {
if (!facets || !facets.tagNames) {
return [];
}
return facets.tagNames.map(facet => {
// Find the full tag data by name
const fullTag = fullTags.find(tag => tag.name.toLowerCase() === facet.value.toLowerCase());
return {
id: fullTag?.id || facet.value, // Use actual ID if available, fallback to name
name: facet.value,
storyCount: facet.count,
// Include color and other metadata from the full tag data
color: fullTag?.color,
description: fullTag?.description,
aliasCount: fullTag?.aliasCount,
createdAt: fullTag?.createdAt,
aliases: fullTag?.aliases
};
});
};
// Enrich existing tags when fullTags are loaded
useEffect(() => {
if (fullTags.length > 0) {
// Use functional update to get the current tags state
setTags(currentTags => {
if (currentTags.length > 0) {
// Check if tags already have color data to avoid infinite loops
const hasColors = currentTags.some(tag => tag.color);
if (!hasColors) {
// Re-enrich existing tags with color data
return currentTags.map(tag => {
const fullTag = fullTags.find(ft => ft.name.toLowerCase() === tag.name.toLowerCase());
return {
...tag,
color: fullTag?.color,
description: fullTag?.description,
aliasCount: fullTag?.aliasCount,
createdAt: fullTag?.createdAt,
aliases: fullTag?.aliases,
id: fullTag?.id || tag.id
};
});
}
}
return currentTags; // Return unchanged if no enrichment needed
});
}
}, [fullTags]); // Only run when fullTags change
// Debounce search to avoid too many API calls
useEffect(() => {
// Don't run search until URL parameters have been processed
if (!urlParamsProcessed) return;
const debounceTimer = setTimeout(() => {
const performSearch = async () => {
try {
// Use searchLoading for background search, loading only for initial load
const isInitialLoad = stories.length === 0 && !searchQuery;
if (isInitialLoad) {
setLoading(true);
} else {
setSearchLoading(true);
}
// Always use search API for consistency - use '*' for match-all when no query
const apiParams = {
query: searchQuery.trim() || '*',
page: page, // Use 0-based pagination consistently
size: 20,
tags: selectedTags.length > 0 ? selectedTags : undefined,
sortBy: sortOption,
sortDir: sortDirection,
facetBy: ['tagNames'], // Request tag facets for the filter UI
// Advanced filters
...advancedFilters
};
console.log('Performing search with params:', apiParams);
const result = await searchApi.search(apiParams);
const currentStories = result?.results || [];
setStories(currentStories);
setTotalPages(Math.ceil((result?.totalHits || 0) / 20));
setTotalElements(result?.totalHits || 0);
// Update tags from facets - these represent all matching stories, not just current page
const resultTags = convertFacetsToTags(result?.facets);
setTags(resultTags);
} catch (error) {
console.error('Failed to load stories:', error);
setStories([]);
setTags([]);
} finally {
setLoading(false);
setSearchLoading(false);
}
};
performSearch();
}, searchQuery ? 500 : 0); // Debounce search queries, but load immediately for filters/pagination
return () => clearTimeout(debounceTimer);
}, [searchQuery, selectedTags, sortOption, sortDirection, page, refreshTrigger, urlParamsProcessed, advancedFilters]);
const handleSearchChange = (e: React.ChangeEvent<HTMLInputElement>) => {
setSearchQuery(e.target.value);
setPage(0);
};
const handleStoryUpdate = () => {
setRefreshTrigger(prev => prev + 1);
};
const handleRandomStory = async () => {
if (totalElements === 0) return;
try {
setRandomLoading(true);
const randomStory = await storyApi.getRandomStory({
searchQuery: searchQuery || undefined,
tags: selectedTags.length > 0 ? selectedTags : undefined,
...advancedFilters
});
if (randomStory) {
router.push(`/stories/${randomStory.id}`);
} else {
alert('No stories available. Please add some stories first.');
}
} catch (error) {
console.error('Failed to get random story:', error);
alert('Failed to get a random story. Please try again.');
} finally {
setRandomLoading(false);
}
};
const clearFilters = () => {
setSearchQuery('');
setSelectedTags([]);
setAdvancedFilters({});
setPage(0);
setRefreshTrigger(prev => prev + 1);
};
const handleTagToggle = (tagName: string) => {
setSelectedTags(prev =>
prev.includes(tagName)
? prev.filter(t => t !== tagName)
: [...prev, tagName]
);
setPage(0);
setRefreshTrigger(prev => prev + 1);
};
const handleSortDirectionToggle = () => {
setSortDirection(prev => prev === 'asc' ? 'desc' : 'asc');
};
const handleAdvancedFiltersChange = (filters: AdvancedFilters) => {
setAdvancedFilters(filters);
setPage(0);
setRefreshTrigger(prev => prev + 1);
};
if (loading) {
return (
<AppLayout>
<div className="flex items-center justify-center py-20"> <div className="flex items-center justify-center py-20">
<LoadingSpinner size="lg" /> <LoadingSpinner size="lg" />
</div> </div>
</AppLayout> }>
); <LibraryContent />
} </Suspense>
const handleSortChange = (option: string) => {
setSortOption(option as SortOption);
};
const layoutProps = {
stories,
tags,
totalElements,
searchQuery,
selectedTags,
viewMode,
sortOption,
sortDirection,
advancedFilters,
onSearchChange: handleSearchChange,
onTagToggle: handleTagToggle,
onViewModeChange: setViewMode,
onSortChange: handleSortChange,
onSortDirectionToggle: handleSortDirectionToggle,
onAdvancedFiltersChange: handleAdvancedFiltersChange,
onRandomStory: handleRandomStory,
onClearFilters: clearFilters,
};
const renderContent = () => {
if (stories.length === 0 && !loading) {
return (
<div className="text-center py-12 theme-card theme-shadow rounded-lg">
<p className="theme-text text-lg mb-4">
{searchQuery || selectedTags.length > 0 || Object.values(advancedFilters).some(v => v !== undefined && v !== '' && v !== 'all' && v !== false)
? 'No stories match your search criteria.'
: 'Your library is empty.'
}
</p>
{searchQuery || selectedTags.length > 0 || Object.values(advancedFilters).some(v => v !== undefined && v !== '' && v !== 'all' && v !== false) ? (
<Button variant="ghost" onClick={clearFilters}>
Clear Filters
</Button>
) : (
<Button href="/add-story">
Add Your First Story
</Button>
)}
</div>
);
}
return (
<>
<StoryMultiSelect
stories={stories}
viewMode={viewMode}
onUpdate={handleStoryUpdate}
allowMultiSelect={true}
/>
{/* Pagination */}
{totalPages > 1 && (
<div className="flex justify-center gap-2 mt-8">
<Button
variant="ghost"
onClick={() => setPage(page - 1)}
disabled={page === 0}
>
Previous
</Button>
<span className="flex items-center px-4 py-2 theme-text">
Page {page + 1} of {totalPages}
</span>
<Button
variant="ghost"
onClick={() => setPage(page + 1)}
disabled={page >= totalPages - 1}
>
Next
</Button>
</div>
)}
</>
);
};
const LayoutComponent = layout === 'sidebar' ? SidebarLayout :
layout === 'toolbar' ? ToolbarLayout :
MinimalLayout;
return (
<AppLayout>
<LayoutComponent {...layoutProps}>
{renderContent()}
</LayoutComponent>
</AppLayout> </AppLayout>
); );
} }

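The page component above previously carried the facet-to-tag enrichment that now lives in `LibraryContent`. A trimmed-down sketch of that conversion, with the types reduced to only the fields exercised here (the real `Tag` and `FacetCount` types carry more metadata such as descriptions and aliases):

```typescript
// Simplified sketch of convertFacetsToTags: merge Solr facet counts with
// full tag metadata, matching case-insensitively on the tag name.
interface FacetCount { value: string; count: number; }
interface TagLite { id: string; name: string; color?: string; storyCount?: number; }

function convertFacetsToTags(
  facets: Record<string, FacetCount[]> | undefined,
  fullTags: TagLite[],
): TagLite[] {
  if (!facets?.tagNames) return [];
  return facets.tagNames.map(facet => {
    // Look up the full tag record to enrich the facet with metadata
    const full = fullTags.find(t => t.name.toLowerCase() === facet.value.toLowerCase());
    return {
      id: full?.id ?? facet.value, // fall back to the name when no ID is known
      name: facet.value,
      storyCount: facet.count,
      color: full?.color,
    };
  });
}
```

The name-based fallback ID matters because facets come back from the search index, which only knows tag names, not database IDs.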

@@ -1,27 +1,9 @@
 import { NextRequest } from 'next/server';
+import { progressStore, type ProgressUpdate } from '../../../../lib/progress';
 // Configure route timeout for long-running progress streams
 export const maxDuration = 900; // 15 minutes (900 seconds)
-interface ProgressUpdate {
-  type: 'progress' | 'completed' | 'error';
-  current: number;
-  total: number;
-  message: string;
-  url?: string;
-  title?: string;
-  author?: string;
-  wordCount?: number;
-  totalWordCount?: number;
-  error?: string;
-  combinedStory?: any;
-  results?: any[];
-  summary?: any;
-}
-// Global progress storage (in production, use Redis or database)
-const progressStore = new Map<string, ProgressUpdate[]>();
 export async function GET(request: NextRequest) {
   const searchParams = request.nextUrl.searchParams;
   const sessionId = searchParams.get('sessionId');
@@ -81,13 +63,3 @@ export async function GET(request: NextRequest) {
   });
 }
-// Helper function for other routes to send progress updates
-export function sendProgressUpdate(sessionId: string, update: ProgressUpdate) {
-  if (!progressStore.has(sessionId)) {
-    progressStore.set(sessionId, []);
-  }
-  progressStore.get(sessionId)!.push(update);
-}
-// Export the helper for other modules to use
-export { progressStore };

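The diff above removes the route-local store and imports `progressStore` and `sendProgressUpdate` from a shared `lib/progress` module instead (Next.js route files should only export route handlers, so exporting helpers from `route.ts` was a problem). The new module itself is not shown in the diff; the following is a minimal sketch of what it plausibly contains, inferred from what the routes import:

```typescript
// Sketch of a shared lib/progress module; shape and field list are assumptions
// based on the imports in the diff, trimmed to the required fields.
type ProgressUpdate = {
  type: 'progress' | 'completed' | 'error';
  current: number;
  total: number;
  message: string;
};

// In-memory store keyed by session ID. Single-process assumption:
// production would need Redis or a database, as the removed comment noted.
const progressStore = new Map<string, ProgressUpdate[]>();

function sendProgressUpdate(sessionId: string, update: ProgressUpdate): void {
  const updates = progressStore.get(sessionId) ?? [];
  updates.push(update);
  progressStore.set(sessionId, updates);
}
```

In the real module these would be exported; centralizing them also removes the dynamic `import('./progress/route')` workaround seen in the next diff.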

@@ -4,15 +4,7 @@ import { NextRequest, NextResponse } from 'next/server';
 export const maxDuration = 900; // 15 minutes (900 seconds)
 // Import progress tracking helper
-async function sendProgressUpdate(sessionId: string, update: any) {
+import { sendProgressUpdate } from '../../../lib/progress';
-  try {
-    // Dynamic import to avoid circular dependency
-    const { sendProgressUpdate: sendUpdate } = await import('./progress/route');
-    sendUpdate(sessionId, update);
-  } catch (error) {
-    console.warn('Failed to send progress update:', error);
-  }
-}
 interface BulkImportRequest {
   urls: string[];
@@ -501,11 +493,11 @@ async function processIndividualMode(
   console.log(`Bulk import completed: ${importedCount} imported, ${skippedCount} skipped, ${errorCount} errors`);
-  // Trigger OpenSearch reindex if any stories were imported
+  // Trigger Solr reindex if any stories were imported
   if (importedCount > 0) {
     try {
-      console.log('Triggering OpenSearch reindex after bulk import...');
+      console.log('Triggering Solr reindex after bulk import...');
-      const reindexUrl = `http://backend:8080/api/admin/search/opensearch/reindex`;
+      const reindexUrl = `http://backend:8080/api/admin/search/solr/reindex`;
       const reindexResponse = await fetch(reindexUrl, {
         method: 'POST',
         headers: {
@@ -516,12 +508,12 @@ async function processIndividualMode(
       if (reindexResponse.ok) {
         const reindexResult = await reindexResponse.json();
-        console.log('OpenSearch reindex completed:', reindexResult);
+        console.log('Solr reindex completed:', reindexResult);
       } else {
-        console.warn('OpenSearch reindex failed:', reindexResponse.status);
+        console.warn('Solr reindex failed:', reindexResponse.status);
       }
     } catch (error) {
-      console.warn('Failed to trigger OpenSearch reindex:', error);
+      console.warn('Failed to trigger Solr reindex:', error);
       // Don't fail the whole request if reindex fails
     }
   }

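The reindex call above is deliberately fire-and-forget: a failed reindex is logged but never fails the import request. That pattern can be sketched with the fetch implementation injected as a parameter so it is testable without a live backend; the `doFetch` seam is purely illustrative and not part of the codebase:

```typescript
// Sketch of the "best effort" Solr reindex trigger from the diff above.
// The URL mirrors the diff; the injected doFetch is a testing seam (assumption).
async function triggerSolrReindex(
  doFetch: (url: string, init: { method: string }) => Promise<{ ok: boolean; status: number }>,
): Promise<boolean> {
  try {
    const res = await doFetch('http://backend:8080/api/admin/search/solr/reindex', { method: 'POST' });
    if (!res.ok) {
      console.warn('Solr reindex failed:', res.status);
    }
    return res.ok;
  } catch (error) {
    // Don't fail the whole request if reindex fails
    console.warn('Failed to trigger Solr reindex:', error);
    return false;
  }
}
```

In production code the caller would pass the global `fetch` and ignore the boolean, matching the diff's behavior.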

@@ -0,0 +1,183 @@
'use client';
import { useState, useEffect } from 'react';
import { useRouter, useSearchParams } from 'next/navigation';
import TabNavigation from '../../components/ui/TabNavigation';
import AppearanceSettings from '../../components/settings/AppearanceSettings';
import ContentSettings from '../../components/settings/ContentSettings';
import SystemSettings from '../../components/settings/SystemSettings';
import Button from '../../components/ui/Button';
import { useTheme } from '../../lib/theme';
type FontFamily = 'serif' | 'sans' | 'mono';
type FontSize = 'small' | 'medium' | 'large' | 'extra-large';
type ReadingWidth = 'narrow' | 'medium' | 'wide';
interface Settings {
theme: 'light' | 'dark';
fontFamily: FontFamily;
fontSize: FontSize;
readingWidth: ReadingWidth;
readingSpeed: number; // words per minute
}
const defaultSettings: Settings = {
theme: 'light',
fontFamily: 'serif',
fontSize: 'medium',
readingWidth: 'medium',
readingSpeed: 200,
};
const tabs = [
{ id: 'appearance', label: 'Appearance', icon: '🎨' },
{ id: 'content', label: 'Content', icon: '🏷️' },
{ id: 'system', label: 'System', icon: '🔧' },
];
export default function SettingsContent() {
const router = useRouter();
const searchParams = useSearchParams();
const { theme, setTheme } = useTheme();
const [settings, setSettings] = useState<Settings>(defaultSettings);
const [saved, setSaved] = useState(false);
const [activeTab, setActiveTab] = useState('appearance');
// Initialize tab from URL parameter
useEffect(() => {
const tabFromUrl = searchParams.get('tab');
if (tabFromUrl && tabs.some(tab => tab.id === tabFromUrl)) {
setActiveTab(tabFromUrl);
}
}, [searchParams]);
// Load settings from localStorage on mount
useEffect(() => {
const savedSettings = localStorage.getItem('storycove-settings');
if (savedSettings) {
try {
const parsed = JSON.parse(savedSettings);
setSettings({ ...defaultSettings, ...parsed, theme });
} catch (error) {
console.error('Failed to parse saved settings:', error);
setSettings({ ...defaultSettings, theme });
}
} else {
setSettings({ ...defaultSettings, theme });
}
}, [theme]);
// Update URL when tab changes
const handleTabChange = (tabId: string) => {
setActiveTab(tabId);
const newUrl = `/settings?tab=${tabId}`;
router.replace(newUrl, { scroll: false });
};
// Save settings to localStorage
const saveSettings = () => {
localStorage.setItem('storycove-settings', JSON.stringify(settings));
// Apply theme change
setTheme(settings.theme);
// Apply font settings to CSS custom properties
const root = document.documentElement;
const fontFamilyMap = {
serif: 'Georgia, Times, serif',
sans: 'Inter, system-ui, sans-serif',
mono: 'Monaco, Consolas, monospace',
};
const fontSizeMap = {
small: '14px',
medium: '16px',
large: '18px',
'extra-large': '20px',
};
const readingWidthMap = {
narrow: '600px',
medium: '800px',
wide: '1000px',
};
root.style.setProperty('--reading-font-family', fontFamilyMap[settings.fontFamily]);
root.style.setProperty('--reading-font-size', fontSizeMap[settings.fontSize]);
root.style.setProperty('--reading-max-width', readingWidthMap[settings.readingWidth]);
setSaved(true);
setTimeout(() => setSaved(false), 2000);
};
const updateSetting = <K extends keyof Settings>(key: K, value: Settings[K]) => {
setSettings(prev => ({ ...prev, [key]: value }));
};
const resetToDefaults = () => {
setSettings({ ...defaultSettings, theme });
};
const renderTabContent = () => {
switch (activeTab) {
case 'appearance':
return (
<AppearanceSettings
settings={settings}
onSettingChange={updateSetting}
/>
);
case 'content':
return <ContentSettings />;
case 'system':
return <SystemSettings />;
default:
return <AppearanceSettings settings={settings} onSettingChange={updateSetting} />;
}
};
return (
<div className="max-w-4xl mx-auto space-y-6">
{/* Header */}
<div>
<h1 className="text-3xl font-bold theme-header">Settings</h1>
<p className="theme-text mt-2">
Customize your StoryCove experience and manage system settings
</p>
</div>
{/* Tab Navigation */}
<TabNavigation
tabs={tabs}
activeTab={activeTab}
onTabChange={handleTabChange}
className="mb-6"
/>
{/* Tab Content */}
<div className="min-h-[400px]">
{renderTabContent()}
</div>
{/* Save Actions - Only show for Appearance tab */}
{activeTab === 'appearance' && (
<div className="flex justify-end gap-4 pt-6 border-t theme-border">
<Button
variant="ghost"
onClick={resetToDefaults}
>
Reset to Defaults
</Button>
<Button
onClick={saveSettings}
className={saved ? 'bg-green-600 hover:bg-green-700' : ''}
>
{saved ? '✓ Saved!' : 'Save Settings'}
</Button>
</div>
)}
</div>
);
}

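`saveSettings` in the new `SettingsContent` above maps the saved preferences onto CSS custom properties via `root.style.setProperty`. The mapping itself is pure and can be separated from the DOM; the helper name below is illustrative, while the map values are taken from the component:

```typescript
// Pure sketch of the CSS-variable mapping performed in saveSettings.
type FontFamily = 'serif' | 'sans' | 'mono';
type FontSize = 'small' | 'medium' | 'large' | 'extra-large';
type ReadingWidth = 'narrow' | 'medium' | 'wide';

function readingCssVars(s: { fontFamily: FontFamily; fontSize: FontSize; readingWidth: ReadingWidth }) {
  const fontFamilyMap: Record<FontFamily, string> = {
    serif: 'Georgia, Times, serif',
    sans: 'Inter, system-ui, sans-serif',
    mono: 'Monaco, Consolas, monospace',
  };
  const fontSizeMap: Record<FontSize, string> = {
    small: '14px', medium: '16px', large: '18px', 'extra-large': '20px',
  };
  const readingWidthMap: Record<ReadingWidth, string> = {
    narrow: '600px', medium: '800px', wide: '1000px',
  };
  return {
    '--reading-font-family': fontFamilyMap[s.fontFamily],
    '--reading-font-size': fontSizeMap[s.fontSize],
    '--reading-max-width': readingWidthMap[s.readingWidth],
  };
}
```

A caller would then loop over `Object.entries(readingCssVars(settings))` and apply each pair with `document.documentElement.style.setProperty(name, value)`.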

@@ -1,186 +1,20 @@
 'use client';
-import { useState, useEffect } from 'react';
+import { Suspense } from 'react';
-import { useRouter, useSearchParams } from 'next/navigation';
 import AppLayout from '../../components/layout/AppLayout';
-import TabNavigation from '../../components/ui/TabNavigation';
+import LoadingSpinner from '../../components/ui/LoadingSpinner';
-import AppearanceSettings from '../../components/settings/AppearanceSettings';
+import SettingsContent from './SettingsContent';
-import ContentSettings from '../../components/settings/ContentSettings';
-import SystemSettings from '../../components/settings/SystemSettings';
-import Button from '../../components/ui/Button';
-import { useTheme } from '../../lib/theme';
-type FontFamily = 'serif' | 'sans' | 'mono';
-type FontSize = 'small' | 'medium' | 'large' | 'extra-large';
-type ReadingWidth = 'narrow' | 'medium' | 'wide';
-interface Settings {
-  theme: 'light' | 'dark';
-  fontFamily: FontFamily;
-  fontSize: FontSize;
-  readingWidth: ReadingWidth;
-  readingSpeed: number; // words per minute
-}
-const defaultSettings: Settings = {
-  theme: 'light',
-  fontFamily: 'serif',
-  fontSize: 'medium',
-  readingWidth: 'medium',
-  readingSpeed: 200,
-};
-const tabs = [
-  { id: 'appearance', label: 'Appearance', icon: '🎨' },
-  { id: 'content', label: 'Content', icon: '🏷️' },
-  { id: 'system', label: 'System', icon: '🔧' },
-];
 export default function SettingsPage() {
-  const router = useRouter();
-  const searchParams = useSearchParams();
-  const { theme, setTheme } = useTheme();
-  const [settings, setSettings] = useState<Settings>(defaultSettings);
-  const [saved, setSaved] = useState(false);
-  const [activeTab, setActiveTab] = useState('appearance');
-  // Initialize tab from URL parameter
-  useEffect(() => {
-    const tabFromUrl = searchParams.get('tab');
-    if (tabFromUrl && tabs.some(tab => tab.id === tabFromUrl)) {
-      setActiveTab(tabFromUrl);
-    }
-  }, [searchParams]);
-  // Load settings from localStorage on mount
-  useEffect(() => {
-    const savedSettings = localStorage.getItem('storycove-settings');
-    if (savedSettings) {
-      try {
-        const parsed = JSON.parse(savedSettings);
-        setSettings({ ...defaultSettings, ...parsed, theme });
-      } catch (error) {
-        console.error('Failed to parse saved settings:', error);
-        setSettings({ ...defaultSettings, theme });
-      }
-    } else {
-      setSettings({ ...defaultSettings, theme });
-    }
-  }, [theme]);
-  // Update URL when tab changes
-  const handleTabChange = (tabId: string) => {
-    setActiveTab(tabId);
-    const newUrl = `/settings?tab=${tabId}`;
-    router.replace(newUrl, { scroll: false });
-  };
-  // Save settings to localStorage
-  const saveSettings = () => {
-    localStorage.setItem('storycove-settings', JSON.stringify(settings));
-    // Apply theme change
-    setTheme(settings.theme);
-    // Apply font settings to CSS custom properties
-    const root = document.documentElement;
-    const fontFamilyMap = {
-      serif: 'Georgia, Times, serif',
-      sans: 'Inter, system-ui, sans-serif',
-      mono: 'Monaco, Consolas, monospace',
-    };
-    const fontSizeMap = {
-      small: '14px',
-      medium: '16px',
-      large: '18px',
-      'extra-large': '20px',
-    };
-    const readingWidthMap = {
-      narrow: '600px',
-      medium: '800px',
-      wide: '1000px',
-    };
-    root.style.setProperty('--reading-font-family', fontFamilyMap[settings.fontFamily]);
-    root.style.setProperty('--reading-font-size', fontSizeMap[settings.fontSize]);
-    root.style.setProperty('--reading-max-width', readingWidthMap[settings.readingWidth]);
-    setSaved(true);
-    setTimeout(() => setSaved(false), 2000);
-  };
-  const updateSetting = <K extends keyof Settings>(key: K, value: Settings[K]) => {
-    setSettings(prev => ({ ...prev, [key]: value }));
-  };
-  const resetToDefaults = () => {
-    setSettings({ ...defaultSettings, theme });
-  };
-  const renderTabContent = () => {
-    switch (activeTab) {
-      case 'appearance':
-        return (
-          <AppearanceSettings
-            settings={settings}
-            onSettingChange={updateSetting}
-          />
-        );
-      case 'content':
-        return <ContentSettings />;
-      case 'system':
-        return <SystemSettings />;
-      default:
-        return <AppearanceSettings settings={settings} onSettingChange={updateSetting} />;
-    }
-  };
   return (
     <AppLayout>
-      <div className="max-w-4xl mx-auto space-y-6">
+      <Suspense fallback={
-        {/* Header */}
+        <div className="flex items-center justify-center py-20">
-        <div>
+          <LoadingSpinner size="lg" />
-          <h1 className="text-3xl font-bold theme-header">Settings</h1>
-          <p className="theme-text mt-2">
-            Customize your StoryCove experience and manage system settings
-          </p>
         </div>
+      }>
-        {/* Tab Navigation */}
+        <SettingsContent />
-        <TabNavigation
+      </Suspense>
-          tabs={tabs}
-          activeTab={activeTab}
-          onTabChange={handleTabChange}
-          className="mb-6"
-        />
-        {/* Tab Content */}
-        <div className="min-h-[400px]">
-          {renderTabContent()}
-        </div>
-        {/* Save Actions - Only show for Appearance tab */}
-        {activeTab === 'appearance' && (
-          <div className="flex justify-end gap-4 pt-6 border-t theme-border">
-            <Button
-              variant="ghost"
-              onClick={resetToDefaults}
-            >
-              Reset to Defaults
-            </Button>
-            <Button
-              onClick={saveSettings}
-              className={saved ? 'bg-green-600 hover:bg-green-700' : ''}
-            >
-              {saved ? '✓ Saved!' : 'Save Settings'}
-            </Button>
-          </div>
-        )}
-      </div>
     </AppLayout>
   );
 }


@@ -7,7 +7,7 @@ import { Input, Textarea } from '../../../../components/ui/Input';
 import Button from '../../../../components/ui/Button';
 import TagInput from '../../../../components/stories/TagInput';
 import TagSuggestions from '../../../../components/tags/TagSuggestions';
-import RichTextEditor from '../../../../components/stories/RichTextEditor';
+import SlateEditor from '../../../../components/stories/SlateEditor';
 import ImageUpload from '../../../../components/ui/ImageUpload';
 import AuthorSelector from '../../../../components/stories/AuthorSelector';
 import SeriesSelector from '../../../../components/stories/SeriesSelector';
@@ -337,7 +337,7 @@ export default function EditStoryPage() {
   <label className="block text-sm font-medium theme-header mb-2">
     Story Content *
   </label>
-  <RichTextEditor
+  <SlateEditor
     value={formData.contentHtml}
     onChange={handleContentChange}
     placeholder="Edit your story content here..."


@@ -1,6 +1,6 @@
'use client'; 'use client';
import { useState, useEffect, useRef, useCallback } from 'react'; import { useState, useEffect, useRef, useCallback, useMemo, memo } from 'react';
import { useParams, useRouter } from 'next/navigation'; import { useParams, useRouter } from 'next/navigation';
import Link from 'next/link'; import Link from 'next/link';
import { storyApi, seriesApi } from '../../../lib/api'; import { storyApi, seriesApi } from '../../../lib/api';
@@ -11,6 +11,65 @@ import StoryRating from '../../../components/stories/StoryRating';
import TagDisplay from '../../../components/tags/TagDisplay'; import TagDisplay from '../../../components/tags/TagDisplay';
import TableOfContents from '../../../components/stories/TableOfContents'; import TableOfContents from '../../../components/stories/TableOfContents';
import { sanitizeHtml, preloadSanitizationConfig } from '../../../lib/sanitization'; import { sanitizeHtml, preloadSanitizationConfig } from '../../../lib/sanitization';
import { debug } from '../../../lib/debug';
// Memoized content component that only re-renders when content changes
const StoryContent = memo(({
content,
contentRef
}: {
content: string;
contentRef: React.RefObject<HTMLDivElement>;
}) => {
const renderTime = Date.now();
debug.log('🔄 StoryContent component rendering at', renderTime, 'with content length:', content.length, 'hash:', content.slice(0, 50) + '...');
// Add observer to track image loading events
useEffect(() => {
if (!contentRef.current) return;
const images = contentRef.current.querySelectorAll('img');
debug.log('📸 Found', images.length, 'images in content');
const handleImageLoad = (e: Event) => {
const img = e.target as HTMLImageElement;
debug.log('🖼️ Image loaded:', img.src);
};
const handleImageError = (e: Event) => {
const img = e.target as HTMLImageElement;
debug.log('❌ Image error:', img.src);
};
images.forEach(img => {
img.addEventListener('load', handleImageLoad);
img.addEventListener('error', handleImageError);
debug.log('👀 Monitoring image:', img.src);
});
return () => {
images.forEach(img => {
img.removeEventListener('load', handleImageLoad);
img.removeEventListener('error', handleImageError);
});
};
}, [content]);
return (
<div
ref={contentRef}
className="reading-content"
dangerouslySetInnerHTML={{ __html: content }}
style={{
// Prevent layout shifts that might cause image reloads
minHeight: '100vh',
contain: 'layout style'
}}
/>
);
});
StoryContent.displayName = 'StoryContent';
export default function StoryReadingPage() { export default function StoryReadingPage() {
const params = useParams(); const params = useParams();
@@ -91,14 +150,14 @@ export default function StoryReadingPage() {
// Debounced function to save reading position // Debounced function to save reading position
const saveReadingPosition = useCallback(async (position: number) => { const saveReadingPosition = useCallback(async (position: number) => {
if (!story || position === story.readingPosition) { if (!story || position === story.readingPosition) {
console.log('Skipping save - no story or position unchanged:', { story: !!story, position, current: story?.readingPosition }); debug.log('Skipping save - no story or position unchanged:', { story: !!story, position, current: story?.readingPosition });
return; return;
} }
console.log('Saving reading position:', position, 'for story:', story.id); debug.log('Saving reading position:', position, 'for story:', story.id);
try { try {
const updatedStory = await storyApi.updateReadingProgress(story.id, position); const updatedStory = await storyApi.updateReadingProgress(story.id, position);
console.log('Reading position saved successfully, updated story:', updatedStory.readingPosition); debug.log('Reading position saved successfully, updated story:', updatedStory.readingPosition);
setStory(prev => prev ? { ...prev, readingPosition: position, lastReadAt: updatedStory.lastReadAt } : null); setStory(prev => prev ? { ...prev, readingPosition: position, lastReadAt: updatedStory.lastReadAt } : null);
} catch (error) { } catch (error) {
console.error('Failed to save reading position:', error); console.error('Failed to save reading position:', error);
@@ -179,12 +238,12 @@ export default function StoryReadingPage() {
if (story && sanitizedContent && !hasScrolledToPosition) { if (story && sanitizedContent && !hasScrolledToPosition) {
// Use a small delay to ensure content is rendered // Use a small delay to ensure content is rendered
const timeout = setTimeout(() => { const timeout = setTimeout(() => {
console.log('Initializing reading position tracking, saved position:', story.readingPosition); debug.log('Initializing reading position tracking, saved position:', story.readingPosition);
// Check if there's a hash in the URL (for TOC navigation) // Check if there's a hash in the URL (for TOC navigation)
const hash = window.location.hash.substring(1); const hash = window.location.hash.substring(1);
if (hash && hash.startsWith('heading-')) { if (hash && hash.startsWith('heading-')) {
console.log('Auto-scrolling to heading from URL hash:', hash); debug.log('Auto-scrolling to heading from URL hash:', hash);
const element = document.getElementById(hash); const element = document.getElementById(hash);
if (element) { if (element) {
element.scrollIntoView({ element.scrollIntoView({
@@ -198,13 +257,13 @@ export default function StoryReadingPage() {
// Otherwise, use saved reading position // Otherwise, use saved reading position
if (story.readingPosition && story.readingPosition > 0) { if (story.readingPosition && story.readingPosition > 0) {
console.log('Auto-scrolling to saved position:', story.readingPosition); debug.log('Auto-scrolling to saved position:', story.readingPosition);
const initialPercentage = calculateReadingPercentage(story.readingPosition); const initialPercentage = calculateReadingPercentage(story.readingPosition);
setReadingPercentage(initialPercentage); setReadingPercentage(initialPercentage);
scrollToCharacterPosition(story.readingPosition); scrollToCharacterPosition(story.readingPosition);
} else { } else {
// Even if there's no saved position, mark as ready for tracking // Even if there's no saved position, mark as ready for tracking
console.log('No saved position, starting fresh tracking'); debug.log('No saved position, starting fresh tracking');
setReadingPercentage(0); setReadingPercentage(0);
setHasScrolledToPosition(true); setHasScrolledToPosition(true);
} }
@@ -216,58 +275,72 @@ export default function StoryReadingPage() {
   // Track reading progress and save position
   useEffect(() => {
+    let ticking = false;
+    let scrollEventCount = 0;
+
     const handleScroll = () => {
-      const article = document.querySelector('[data-reading-content]') as HTMLElement;
-      if (article) {
-        const scrolled = window.scrollY;
-        const articleTop = article.offsetTop;
-        const articleHeight = article.scrollHeight;
-        const windowHeight = window.innerHeight;
-        const progress = Math.min(100, Math.max(0,
-          ((scrolled - articleTop + windowHeight) / articleHeight) * 100
-        ));
-
-        setReadingProgress(progress);
-
-        // Multi-method end-of-story detection
-        const documentHeight = document.documentElement.scrollHeight;
-        const windowBottom = scrolled + windowHeight;
-        const distanceFromBottom = documentHeight - windowBottom;
-
-        // Method 1: Distance from bottom (most reliable)
-        const nearBottom = distanceFromBottom <= 200;
-
-        // Method 2: High progress but only as secondary check
-        const highProgress = progress >= 98;
-
-        // Method 3: Check if story content itself is fully visible
-        const storyContentElement = contentRef.current;
-        let storyContentFullyVisible = false;
-        if (storyContentElement) {
-          const contentRect = storyContentElement.getBoundingClientRect();
-          const contentBottom = scrolled + contentRect.bottom;
-          const documentContentHeight = Math.max(documentHeight - 300, contentBottom); // Account for footer padding
-          storyContentFullyVisible = windowBottom >= documentContentHeight;
-        }
-
-        // Trigger end detection if user is near bottom AND (has high progress OR story content is fully visible)
-        if (nearBottom && (highProgress || storyContentFullyVisible) && !hasReachedEnd && hasScrolledToPosition) {
-          console.log('End of story detected:', { nearBottom, highProgress, storyContentFullyVisible, distanceFromBottom, progress });
-          setHasReachedEnd(true);
-          setShowEndOfStoryPopup(true);
-        }
-
-        // Save reading position and update percentage (debounced)
-        if (hasScrolledToPosition) { // Only save after initial auto-scroll
-          const characterPosition = getCharacterPositionFromScroll();
-          const percentage = calculateReadingPercentage(characterPosition);
-          console.log('Scroll detected, character position:', characterPosition, 'percentage:', percentage);
-          setReadingPercentage(percentage);
-          debouncedSavePosition(characterPosition);
-        } else {
-          console.log('Scroll detected but not ready for tracking yet');
-        }
-      }
+      scrollEventCount++;
+      if (scrollEventCount % 10 === 0) {
+        debug.log('📜 Scroll event #', scrollEventCount, 'at', Date.now());
+      }
+
+      if (!ticking) {
+        requestAnimationFrame(() => {
+          const article = document.querySelector('[data-reading-content]') as HTMLElement;
+          if (article) {
+            const scrolled = window.scrollY;
+            const articleTop = article.offsetTop;
+            const articleHeight = article.scrollHeight;
+            const windowHeight = window.innerHeight;
+
+            const progress = Math.min(100, Math.max(0,
+              ((scrolled - articleTop + windowHeight) / articleHeight) * 100
+            ));
+            setReadingProgress(progress);
+
+            // Multi-method end-of-story detection
+            const documentHeight = document.documentElement.scrollHeight;
+            const windowBottom = scrolled + windowHeight;
+            const distanceFromBottom = documentHeight - windowBottom;
+
+            // Method 1: Distance from bottom (most reliable)
+            const nearBottom = distanceFromBottom <= 200;
+
+            // Method 2: High progress but only as secondary check
+            const highProgress = progress >= 98;
+
+            // Method 3: Check if story content itself is fully visible
+            const storyContentElement = contentRef.current;
+            let storyContentFullyVisible = false;
+            if (storyContentElement) {
+              const contentRect = storyContentElement.getBoundingClientRect();
+              const contentBottom = scrolled + contentRect.bottom;
+              const documentContentHeight = Math.max(documentHeight - 300, contentBottom); // Account for footer padding
+              storyContentFullyVisible = windowBottom >= documentContentHeight;
+            }
+
+            // Trigger end detection if user is near bottom AND (has high progress OR story content is fully visible)
+            if (nearBottom && (highProgress || storyContentFullyVisible) && !hasReachedEnd && hasScrolledToPosition) {
+              debug.log('End of story detected:', { nearBottom, highProgress, storyContentFullyVisible, distanceFromBottom, progress });
+              setHasReachedEnd(true);
+              setShowEndOfStoryPopup(true);
+            }
+
+            // Save reading position and update percentage (debounced)
+            if (hasScrolledToPosition) { // Only save after initial auto-scroll
+              const characterPosition = getCharacterPositionFromScroll();
+              const percentage = calculateReadingPercentage(characterPosition);
+              debug.log('Scroll detected, character position:', characterPosition, 'percentage:', percentage);
+              setReadingPercentage(percentage);
+              debouncedSavePosition(characterPosition);
+            } else {
+              debug.log('Scroll detected but not ready for tracking yet');
+            }
+          }
+          ticking = false;
+        });
+        ticking = true;
+      }
     };
@@ -329,6 +402,11 @@ export default function StoryReadingPage() {
   const nextStory = findNextStory();
   const previousStory = findPreviousStory();

+  // Memoize the sanitized content to prevent re-processing on scroll
+  const memoizedContent = useMemo(() => {
+    return sanitizedContent;
+  }, [sanitizedContent]);
+
   if (loading) {
     return (
       <div className="min-h-screen theme-bg flex items-center justify-center">
@@ -535,10 +613,10 @@ export default function StoryReadingPage() {
         </header>

         {/* Story Content */}
-        <div
-          ref={contentRef}
-          className="reading-content"
-          dangerouslySetInnerHTML={{ __html: sanitizedContent }}
-        />
+        <StoryContent
+          key={`story-content-${story?.id || 'loading'}`}
+          content={memoizedContent}
+          contentRef={contentRef}
+        />
       </article>

View File

@@ -0,0 +1,259 @@
import React, { useState, useEffect } from 'react';
import { ImageProcessingProgressTracker, ImageProcessingProgress } from '../utils/imageProcessingProgress';

interface ImageProcessingProgressProps {
  storyId: string;
  autoStart?: boolean;
  onComplete?: () => void;
  onError?: (error: string) => void;
}

export const ImageProcessingProgressComponent: React.FC<ImageProcessingProgressProps> = ({
  storyId,
  autoStart = false,
  onComplete,
  onError
}) => {
  const [progress, setProgress] = useState<ImageProcessingProgress | null>(null);
  const [isTracking, setIsTracking] = useState(false);
  const [tracker, setTracker] = useState<ImageProcessingProgressTracker | null>(null);

  const startTracking = () => {
    if (tracker) {
      tracker.stop();
    }

    const newTracker = new ImageProcessingProgressTracker(storyId);

    newTracker.onProgress((progress) => {
      setProgress(progress);
    });

    newTracker.onComplete((finalProgress) => {
      setProgress(finalProgress);
      setIsTracking(false);
      onComplete?.();
    });

    newTracker.onError((error) => {
      console.error('Image processing error:', error);
      setIsTracking(false);
      onError?.(error);
    });

    setTracker(newTracker);
    setIsTracking(true);
    newTracker.start();
  };

  const stopTracking = () => {
    if (tracker) {
      tracker.stop();
      setIsTracking(false);
    }
  };

  useEffect(() => {
    if (autoStart) {
      startTracking();
    }

    return () => {
      if (tracker) {
        tracker.stop();
      }
    };
  }, [storyId, autoStart]);

  if (!progress && !isTracking) {
    return null;
  }

  if (!progress?.isProcessing && !progress?.completed) {
    return null;
  }

  return (
    <div className="image-processing-progress">
      <div className="progress-header">
        <h4>Processing Images</h4>
        {isTracking && (
          <button onClick={stopTracking} className="btn btn-sm btn-secondary">
            Cancel
          </button>
        )}
      </div>

      {progress && (
        <div className="progress-content">
          {progress.error ? (
            <div className="alert alert-danger">
              <strong>Error:</strong> {progress.error}
            </div>
          ) : progress.completed ? (
            <div className="alert alert-success">
              <strong>Completed:</strong> {progress.status}
            </div>
          ) : (
            <div className="progress-info">
              <div className="status-text">
                <strong>Status:</strong> {progress.status}
              </div>

              <div className="progress-stats">
                Processing {progress.processedImages} of {progress.totalImages} images
                ({progress.progressPercentage.toFixed(1)}%)
              </div>

              {progress.currentImageUrl && (
                <div className="current-image">
                  <strong>Current:</strong>
                  <span className="image-url" title={progress.currentImageUrl}>
                    {progress.currentImageUrl.length > 60
                      ? `...${progress.currentImageUrl.slice(-60)}`
                      : progress.currentImageUrl
                    }
                  </span>
                </div>
              )}

              {/* Progress bar */}
              <div className="progress-bar-container">
                <div className="progress-bar">
                  <div
                    className="progress-bar-fill"
                    style={{ width: `${progress.progressPercentage}%` }}
                  />
                </div>
                <span className="progress-percentage">
                  {progress.progressPercentage.toFixed(1)}%
                </span>
              </div>
            </div>
          )}
        </div>
      )}

      <style jsx>{`
        .image-processing-progress {
          background: #f8f9fa;
          border: 1px solid #dee2e6;
          border-radius: 4px;
          padding: 1rem;
          margin: 1rem 0;
          font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', sans-serif;
        }

        .progress-header {
          display: flex;
          justify-content: space-between;
          align-items: center;
          margin-bottom: 0.75rem;
        }

        .progress-header h4 {
          margin: 0;
          font-size: 1.1rem;
          color: #495057;
        }
        .progress-content {
          display: flex;
          flex-direction: column;
          gap: 0.5rem;
        }
        .status-text {
          font-size: 0.9rem;
          color: #6c757d;
          margin-bottom: 0.5rem;
        }

        .progress-stats {
          font-weight: 500;
          color: #495057;
          margin-bottom: 0.5rem;
        }

        .current-image {
          font-size: 0.85rem;
          color: #6c757d;
          margin-bottom: 0.75rem;
        }

        .image-url {
          font-family: 'Monaco', 'Menlo', 'Ubuntu Mono', monospace;
          background: #e9ecef;
          padding: 0.1rem 0.3rem;
          border-radius: 2px;
          margin-left: 0.5rem;
        }

        .progress-bar-container {
          display: flex;
          align-items: center;
          gap: 0.75rem;
        }

        .progress-bar {
          flex: 1;
          height: 8px;
          background: #e9ecef;
          border-radius: 4px;
          overflow: hidden;
        }

        .progress-bar-fill {
          height: 100%;
          background: linear-gradient(90deg, #007bff, #0056b3);
          transition: width 0.3s ease;
        }

        .progress-percentage {
          font-size: 0.85rem;
          font-weight: 500;
          color: #495057;
          min-width: 3rem;
          text-align: right;
        }

        .btn {
          padding: 0.25rem 0.5rem;
          border-radius: 3px;
          border: 1px solid transparent;
          cursor: pointer;
          font-size: 0.8rem;
        }

        .btn-secondary {
          background: #6c757d;
          color: white;
          border-color: #6c757d;
        }

        .btn-secondary:hover {
          background: #5a6268;
          border-color: #545b62;
        }

        .alert {
          padding: 0.75rem;
          border-radius: 4px;
          margin-bottom: 0.5rem;
        }

        .alert-danger {
          background: #f8d7da;
          color: #721c24;
          border: 1px solid #f5c6cb;
        }

        .alert-success {
          background: #d4edda;
          color: #155724;
          border: 1px solid #c3e6cb;
        }
      `}</style>
    </div>
  );
};

export default ImageProcessingProgressComponent;
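Two bits of display logic in this component are pure and easy to check in isolation: the percentage label (`toFixed(1)` plus a percent sign) and the tail-truncation of long image URLs at 60 characters. Extracted here as standalone helpers, with hypothetical names that are not part of the component itself:

```typescript
// Hypothetical extractions of the inline display logic above.
function formatPercentage(p: number): string {
  return `${p.toFixed(1)}%`; // e.g. 33.333 -> "33.3%"
}

function truncateUrlTail(url: string, max = 60): string {
  // Keep the end of the URL (the filename), as the component does with slice(-60)
  return url.length > max ? `...${url.slice(-max)}` : url;
}
```

Keeping the tail rather than the head is deliberate: for image URLs the distinguishing part is usually the filename, not the shared host prefix.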

View File

@@ -1,16 +1,9 @@
 'use client';

-import { ReactNode } from 'react';
-import Link from 'next/link';
-import { usePathname, useSearchParams } from 'next/navigation';
+import { ReactNode, Suspense } from 'react';
 import AppLayout from './AppLayout';
+import LoadingSpinner from '../ui/LoadingSpinner';
+import ImportLayoutContent from './ImportLayoutContent';

-interface ImportTab {
-  id: string;
-  label: string;
-  href: string;
-  description: string;
-}
-
 interface ImportLayoutProps {
   children: ReactNode;
@@ -18,112 +11,23 @@ interface ImportLayoutProps {
   description?: string;
 }

-const importTabs: ImportTab[] = [
-  {
-    id: 'manual',
-    label: 'Manual Entry',
-    href: '/add-story',
-    description: 'Add a story by manually entering details'
-  },
-  {
-    id: 'url',
-    label: 'Import from URL',
-    href: '/import',
-    description: 'Import a single story from a website'
-  },
-  {
-    id: 'epub',
-    label: 'Import EPUB',
-    href: '/import/epub',
-    description: 'Import a story from an EPUB file'
-  },
-  {
-    id: 'bulk',
-    label: 'Bulk Import',
-    href: '/import/bulk',
-    description: 'Import multiple stories from a list of URLs'
-  }
-];
-
-export default function ImportLayout({ children, title, description }: ImportLayoutProps) {
-  const pathname = usePathname();
-  const searchParams = useSearchParams();
-  const mode = searchParams.get('mode');
-
-  // Determine which tab is active
-  const getActiveTab = () => {
-    if (pathname === '/add-story') {
-      return 'manual';
-    } else if (pathname === '/import') {
-      return 'url';
-    } else if (pathname === '/import/epub') {
-      return 'epub';
-    } else if (pathname === '/import/bulk') {
-      return 'bulk';
-    }
-    return 'manual';
-  };
-
-  const activeTab = getActiveTab();
-
+export default function ImportLayout({
+  children,
+  title,
+  description
+}: ImportLayoutProps) {
   return (
     <AppLayout>
-      <div className="max-w-4xl mx-auto space-y-6">
-        {/* Header */}
-        <div className="text-center">
-          <h1 className="text-3xl font-bold theme-header">{title}</h1>
-          {description && (
-            <p className="theme-text mt-2 text-lg">
-              {description}
-            </p>
-          )}
-        </div>
-
-        {/* Tab Navigation */}
-        <div className="theme-card theme-shadow rounded-lg overflow-hidden">
-          {/* Tab Headers */}
-          <div className="flex border-b theme-border overflow-x-auto">
-            {importTabs.map((tab) => (
-              <Link
-                key={tab.id}
-                href={tab.href}
-                className={`flex-1 min-w-0 px-4 py-3 text-sm font-medium text-center transition-colors whitespace-nowrap ${
-                  activeTab === tab.id
-                    ? 'theme-accent-bg text-white border-b-2 border-transparent'
-                    : 'theme-text hover:theme-accent-light hover:theme-accent-text'
-                }`}
-              >
-                <div className="truncate">
-                  {tab.label}
-                </div>
-              </Link>
-            ))}
-          </div>
-
-          {/* Tab Descriptions */}
-          <div className="px-6 py-4 bg-gray-50 dark:bg-gray-800/50">
-            <div className="flex items-center justify-center">
-              <p className="text-sm theme-text text-center">
-                {importTabs.find(tab => tab.id === activeTab)?.description}
-              </p>
-            </div>
-          </div>
-
-          {/* Tab Content */}
-          <div className="p-6">
-            {children}
-          </div>
-        </div>
-
-        {/* Quick Actions */}
-        <div className="flex justify-center">
-          <Link
-            href="/library"
-            className="theme-text hover:theme-accent transition-colors text-sm"
-          >
-            Back to Library
-          </Link>
-        </div>
+      <div className="max-w-4xl mx-auto">
+        <Suspense fallback={
+          <div className="flex items-center justify-center py-20">
+            <LoadingSpinner size="lg" />
+          </div>
+        }>
+          <ImportLayoutContent title={title} description={description}>
+            {children}
+          </ImportLayoutContent>
+        </Suspense>
       </div>
     </AppLayout>
   );

View File

@@ -0,0 +1,116 @@
'use client';

import { ReactNode } from 'react';
import Link from 'next/link';
import { usePathname, useSearchParams } from 'next/navigation';

interface ImportTab {
  id: string;
  label: string;
  href: string;
  description: string;
}

interface ImportLayoutContentProps {
  children: ReactNode;
  title: string;
  description?: string;
}

const importTabs: ImportTab[] = [
  {
    id: 'manual',
    label: 'Manual Entry',
    href: '/add-story',
    description: 'Add a story by manually entering details'
  },
  {
    id: 'url',
    label: 'Import from URL',
    href: '/import',
    description: 'Import a single story from a website'
  },
  {
    id: 'epub',
    label: 'Import EPUB',
    href: '/import/epub',
    description: 'Import a story from an EPUB file'
  },
  {
    id: 'bulk',
    label: 'Bulk Import',
    href: '/import/bulk',
    description: 'Import multiple stories from URLs'
  }
];

export default function ImportLayoutContent({
  children,
  title,
  description
}: ImportLayoutContentProps) {
  const pathname = usePathname();
  const searchParams = useSearchParams();

  // Determine active tab based on current path
  const activeTab = importTabs.find(tab => {
    if (tab.href === pathname) return true;
    if (tab.href === '/import' && pathname === '/import') return true;
    return false;
  });

  return (
    <>
      <div className="mb-8">
        <div className="flex flex-col sm:flex-row sm:items-center sm:justify-between gap-4 mb-6">
          <div>
            <h1 className="text-3xl font-bold theme-header">{title}</h1>
            {description && (
              <p className="theme-text mt-2">{description}</p>
            )}
          </div>
          <Link
            href="/library"
            className="inline-flex items-center px-4 py-2 text-sm font-medium theme-button theme-border border rounded-lg hover:theme-button-hover transition-colors"
          >
            Back to Library
          </Link>
        </div>

        {/* Import Method Tabs */}
        <div className="border-b theme-border">
          <nav className="-mb-px flex space-x-8 overflow-x-auto">
            {importTabs.map((tab) => {
              const isActive = activeTab?.id === tab.id;
              return (
                <Link
                  key={tab.id}
                  href={tab.href}
                  className={`
                    group inline-flex items-center px-1 py-4 border-b-2 font-medium text-sm whitespace-nowrap
                    ${isActive
                      ? 'border-theme-accent text-theme-accent'
                      : 'border-transparent theme-text hover:text-theme-header hover:border-gray-300'
                    }
                  `}
                >
                  <span className="flex flex-col">
                    <span>{tab.label}</span>
                    <span className="text-xs theme-text mt-1 group-hover:text-theme-header">
                      {tab.description}
                    </span>
                  </span>
                </Link>
              );
            })}
          </nav>
        </div>
      </div>

      {/* Tab Content */}
      <div className="flex-1">
        {children}
      </div>
    </>
  );
}
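The active-tab lookup above matches on exact pathname equality (its second `/import` clause is already covered by the first). Reduced to a pure function for clarity; the helper name is hypothetical and the tab list is abbreviated to the hrefs used in ImportLayoutContent:

```typescript
interface Tab {
  id: string;
  href: string;
}

// Same hrefs as the importTabs list above (labels and descriptions omitted).
const tabHrefs: Tab[] = [
  { id: 'manual', href: '/add-story' },
  { id: 'url', href: '/import' },
  { id: 'epub', href: '/import/epub' },
  { id: 'bulk', href: '/import/bulk' },
];

// Exact-match resolution; unknown routes return undefined,
// which the component renders with no tab highlighted.
function resolveActiveTab(pathname: string): Tab | undefined {
  return tabHrefs.find(tab => tab.href === pathname);
}
```

Exact matching works here because every tab maps to one concrete route; prefix matching would wrongly highlight `/import` for `/import/epub`.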

View File

@@ -66,7 +66,7 @@ export default function MinimalLayout({
   const getSortDisplayText = () => {
     const sortLabels: Record<string, string> = {
-      lastRead: 'Last Read',
+      lastReadAt: 'Last Read',
       createdAt: 'Date Added',
       title: 'Title',
       authorName: 'Author',

View File

@@ -122,8 +122,8 @@ export default function SidebarLayout({
             }}
             className="px-2 py-1 border rounded-lg theme-card border-gray-300 dark:border-gray-600 text-xs"
           >
-            <option value="lastRead_desc">Last Read</option>
-            <option value="lastRead_asc">Last Read</option>
+            <option value="lastReadAt_desc">Last Read</option>
+            <option value="lastReadAt_asc">Last Read</option>
             <option value="createdAt_desc">Date Added</option>
             <option value="createdAt_asc">Date Added</option>
             <option value="title_asc">Title</option>
@@ -226,7 +226,7 @@ export default function SidebarLayout({
             onChange={(e) => onSortChange(e.target.value)}
             className="flex-1 px-3 py-2 border rounded-lg theme-card border-gray-300 dark:border-gray-600"
           >
-            <option value="lastRead">Last Read</option>
+            <option value="lastReadAt">Last Read</option>
             <option value="createdAt">Date Added</option>
             <option value="title">Title</option>
             <option value="authorName">Author</option>
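The option values in these selects encode field and direction as `field_dir` (for example `lastReadAt_desc`), while the second select passes a bare field name. A sketch of the split an `onSortChange` consumer might perform; the helper name and the descending default are assumptions, not code from this repository:

```typescript
type SortDirection = 'asc' | 'desc';

// Split 'lastReadAt_desc' into { field: 'lastReadAt', direction: 'desc' }.
// Bare values like 'createdAt' (no suffix) fall back to descending.
function parseSortValue(value: string): { field: string; direction: SortDirection } {
  const idx = value.lastIndexOf('_');
  const suffix = value.slice(idx + 1);
  if (idx !== -1 && (suffix === 'asc' || suffix === 'desc')) {
    return { field: value.slice(0, idx), direction: suffix };
  }
  return { field: value, direction: 'desc' };
}
```

Using `lastIndexOf` rather than `split('_')` keeps fields that themselves contain underscores intact.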

View File

@@ -110,8 +110,8 @@ export default function ToolbarLayout({
             }}
             className="w-full px-3 py-2 border rounded-lg theme-card border-gray-300 dark:border-gray-600 max-md:text-sm"
           >
-            <option value="lastRead_desc">Sort: Last Read</option>
-            <option value="lastRead_asc">Sort: Last Read</option>
+            <option value="lastReadAt_desc">Sort: Last Read</option>
+            <option value="lastReadAt_asc">Sort: Last Read</option>
             <option value="createdAt_desc">Sort: Date Added</option>
             <option value="createdAt_asc">Sort: Date Added</option>
             <option value="title_asc">Sort: Title</option>

View File

@@ -11,23 +11,25 @@ interface SystemSettingsProps {
 export default function SystemSettings({}: SystemSettingsProps) {
   const [searchEngineStatus, setSearchEngineStatus] = useState<{
     currentEngine: string;
-    openSearchAvailable: boolean;
+    solrAvailable: boolean;
     loading: boolean;
     message: string;
     success?: boolean;
   }>({
-    currentEngine: 'opensearch',
-    openSearchAvailable: false,
+    currentEngine: 'solr',
+    solrAvailable: false,
     loading: false,
     message: ''
   });

-  const [openSearchStatus, setOpenSearchStatus] = useState<{
+  const [solrStatus, setSolrStatus] = useState<{
     reindex: { loading: boolean; message: string; success?: boolean };
     recreate: { loading: boolean; message: string; success?: boolean };
+    migrate: { loading: boolean; message: string; success?: boolean };
   }>({
     reindex: { loading: false, message: '' },
-    recreate: { loading: false, message: '' }
+    recreate: { loading: false, message: '' },
+    migrate: { loading: false, message: '' }
   });

   const [databaseStatus, setDatabaseStatus] = useState<{
@@ -47,6 +49,25 @@ export default function SystemSettings({}: SystemSettingsProps) {
     execute: { loading: false, message: '' }
   });

+  const [hoveredImage, setHoveredImage] = useState<{ src: string; alt: string } | null>(null);
+  const [mousePosition, setMousePosition] = useState<{ x: number; y: number }>({ x: 0, y: 0 });
+
+  const handleImageHover = (filePath: string, fileName: string, event: React.MouseEvent) => {
+    // Convert backend file path to frontend image URL
+    const imageUrl = filePath.replace(/^.*\/images\//, '/images/');
+    setHoveredImage({ src: imageUrl, alt: fileName });
+    setMousePosition({ x: event.clientX, y: event.clientY });
+  };
+
+  const handleImageLeave = () => {
+    setHoveredImage(null);
+  };
+
+  const isImageFile = (fileName: string): boolean => {
+    const imageExtensions = ['.jpg', '.jpeg', '.png', '.gif', '.webp', '.bmp', '.svg'];
+    return imageExtensions.some(ext => fileName.toLowerCase().endsWith(ext));
+  };
+
   const handleCompleteBackup = async () => {
@@ -229,13 +250,7 @@ export default function SystemSettings({}: SystemSettingsProps) {
       }));
     }

-    // Clear message after 10 seconds
-    setTimeout(() => {
-      setCleanupStatus(prev => ({
-        ...prev,
-        preview: { loading: false, message: '', success: undefined }
-      }));
-    }, 10000);
+    // Note: Preview message no longer auto-clears to allow users to review file details
   };
   const handleImageCleanupExecute = async () => {
@@ -312,7 +327,7 @@ export default function SystemSettings({}: SystemSettingsProps) {
       setSearchEngineStatus(prev => ({
         ...prev,
         currentEngine: status.primaryEngine,
-        openSearchAvailable: status.openSearchAvailable,
+        solrAvailable: status.solrAvailable,
       }));
     } catch (error: any) {
       console.error('Failed to load search engine status:', error);
@@ -321,16 +336,16 @@ export default function SystemSettings({}: SystemSettingsProps) {
-  const handleOpenSearchReindex = async () => {
-    setOpenSearchStatus(prev => ({
+  const handleSolrReindex = async () => {
+    setSolrStatus(prev => ({
       ...prev,
-      reindex: { loading: true, message: 'Reindexing OpenSearch...', success: undefined }
+      reindex: { loading: true, message: 'Reindexing Solr...', success: undefined }
     }));

     try {
-      const result = await searchAdminApi.reindexOpenSearch();
-      setOpenSearchStatus(prev => ({
+      const result = await searchAdminApi.reindexSolr();
+      setSolrStatus(prev => ({
         ...prev,
         reindex: {
           loading: false,
@@ -340,13 +355,13 @@ export default function SystemSettings({}: SystemSettingsProps) {
       }));

       setTimeout(() => {
-        setOpenSearchStatus(prev => ({
+        setSolrStatus(prev => ({
           ...prev,
           reindex: { loading: false, message: '', success: undefined }
         }));
       }, 8000);
     } catch (error: any) {
-      setOpenSearchStatus(prev => ({
+      setSolrStatus(prev => ({
         ...prev,
         reindex: {
           loading: false,
@@ -356,7 +371,7 @@ export default function SystemSettings({}: SystemSettingsProps) {
       }));

       setTimeout(() => {
-        setOpenSearchStatus(prev => ({
+        setSolrStatus(prev => ({
           ...prev,
           reindex: { loading: false, message: '', success: undefined }
         }));
@@ -364,16 +379,16 @@ export default function SystemSettings({}: SystemSettingsProps) {
     }
   };

-  const handleOpenSearchRecreate = async () => {
-    setOpenSearchStatus(prev => ({
+  const handleSolrRecreate = async () => {
+    setSolrStatus(prev => ({
       ...prev,
-      recreate: { loading: true, message: 'Recreating OpenSearch indices...', success: undefined }
+      recreate: { loading: true, message: 'Recreating Solr indices...', success: undefined }
     }));

     try {
-      const result = await searchAdminApi.recreateOpenSearchIndices();
-      setOpenSearchStatus(prev => ({
+      const result = await searchAdminApi.recreateSolrIndices();
+      setSolrStatus(prev => ({
         ...prev,
         recreate: {
           loading: false,
@@ -383,13 +398,13 @@ export default function SystemSettings({}: SystemSettingsProps) {
       }));

       setTimeout(() => {
-        setOpenSearchStatus(prev => ({
+        setSolrStatus(prev => ({
           ...prev,
           recreate: { loading: false, message: '', success: undefined }
         }));
       }, 8000);
     } catch (error: any) {
-      setOpenSearchStatus(prev => ({
+      setSolrStatus(prev => ({
         ...prev,
         recreate: {
           loading: false,
@@ -399,7 +414,7 @@ export default function SystemSettings({}: SystemSettingsProps) {
       }));

       setTimeout(() => {
-        setOpenSearchStatus(prev => ({
+        setSolrStatus(prev => ({
           ...prev,
           recreate: { loading: false, message: '', success: undefined }
         }));
@@ -407,6 +422,57 @@ export default function SystemSettings({}: SystemSettingsProps) {
     }
   };
+  const handleLibraryMigration = async () => {
+    const confirmed = window.confirm(
+      'This will migrate Solr to support library separation. It will clear existing search data and reindex with library context. Continue?'
+    );
+    if (!confirmed) return;
+
+    setSolrStatus(prev => ({
+      ...prev,
+      migrate: { loading: true, message: 'Migrating to library-aware schema...', success: undefined }
+    }));
+
+    try {
+      const result = await searchAdminApi.migrateLibrarySchema();
+      setSolrStatus(prev => ({
+        ...prev,
+        migrate: {
+          loading: false,
+          message: result.success
+            ? `${result.message}${result.note ? ` Note: ${result.note}` : ''}`
+            : (result.error || result.details || 'Migration failed'),
+          success: result.success
+        }
+      }));
+
+      setTimeout(() => {
+        setSolrStatus(prev => ({
+          ...prev,
+          migrate: { loading: false, message: '', success: undefined }
+        }));
+      }, 10000); // Longer timeout for migration messages
+    } catch (error: any) {
+      setSolrStatus(prev => ({
+        ...prev,
+        migrate: {
+          loading: false,
+          message: error.message || 'Network error occurred',
+          success: false
+        }
+      }));
+
+      setTimeout(() => {
+        setSolrStatus(prev => ({
+          ...prev,
+          migrate: { loading: false, message: '', success: undefined }
+        }));
+      }, 10000);
+    }
+  };
   // Load status on component mount
   useEffect(() => {
     loadSearchEngineStatus();
@@ -418,7 +484,7 @@ export default function SystemSettings({}: SystemSettingsProps) {
       <div className="theme-card theme-shadow rounded-lg p-6">
         <h2 className="text-xl font-semibold theme-header mb-4">Search Management</h2>
         <p className="theme-text mb-6">
-          Manage OpenSearch indices for stories and authors. Use these tools if search isn't returning expected results.
+          Manage Solr indices for stories and authors. Use these tools if search isn't returning expected results.
         </p>

         <div className="space-y-6">
@@ -427,9 +493,9 @@ export default function SystemSettings({}: SystemSettingsProps) {
             <h3 className="text-lg font-semibold theme-header mb-3">Search Status</h3>
             <div className="grid grid-cols-1 sm:grid-cols-2 gap-3 text-sm">
               <div className="flex justify-between">
-                <span>OpenSearch:</span>
-                <span className={`font-medium ${searchEngineStatus.openSearchAvailable ? 'text-green-600 dark:text-green-400' : 'text-red-600 dark:text-red-400'}`}>
-                  {searchEngineStatus.openSearchAvailable ? 'Available' : 'Unavailable'}
+                <span>Solr:</span>
+                <span className={`font-medium ${searchEngineStatus.solrAvailable ? 'text-green-600 dark:text-green-400' : 'text-red-600 dark:text-red-400'}`}>
+                  {searchEngineStatus.solrAvailable ? 'Available' : 'Unavailable'}
                 </span>
               </div>
             </div>
@@ -444,43 +510,70 @@ export default function SystemSettings({}: SystemSettingsProps) {
         <div className="flex flex-col sm:flex-row gap-3 mb-4">
           <Button
-            onClick={handleOpenSearchReindex}
-            disabled={openSearchStatus.reindex.loading || openSearchStatus.recreate.loading || !searchEngineStatus.openSearchAvailable}
-            loading={openSearchStatus.reindex.loading}
+            onClick={handleSolrReindex}
+            disabled={solrStatus.reindex.loading || solrStatus.recreate.loading || solrStatus.migrate.loading || !searchEngineStatus.solrAvailable}
+            loading={solrStatus.reindex.loading}
             variant="ghost"
             className="flex-1"
           >
-            {openSearchStatus.reindex.loading ? 'Reindexing...' : '🔄 Reindex All'}
+            {solrStatus.reindex.loading ? 'Reindexing...' : '🔄 Reindex All'}
           </Button>
           <Button
-            onClick={handleOpenSearchRecreate}
-            disabled={openSearchStatus.reindex.loading || openSearchStatus.recreate.loading || !searchEngineStatus.openSearchAvailable}
-            loading={openSearchStatus.recreate.loading}
+            onClick={handleSolrRecreate}
+            disabled={solrStatus.reindex.loading || solrStatus.recreate.loading || solrStatus.migrate.loading || !searchEngineStatus.solrAvailable}
+            loading={solrStatus.recreate.loading}
             variant="secondary"
             className="flex-1"
           >
-            {openSearchStatus.recreate.loading ? 'Recreating...' : '🏗️ Recreate Indices'}
+            {solrStatus.recreate.loading ? 'Recreating...' : '🏗️ Recreate Indices'}
           </Button>
         </div>
+
+        {/* Library Migration Section */}
+        <div className="border-t theme-border pt-4">
+          <h4 className="text-md font-medium theme-header mb-2">Library Separation Migration</h4>
+          <p className="text-sm theme-text mb-3">
+            Migrate Solr to support proper library separation. This ensures search results are isolated between different libraries (password-based access).
+          </p>
+          <Button
+            onClick={handleLibraryMigration}
+            disabled={solrStatus.reindex.loading || solrStatus.recreate.loading || solrStatus.migrate.loading || !searchEngineStatus.solrAvailable}
+            loading={solrStatus.migrate.loading}
+            variant="primary"
+            className="w-full sm:w-auto"
+          >
+            {solrStatus.migrate.loading ? 'Migrating...' : '🔒 Migrate Library Schema'}
+          </Button>
+        </div>

         {/* Status Messages */}
-        {openSearchStatus.reindex.message && (
+        {solrStatus.reindex.message && (
           <div className={`text-sm p-3 rounded mb-3 ${
-            openSearchStatus.reindex.success
+            solrStatus.reindex.success
               ? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
               : 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
           }`}>
-            {openSearchStatus.reindex.message}
+            {solrStatus.reindex.message}
           </div>
         )}

-        {openSearchStatus.recreate.message && (
+        {solrStatus.recreate.message && (
           <div className={`text-sm p-3 rounded mb-3 ${
-            openSearchStatus.recreate.success
+            solrStatus.recreate.success
               ? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
               : 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
           }`}>
-            {openSearchStatus.recreate.message}
+            {solrStatus.recreate.message}
           </div>
         )}
+
+        {solrStatus.migrate.message && (
+          <div className={`text-sm p-3 rounded mb-3 ${
+            solrStatus.migrate.success
+              ? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
+              : 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
+          }`}>
+            {solrStatus.migrate.message}
+          </div>
+        )}
       </div>
@@ -490,7 +583,12 @@ export default function SystemSettings({}: SystemSettingsProps) {
<ul className="text-xs space-y-1 ml-4">
<li> <strong>Reindex All:</strong> Refresh all search data while keeping existing schemas (fixes data sync issues)</li>
<li> <strong>Recreate Indices:</strong> Delete and rebuild all search indexes from scratch (fixes schema and structure issues)</li>
<li> <strong>Migrate Library Schema:</strong> One-time migration to enable library separation (isolates search results by library)</li>
</ul> </ul>
<div className="mt-2 pt-2 border-t border-blue-200 dark:border-blue-700">
<p className="font-medium text-xs"> Library Migration:</p>
<p className="text-xs">Only run this once to enable library-aware search. Requires Solr schema to support libraryId field.</p>
</div>
</div>
</div>
</div>
@@ -529,6 +627,18 @@ export default function SystemSettings({}: SystemSettingsProps) {
>
{cleanupStatus.execute.loading ? 'Cleaning...' : 'Execute Cleanup'}
</Button>
{cleanupStatus.preview.message && (
<Button
onClick={() => setCleanupStatus(prev => ({
...prev,
preview: { loading: false, message: '', success: undefined, data: undefined }
}))}
variant="ghost"
className="px-4 py-2 text-sm"
>
Clear Preview
</Button>
)}
</div>
{/* Preview Results */}
@@ -582,6 +692,76 @@ export default function SystemSettings({}: SystemSettingsProps) {
<span className="font-medium">Referenced Images:</span> {cleanupStatus.preview.data.referencedImagesCount} <span className="font-medium">Referenced Images:</span> {cleanupStatus.preview.data.referencedImagesCount}
</div> </div>
</div> </div>
{/* Detailed File List */}
{cleanupStatus.preview.data.orphanedFiles && cleanupStatus.preview.data.orphanedFiles.length > 0 && (
<div className="mt-4">
<details className="cursor-pointer">
<summary className="font-medium text-sm theme-header mb-2">
📁 View Files to be Deleted ({cleanupStatus.preview.data.orphanedFiles.length})
</summary>
<div className="mt-3 max-h-96 overflow-y-auto border theme-border rounded">
<table className="w-full text-xs">
<thead className="bg-gray-100 dark:bg-gray-800 sticky top-0">
<tr>
<th className="text-left p-2 font-medium">File Name</th>
<th className="text-left p-2 font-medium">Size</th>
<th className="text-left p-2 font-medium">Story</th>
<th className="text-left p-2 font-medium">Status</th>
</tr>
</thead>
<tbody>
{cleanupStatus.preview.data.orphanedFiles.map((file: any, index: number) => (
<tr key={index} className="border-t theme-border hover:bg-gray-50 dark:hover:bg-gray-800">
<td className="p-2">
<div
className={`truncate max-w-xs ${isImageFile(file.fileName) ? 'cursor-pointer text-blue-600 dark:text-blue-400' : ''}`}
title={file.fileName}
onMouseEnter={isImageFile(file.fileName) ? (e) => handleImageHover(file.filePath, file.fileName, e) : undefined}
onMouseMove={isImageFile(file.fileName) ? (e) => setMousePosition({ x: e.clientX, y: e.clientY }) : undefined}
onMouseLeave={isImageFile(file.fileName) ? handleImageLeave : undefined}
>
{isImageFile(file.fileName) && '🖼️ '}{file.fileName}
</div>
<div className="text-xs text-gray-500 truncate max-w-xs" title={file.filePath}>
{file.filePath}
</div>
</td>
<td className="p-2">{file.formattedSize}</td>
<td className="p-2">
{file.storyExists && file.storyTitle ? (
<a
href={`/stories/${file.storyId}`}
className="text-blue-600 dark:text-blue-400 hover:underline truncate max-w-xs block"
title={file.storyTitle}
>
{file.storyTitle}
</a>
) : file.storyId !== 'unknown' && file.storyId !== 'error' ? (
<span className="text-gray-500" title={`Story ID: ${file.storyId}`}>
Deleted Story
</span>
) : (
<span className="text-gray-400">Unknown</span>
)}
</td>
<td className="p-2">
{file.storyExists ? (
<span className="text-orange-600 dark:text-orange-400 text-xs">Orphaned</span>
) : file.storyId !== 'unknown' && file.storyId !== 'error' ? (
<span className="text-red-600 dark:text-red-400 text-xs">Story Deleted</span>
) : (
<span className="text-gray-500 text-xs">Unknown Folder</span>
)}
</td>
</tr>
))}
</tbody>
</table>
</div>
</details>
</div>
)}
</div>
)}
</div>
@@ -702,6 +882,31 @@ export default function SystemSettings({}: SystemSettingsProps) {
</div>
</div>
</div>
{/* Image Preview Overlay */}
{hoveredImage && (
<div
className="fixed pointer-events-none z-50 bg-white dark:bg-gray-900 border border-gray-300 dark:border-gray-600 rounded-lg shadow-xl p-2 max-w-sm"
style={{
left: mousePosition.x + 10,
top: mousePosition.y - 100,
transform: mousePosition.x > window.innerWidth - 300 ? 'translateX(-100%)' : 'none'
}}
>
<img
src={hoveredImage.src}
alt={hoveredImage.alt}
className="max-w-full max-h-64 object-contain rounded"
onError={() => {
// Hide preview if image fails to load
setHoveredImage(null);
}}
/>
<div className="text-xs theme-text mt-1 truncate">
{hoveredImage.alt}
</div>
</div>
)}
</div>
);
}
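The image-preview overlay above flips to the left of the cursor whenever the card would overflow the viewport's right edge. A minimal sketch of that positioning rule, assuming the same 300px reserved width used in the overlay's inline style:

```typescript
// Sketch of the preview-flip rule from the overlay above.
// Assumption: 300 is the horizontal space reserved for the preview card.
const PREVIEW_WIDTH = 300;

function previewTransform(mouseX: number, viewportWidth: number): string {
  // Near the right edge, shift the card fully to the left of the cursor
  return mouseX > viewportWidth - PREVIEW_WIDTH ? "translateX(-100%)" : "none";
}

console.log(previewTransform(100, 1024)); // "none"
console.log(previewTransform(900, 1024)); // "translateX(-100%)"
```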


@@ -0,0 +1,892 @@
'use client';
import React, { useState, useCallback, useMemo } from 'react';
import {
createEditor,
Descendant,
Element as SlateElement,
Node as SlateNode,
Transforms,
Editor,
Range
} from 'slate';
import {
Slate,
Editable,
withReact,
ReactEditor,
RenderElementProps,
RenderLeafProps,
useSlate as useEditor
} from 'slate-react';
import { withHistory } from 'slate-history';
import Button from '../ui/Button';
import { sanitizeHtmlSync } from '../../lib/sanitization';
import { debug } from '../../lib/debug';
interface SlateEditorProps {
value: string; // HTML value for compatibility with existing code
onChange: (value: string) => void; // Returns HTML for compatibility
placeholder?: string;
error?: string;
storyId?: string;
enableImageProcessing?: boolean;
}
// Custom types for our editor
type CustomElement = {
type: 'paragraph' | 'heading-one' | 'heading-two' | 'heading-three' | 'blockquote' | 'image' | 'code-block';
children: CustomText[];
src?: string; // for images
alt?: string; // for images
caption?: string; // for images
language?: string; // for code blocks
};
type CustomText = {
text: string;
bold?: boolean;
italic?: boolean;
underline?: boolean;
strikethrough?: boolean;
code?: boolean;
};
declare module 'slate' {
interface CustomTypes {
Editor: ReactEditor;
Element: CustomElement;
Text: CustomText;
}
}
// HTML to Slate conversion - preserves mixed content order
const htmlToSlate = (html: string): Descendant[] => {
if (!html || html.trim() === '') {
return [{ type: 'paragraph', children: [{ text: '' }] }];
}
const sanitizedHtml = sanitizeHtmlSync(html);
const parser = new DOMParser();
const doc = parser.parseFromString(sanitizedHtml, 'text/html');
const nodes: Descendant[] = [];
// Process all nodes in document order to maintain sequence
const processChildNodes = (parentNode: Node): Descendant[] => {
const results: Descendant[] = [];
Array.from(parentNode.childNodes).forEach(node => {
if (node.nodeType === Node.ELEMENT_NODE) {
const element = node as Element;
switch (element.tagName.toLowerCase()) {
case 'h1':
results.push({
type: 'heading-one',
children: [{ text: element.textContent || '' }]
});
break;
case 'h2':
results.push({
type: 'heading-two',
children: [{ text: element.textContent || '' }]
});
break;
case 'h3':
results.push({
type: 'heading-three',
children: [{ text: element.textContent || '' }]
});
break;
case 'blockquote':
results.push({
type: 'blockquote',
children: [{ text: element.textContent || '' }]
});
break;
case 'img':
const img = element as HTMLImageElement;
results.push({
type: 'image',
src: img.src || img.getAttribute('src') || '',
alt: img.alt || img.getAttribute('alt') || '',
caption: img.title || img.getAttribute('title') || '',
children: [{ text: '' }] // Images need children in Slate
});
break;
case 'pre':
const codeEl = element.querySelector('code');
const code = codeEl ? codeEl.textContent || '' : element.textContent || '';
const language = codeEl?.className?.replace('language-', '') || '';
results.push({
type: 'code-block',
language,
children: [{ text: code }]
});
break;
case 'p':
case 'div': {
// Braces scope this `text` to the case and avoid colliding with the
// declaration in the default case (unbraced cases share one block scope)
// Check if this paragraph contains mixed content (text + images)
if (element.querySelector('img')) {
// Process mixed content - handle both text and images in order
results.push(...processChildNodes(element));
} else {
const text = element.textContent || '';
if (text.trim()) {
results.push({
type: 'paragraph',
children: [{ text }]
});
}
}
break;
}
case 'br':
// Handle line breaks by creating empty paragraphs
results.push({
type: 'paragraph',
children: [{ text: '' }]
});
break;
default: {
// Braced so this `text` does not redeclare the one in the 'p'/'div'
// case above (unbraced switch cases share one block scope)
// For other elements, try to extract text or recurse
const text = element.textContent || '';
if (text.trim()) {
results.push({
type: 'paragraph',
children: [{ text }]
});
}
break;
}
}
} else if (node.nodeType === Node.TEXT_NODE) {
const text = node.textContent || '';
if (text.trim()) {
results.push({
type: 'paragraph',
children: [{ text: text.trim() }]
});
}
}
});
return results;
};
// Process all content
nodes.push(...processChildNodes(doc.body));
// Fallback for simple text content
if (nodes.length === 0 && doc.body.textContent?.trim()) {
const text = doc.body.textContent.trim();
const lines = text.split('\n').filter(line => line.trim());
lines.forEach(line => {
nodes.push({
type: 'paragraph',
children: [{ text: line.trim() }]
});
});
}
return nodes.length > 0 ? nodes : [{ type: 'paragraph', children: [{ text: '' }] }];
};
// Slate to HTML conversion
const slateToHtml = (nodes: Descendant[]): string => {
const htmlParts: string[] = [];
nodes.forEach(node => {
if (SlateElement.isElement(node)) {
const element = node as CustomElement;
const text = SlateNode.string(node);
switch (element.type) {
case 'heading-one':
htmlParts.push(`<h1>${text}</h1>`);
break;
case 'heading-two':
htmlParts.push(`<h2>${text}</h2>`);
break;
case 'heading-three':
htmlParts.push(`<h3>${text}</h3>`);
break;
case 'blockquote':
htmlParts.push(`<blockquote>${text}</blockquote>`);
break;
case 'image':
const attrs: string[] = [];
if (element.src) attrs.push(`src="${element.src}"`);
if (element.alt) attrs.push(`alt="${element.alt}"`);
if (element.caption) attrs.push(`title="${element.caption}"`);
htmlParts.push(`<img ${attrs.join(' ')} />`);
break;
case 'code-block':
const langClass = element.language ? ` class="language-${element.language}"` : '';
const escapedText = text
.replace(/&/g, '&amp;')
.replace(/</g, '&lt;')
.replace(/>/g, '&gt;')
.replace(/"/g, '&quot;')
.replace(/'/g, '&#39;');
htmlParts.push(`<pre><code${langClass}>${escapedText}</code></pre>`);
break;
case 'paragraph':
default:
htmlParts.push(text ? `<p>${text}</p>` : '<p></p>');
break;
}
}
});
const html = htmlParts.join('\n');
return sanitizeHtmlSync(html);
};
// Custom plugin to handle images
const withImages = (editor: ReactEditor) => {
const { insertData, isVoid } = editor;
editor.isVoid = element => {
return element.type === 'image' ? true : isVoid(element);
};
editor.insertData = (data) => {
const html = data.getData('text/html');
if (html && html.includes('<img')) {
debug.log('📋 Image paste detected in Slate editor');
// Convert HTML to Slate nodes maintaining order
const slateNodes = htmlToSlate(html);
// Insert all nodes in sequence
slateNodes.forEach(node => {
Transforms.insertNodes(editor, node);
});
debug.log(`📋 Inserted ${slateNodes.length} nodes from pasted HTML`);
return;
}
insertData(data);
};
return editor;
};
// Interactive Image Component
const ImageElement = ({ attributes, element, children }: {
attributes: any;
element: CustomElement;
children: React.ReactNode;
}) => {
const editor = useEditor();
const [isEditing, setIsEditing] = useState(false);
const [editUrl, setEditUrl] = useState(element.src || '');
const [editAlt, setEditAlt] = useState(element.alt || '');
const [editCaption, setEditCaption] = useState(element.caption || '');
const handleDelete = () => {
const path = ReactEditor.findPath(editor, element);
Transforms.removeNodes(editor, { at: path });
};
const handleSave = () => {
const path = ReactEditor.findPath(editor, element);
const newProperties: Partial<CustomElement> = {
src: editUrl,
alt: editAlt,
caption: editCaption,
};
Transforms.setNodes(editor, newProperties, { at: path });
setIsEditing(false);
};
const handleCancel = () => {
setEditUrl(element.src || '');
setEditAlt(element.alt || '');
setEditCaption(element.caption || '');
setIsEditing(false);
};
if (isEditing) {
return (
<div {...attributes} contentEditable={false} className="my-4">
<div className="border border-blue-300 rounded-lg p-4 bg-blue-50">
<h4 className="font-medium text-blue-900 mb-3">Edit Image</h4>
<div className="space-y-3">
<div>
<label className="block text-sm font-medium text-blue-800 mb-1">
Image URL *
</label>
<input
type="url"
value={editUrl}
onChange={(e) => setEditUrl(e.target.value)}
className="w-full px-3 py-2 border border-blue-300 rounded-md focus:ring-2 focus:ring-blue-500 focus:border-blue-500"
placeholder="https://example.com/image.jpg"
/>
</div>
<div>
<label className="block text-sm font-medium text-blue-800 mb-1">
Alt Text
</label>
<input
type="text"
value={editAlt}
onChange={(e) => setEditAlt(e.target.value)}
className="w-full px-3 py-2 border border-blue-300 rounded-md focus:ring-2 focus:ring-blue-500 focus:border-blue-500"
placeholder="Describe the image"
/>
</div>
<div>
<label className="block text-sm font-medium text-blue-800 mb-1">
Caption
</label>
<input
type="text"
value={editCaption}
onChange={(e) => setEditCaption(e.target.value)}
className="w-full px-3 py-2 border border-blue-300 rounded-md focus:ring-2 focus:ring-blue-500 focus:border-blue-500"
placeholder="Image caption"
/>
</div>
</div>
<div className="flex gap-2 mt-4">
<button
onClick={handleSave}
className="px-3 py-1 bg-blue-600 text-white text-sm rounded hover:bg-blue-700 focus:ring-2 focus:ring-blue-500"
>
Save
</button>
<button
onClick={handleCancel}
className="px-3 py-1 bg-gray-300 text-gray-700 text-sm rounded hover:bg-gray-400 focus:ring-2 focus:ring-gray-500"
>
Cancel
</button>
</div>
</div>
{children}
</div>
);
}
return (
<div {...attributes} contentEditable={false} className="my-4">
<div
className="relative border border-gray-200 rounded-lg overflow-hidden bg-white shadow-sm group hover:shadow-md transition-shadow focus-within:ring-2 focus-within:ring-blue-500 focus-within:border-blue-500"
tabIndex={0}
onKeyDown={(event) => {
// Handle delete/backspace on focused image
if (event.key === 'Delete' || event.key === 'Backspace') {
event.preventDefault();
handleDelete();
}
// Handle Enter to edit
if (event.key === 'Enter') {
event.preventDefault();
setIsEditing(true);
}
}}
onClick={() => {
// Focus the image element when clicked
const path = ReactEditor.findPath(editor, element);
const start = Editor.start(editor, path);
Transforms.select(editor, start);
}}
>
{/* Control buttons - show on hover */}
<div className="absolute top-2 left-2 opacity-0 group-hover:opacity-100 transition-opacity z-10">
<div className="flex gap-1">
<button
onClick={() => setIsEditing(true)}
className="p-1 bg-white rounded-full shadow-sm hover:bg-blue-50 border border-gray-200 text-blue-600 hover:text-blue-700"
title="Edit image"
>
<svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M11 5H6a2 2 0 00-2 2v11a2 2 0 002 2h11a2 2 0 002-2v-5m-1.414-9.414a2 2 0 112.828 2.828L11.828 15H9v-2.828l8.586-8.586z" />
</svg>
</button>
<button
onClick={handleDelete}
className="p-1 bg-white rounded-full shadow-sm hover:bg-red-50 border border-gray-200 text-red-600 hover:text-red-700"
title="Delete image"
>
<svg className="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16" />
</svg>
</button>
</div>
</div>
{element.src ? (
<>
<img
src={element.src}
alt={element.alt || ''}
className="w-full h-auto max-h-96 object-contain cursor-pointer"
onDoubleClick={() => setIsEditing(true)}
onError={(e) => {
// Fallback to text block if image fails to load
const target = e.target as HTMLImageElement;
const parent = target.parentElement;
if (parent) {
parent.innerHTML = `
<div class="p-3 border border-dashed border-red-300 rounded-lg bg-red-50">
<div class="flex items-center gap-2 mb-2">
<span class="text-lg">⚠️</span>
<span class="font-medium text-red-700">Image failed to load</span>
</div>
<div class="text-sm text-red-600 space-y-1">
<p><strong>Source:</strong> ${element.src}</p>
${element.alt ? `<p><strong>Alt:</strong> ${element.alt}</p>` : ''}
${element.caption ? `<p><strong>Caption:</strong> ${element.caption}</p>` : ''}
</div>
</div>
`;
}
}}
/>
{(element.alt || element.caption) && (
<div className="p-2 bg-gray-50 border-t border-gray-200">
<div className="text-sm text-gray-600">
{element.caption && (
<p className="font-medium">{element.caption}</p>
)}
{element.alt && element.alt !== element.caption && (
<p className="italic">{element.alt}</p>
)}
</div>
</div>
)}
{/* External image indicator */}
{element.src.startsWith('http') && (
<div className="absolute top-2 right-2">
<div className="bg-blue-100 text-blue-800 text-xs px-2 py-1 rounded-full flex items-center gap-1">
<span>🌐</span>
<span>External</span>
</div>
</div>
)}
</>
) : (
<div className="p-3 border border-dashed border-gray-300 rounded-lg bg-gray-50">
<div className="flex items-center gap-2 mb-2">
<span className="text-lg">🖼</span>
<span className="font-medium text-gray-700">Image (No Source)</span>
</div>
<div className="text-sm text-gray-600 space-y-1">
{element.alt && <p><strong>Alt:</strong> {element.alt}</p>}
{element.caption && <p><strong>Caption:</strong> {element.caption}</p>}
</div>
</div>
)}
</div>
{children}
</div>
);
};
// Component for rendering elements
const Element = ({ attributes, children, element }: RenderElementProps) => {
const customElement = element as CustomElement;
switch (customElement.type) {
case 'heading-one':
return <h1 {...attributes} className="text-3xl font-bold mb-4">{children}</h1>;
case 'heading-two':
return <h2 {...attributes} className="text-2xl font-bold mb-3">{children}</h2>;
case 'heading-three':
return <h3 {...attributes} className="text-xl font-bold mb-3">{children}</h3>;
case 'blockquote':
return <blockquote {...attributes} className="border-l-4 border-gray-300 pl-4 italic my-4">{children}</blockquote>;
case 'image':
return (
<ImageElement
attributes={attributes}
element={customElement}
children={children}
/>
);
case 'code-block':
return (
<pre {...attributes} className="my-4 p-3 bg-gray-100 rounded-lg overflow-x-auto">
<code className="text-sm font-mono">{children}</code>
</pre>
);
default:
return <p {...attributes} className="mb-2">{children}</p>;
}
};
// Component for rendering leaves (text formatting)
const Leaf = ({ attributes, children, leaf }: RenderLeafProps) => {
const customLeaf = leaf as CustomText;
if (customLeaf.bold) {
children = <strong>{children}</strong>;
}
if (customLeaf.italic) {
children = <em>{children}</em>;
}
if (customLeaf.underline) {
children = <u>{children}</u>;
}
if (customLeaf.strikethrough) {
children = <s>{children}</s>;
}
if (customLeaf.code) {
children = <code className="bg-gray-100 px-1 py-0.5 rounded text-sm font-mono">{children}</code>;
}
return <span {...attributes}>{children}</span>;
};
// Toolbar component
const Toolbar = ({ editor }: { editor: ReactEditor }) => {
type MarkFormat = 'bold' | 'italic' | 'underline' | 'strikethrough' | 'code';
const isMarkActive = (format: MarkFormat) => {
const marks = Editor.marks(editor);
return marks ? marks[format] === true : false;
};
const toggleMark = (format: MarkFormat) => {
const isActive = isMarkActive(format);
if (isActive) {
Editor.removeMark(editor, format);
} else {
Editor.addMark(editor, format, true);
}
};
const isBlockActive = (format: CustomElement['type']) => {
const { selection } = editor;
if (!selection) return false;
const [match] = Array.from(
Editor.nodes(editor, {
at: Editor.unhangRange(editor, selection),
match: n =>
!Editor.isEditor(n) &&
SlateElement.isElement(n) &&
n.type === format,
})
);
return !!match;
};
const toggleBlock = (format: CustomElement['type']) => {
const isActive = isBlockActive(format);
Transforms.setNodes(
editor,
{ type: isActive ? 'paragraph' : format },
{ match: n => SlateElement.isElement(n) && Editor.isBlock(editor, n) }
);
};
const insertImage = () => {
const url = prompt('Enter image URL:', 'https://');
if (url && url.trim() !== 'https://') {
const imageNode: CustomElement = {
type: 'image',
src: url.trim(),
alt: '',
caption: '',
children: [{ text: '' }],
};
Transforms.insertNodes(editor, imageNode);
// Add a paragraph after the image
Transforms.insertNodes(editor, {
type: 'paragraph',
children: [{ text: '' }],
});
}
};
return (
<div className="flex items-center gap-2 p-2 theme-card border theme-border rounded-t-lg">
<div className="text-xs bg-green-100 text-green-800 px-2 py-1 rounded">
Slate.js Editor
</div>
{/* Block type buttons */}
<div className="flex items-center gap-1 border-r pr-2 mr-2">
<Button
type="button"
size="sm"
variant="ghost"
onClick={() => toggleBlock('paragraph')}
className={isBlockActive('paragraph') ? 'theme-accent-bg text-white' : ''}
title="Normal paragraph"
>
P
</Button>
<Button
type="button"
size="sm"
variant="ghost"
onClick={() => toggleBlock('heading-one')}
className={`text-lg font-bold ${isBlockActive('heading-one') ? 'theme-accent-bg text-white' : ''}`}
title="Heading 1"
>
H1
</Button>
<Button
type="button"
size="sm"
variant="ghost"
onClick={() => toggleBlock('heading-two')}
className={`text-base font-bold ${isBlockActive('heading-two') ? 'theme-accent-bg text-white' : ''}`}
title="Heading 2"
>
H2
</Button>
<Button
type="button"
size="sm"
variant="ghost"
onClick={() => toggleBlock('heading-three')}
className={`text-sm font-bold ${isBlockActive('heading-three') ? 'theme-accent-bg text-white' : ''}`}
title="Heading 3"
>
H3
</Button>
</div>
{/* Text formatting buttons */}
<div className="flex items-center gap-1">
<Button
type="button"
size="sm"
variant="ghost"
onClick={() => toggleMark('bold')}
className={`font-bold ${isMarkActive('bold') ? 'theme-accent-bg text-white' : ''}`}
title="Bold (Ctrl+B)"
>
B
</Button>
<Button
type="button"
size="sm"
variant="ghost"
onClick={() => toggleMark('italic')}
className={`italic ${isMarkActive('italic') ? 'theme-accent-bg text-white' : ''}`}
title="Italic (Ctrl+I)"
>
I
</Button>
<Button
type="button"
size="sm"
variant="ghost"
onClick={() => toggleMark('underline')}
className={`underline ${isMarkActive('underline') ? 'theme-accent-bg text-white' : ''}`}
title="Underline"
>
U
</Button>
<Button
type="button"
size="sm"
variant="ghost"
onClick={() => toggleMark('strikethrough')}
className={`line-through ${isMarkActive('strikethrough') ? 'theme-accent-bg text-white' : ''}`}
title="Strike-through"
>
S
</Button>
</div>
{/* Image insertion button */}
<div className="flex items-center gap-1 border-l pl-2 ml-2">
<Button
type="button"
size="sm"
variant="ghost"
onClick={insertImage}
className="text-green-600 hover:bg-green-50"
title="Insert Image"
>
🖼
</Button>
</div>
</div>
);
};
export default function SlateEditor({
value,
onChange,
placeholder = 'Write your story here...',
error,
storyId,
enableImageProcessing = false
}: SlateEditorProps) {
const [isScrollable, setIsScrollable] = useState(true);
// Create editor with plugins
const editor = useMemo(
() => withImages(withHistory(withReact(createEditor()))),
[]
);
// Convert HTML to Slate format for initial value
const initialValue = useMemo(() => {
debug.log('🚀 Slate Editor initializing with HTML:', { htmlLength: value?.length });
return htmlToSlate(value);
}, [value]);
// Handle changes
const handleChange = useCallback((newValue: Descendant[]) => {
// Convert back to HTML and call onChange
const html = slateToHtml(newValue);
onChange(html);
debug.log('📝 Slate Editor changed:', {
htmlLength: html.length,
nodeCount: newValue.length
});
}, [onChange]);
debug.log('🎯 Slate Editor loaded!', {
valueLength: value?.length,
enableImageProcessing,
hasStoryId: !!storyId
});
return (
<div className="space-y-2">
<Slate editor={editor} initialValue={initialValue} onChange={handleChange}>
<Toolbar editor={editor} />
<div className="border theme-border rounded-b-lg overflow-hidden">
<Editable
className={`p-3 focus:outline-none focus:ring-0 resize-none ${
isScrollable
? 'h-[400px] overflow-y-auto'
: 'min-h-[300px]'
}`}
placeholder={placeholder}
renderElement={Element}
renderLeaf={Leaf}
onKeyDown={(event) => {
// Handle delete/backspace for selected content (including images)
if (event.key === 'Delete' || event.key === 'Backspace') {
const { selection } = editor;
if (!selection) return;
// If there's an expanded selection, let Slate handle it naturally
// This will delete all selected content including images
if (!Range.isCollapsed(selection)) {
// Slate will handle this automatically, including void elements
return;
}
// Handle single point deletions near images
const { anchor } = selection;
if (event.key === 'Delete') {
// Delete key - check if next node is an image
try {
const [nextNode] = Editor.next(editor, { at: anchor }) || [];
if (nextNode && SlateElement.isElement(nextNode) && nextNode.type === 'image') {
event.preventDefault();
const path = ReactEditor.findPath(editor, nextNode);
Transforms.removeNodes(editor, { at: path });
return;
}
} catch (error) {
// Ignore navigation errors at document boundaries
}
} else if (event.key === 'Backspace') {
// Backspace key - check if previous node is an image
try {
const [prevNode] = Editor.previous(editor, { at: anchor }) || [];
if (prevNode && SlateElement.isElement(prevNode) && prevNode.type === 'image') {
event.preventDefault();
const path = ReactEditor.findPath(editor, prevNode);
Transforms.removeNodes(editor, { at: path });
return;
}
} catch (error) {
// Ignore navigation errors at document boundaries
}
}
}
// Handle keyboard shortcuts
if (!event.ctrlKey && !event.metaKey) return;
switch (event.key) {
case 'b': {
event.preventDefault();
const marks = Editor.marks(editor);
const isActive = marks ? marks.bold === true : false;
if (isActive) {
Editor.removeMark(editor, 'bold');
} else {
Editor.addMark(editor, 'bold', true);
}
break;
}
case 'i': {
event.preventDefault();
const marks = Editor.marks(editor);
const isActive = marks ? marks.italic === true : false;
if (isActive) {
Editor.removeMark(editor, 'italic');
} else {
Editor.addMark(editor, 'italic', true);
}
break;
}
case 'a': {
// Handle Ctrl+A / Cmd+A to select all
event.preventDefault();
Transforms.select(editor, {
anchor: Editor.start(editor, []),
focus: Editor.end(editor, []),
});
break;
}
}
}}
/>
</div>
<div className="flex justify-between items-center">
<div className="text-xs theme-text">
<p>
<strong>Slate.js Editor:</strong> Rich text editor with advanced image paste handling.
{isScrollable ? ' Fixed height with scrolling.' : ' Auto-expanding height.'}
</p>
</div>
<Button
type="button"
size="sm"
variant="ghost"
onClick={() => setIsScrollable(!isScrollable)}
className={isScrollable ? 'theme-accent-bg text-white' : ''}
title={isScrollable ? 'Switch to auto-expand mode' : 'Switch to scrollable mode'}
>
{isScrollable ? '📜' : '📏'}
</Button>
</div>
</Slate>
{error && (
<p className="text-sm text-red-600 dark:text-red-400">{error}</p>
)}
</div>
);
}
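The slateToHtml serializer in the editor above maps each block node to a single HTML tag and escapes text only inside code blocks so markup in code survives round-tripping. A standalone sketch of that mapping (simplified: the `Block` type is a hypothetical flattened stand-in for Slate's `Descendant` nodes, and only `&`, `<`, `>` are escaped here):

```typescript
// Simplified sketch of the slateToHtml mapping above.
// Assumption: `Block` is a flattened stand-in for Slate's element nodes.
type Block = { type: string; text: string; language?: string };

function escapeHtml(s: string): string {
  return s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
}

function serialize(blocks: Block[]): string {
  return blocks.map((b) => {
    switch (b.type) {
      case "heading-one":
        return `<h1>${b.text}</h1>`;
      case "blockquote":
        return `<blockquote>${b.text}</blockquote>`;
      case "code-block": {
        const lang = b.language ? ` class="language-${b.language}"` : "";
        // Code content is escaped so markup inside it is preserved verbatim
        return `<pre><code${lang}>${escapeHtml(b.text)}</code></pre>`;
      }
      default:
        return `<p>${b.text}</p>`;
    }
  }).join("\n");
}

console.log(serialize([
  { type: "heading-one", text: "Title" },
  { type: "code-block", text: "a < b", language: "js" },
]));
```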


@@ -218,43 +218,91 @@ export const storyApi = {
hiddenGemsOnly?: boolean;
}): Promise<Story | null> => {
try {
// Use proper Solr RandomSortField with dynamic field random_* for true randomness
// Each call generates a different random seed to ensure different random results
const randomSeed = Math.floor(Math.random() * 1000000);
const searchResult = await searchApi.search({
query: filters?.searchQuery || '*:*',
page: 0,
size: 1, // Only get one result - Solr RandomSortField considers entire dataset
authors: [],
tags: filters?.tags || [],
minRating: filters?.minRating,
maxRating: filters?.maxRating,
sortBy: `random_${randomSeed}`, // Use proper dynamic field with random seed
sortDir: 'desc',
// Advanced filters - pass through all filter options
minWordCount: filters?.minWordCount,
maxWordCount: filters?.maxWordCount,
createdAfter: filters?.createdAfter,
createdBefore: filters?.createdBefore,
lastReadAfter: filters?.lastReadAfter,
lastReadBefore: filters?.lastReadBefore,
unratedOnly: filters?.unratedOnly,
readingStatus: filters?.readingStatus,
hasReadingProgress: filters?.hasReadingProgress,
hasCoverImage: filters?.hasCoverImage,
sourceDomain: filters?.sourceDomain,
seriesFilter: filters?.seriesFilter,
minTagCount: filters?.minTagCount,
popularOnly: filters?.popularOnly,
hiddenGemsOnly: filters?.hiddenGemsOnly,
});
return searchResult.results && searchResult.results.length > 0
? searchResult.results[0]
: null;
} catch (error: any) {
if (error.response?.status === 404 || error.response?.status === 204) {
// No content - no stories match filters
return null;
}
// If random sorting fails, fallback to client-side approach
console.warn('Solr random sorting failed, falling back to client-side selection:', error.message);
try {
// Fallback: get larger sample and pick randomly client-side
const fallbackResult = await searchApi.search({
query: filters?.searchQuery || '*:*',
page: 0,
size: 200, // Large enough sample for good randomness
authors: [],
tags: filters?.tags || [],
minRating: filters?.minRating,
maxRating: filters?.maxRating,
sortBy: 'createdAt',
sortDir: 'desc',
// Same advanced filters
minWordCount: filters?.minWordCount,
maxWordCount: filters?.maxWordCount,
createdAfter: filters?.createdAfter,
createdBefore: filters?.createdBefore,
lastReadAfter: filters?.lastReadAfter,
lastReadBefore: filters?.lastReadBefore,
unratedOnly: filters?.unratedOnly,
readingStatus: filters?.readingStatus,
hasReadingProgress: filters?.hasReadingProgress,
hasCoverImage: filters?.hasCoverImage,
sourceDomain: filters?.sourceDomain,
seriesFilter: filters?.seriesFilter,
minTagCount: filters?.minTagCount,
popularOnly: filters?.popularOnly,
hiddenGemsOnly: filters?.hiddenGemsOnly,
});
if (fallbackResult.results && fallbackResult.results.length > 0) {
const randomIndex = Math.floor(Math.random() * fallbackResult.results.length);
return fallbackResult.results[randomIndex];
}
return null;
} catch (fallbackError: any) {
throw fallbackError;
}
}
},
};
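The client-side fallback above boils down to one step: take whatever page of results came back and index into it uniformly. A standalone sketch of that selection step; `pickRandom` is an illustrative helper, not part of the API above:

```typescript
// Uniform random pick from a result sample, mirroring the fallback branch
// of getRandomStory. Returns null for an empty sample, matching the
// "no stories match filters" case.
function pickRandom<T>(results: T[]): T | null {
  if (results.length === 0) return null;
  const randomIndex = Math.floor(Math.random() * results.length);
  return results[randomIndex];
}

const sample = ['story-a', 'story-b', 'story-c'];
const picked = pickRandom(sample);
console.log(picked !== null && sample.includes(picked)); // true
```

A sample of 200 (as used above) keeps the pick reasonably uniform without paging through the whole index.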
@@ -296,6 +344,33 @@ export const authorApi = {
await api.delete(`/authors/${id}/avatar`);
},
searchAuthors: async (params: {
query?: string;
page?: number;
size?: number;
sortBy?: string;
sortDir?: string;
}): Promise<{
results: Author[];
totalHits: number;
page: number;
perPage: number;
query: string;
searchTimeMs: number;
}> => {
const searchParams = new URLSearchParams();
// Add query parameter
searchParams.append('q', params.query || '*');
if (params.page !== undefined) searchParams.append('page', params.page.toString());
if (params.size !== undefined) searchParams.append('size', params.size.toString());
if (params.sortBy) searchParams.append('sortBy', params.sortBy);
if (params.sortDir) searchParams.append('sortOrder', params.sortDir);
const response = await api.get(`/authors/search-typesense?${searchParams.toString()}`);
return response.data;
},
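The query string this helper builds can be sketched in isolation; note the deliberate rename of the client-side `sortDir` value to the backend's `sortOrder` parameter. `buildAuthorSearchQuery` is a hypothetical extraction for illustration, not an export of the module above:

```typescript
// Builds the same query string searchAuthors sends, with '*' as the
// default query and sortDir mapped to the backend's sortOrder name.
function buildAuthorSearchQuery(params: {
  query?: string;
  page?: number;
  size?: number;
  sortBy?: string;
  sortDir?: string;
}): string {
  const searchParams = new URLSearchParams();
  searchParams.append('q', params.query || '*');
  if (params.page !== undefined) searchParams.append('page', params.page.toString());
  if (params.size !== undefined) searchParams.append('size', params.size.toString());
  if (params.sortBy) searchParams.append('sortBy', params.sortBy);
  if (params.sortDir) searchParams.append('sortOrder', params.sortDir); // rename happens here
  return searchParams.toString();
}

console.log(buildAuthorSearchQuery({ query: 'tolkien', page: 0, size: 20, sortDir: 'asc' }));
// → q=tolkien&page=0&size=20&sortOrder=asc
```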
};
// Tag endpoints
@@ -548,6 +623,17 @@ export const configApi = {
hasErrors: boolean;
dryRun: boolean;
error?: string;
orphanedFiles?: Array<{
filePath: string;
fileName: string;
fileSize: number;
formattedSize: string;
storyId: string;
storyTitle: string | null;
storyExists: boolean;
canAccessStory: boolean;
error?: string;
}>;
}> => {
const response = await api.post('/config/cleanup/images/preview');
return response.data;
@@ -576,7 +662,7 @@ export const searchAdminApi = {
getStatus: async (): Promise<{
primaryEngine: string;
dualWrite: boolean;
solrAvailable: boolean;
}> => {
const response = await api.get('/admin/search/status');
return response.data;
@@ -600,8 +686,8 @@ export const searchAdminApi = {
},
// Switch engines
switchToSolr: async (): Promise<{ message: string }> => {
const response = await api.post('/admin/search/switch/solr');
return response.data;
},
@@ -612,8 +698,8 @@ export const searchAdminApi = {
return response.data;
},
// Solr operations
reindexSolr: async (): Promise<{
success: boolean;
message: string;
storiesCount?: number;
@@ -621,11 +707,11 @@ export const searchAdminApi = {
totalCount?: number;
error?: string;
}> => {
const response = await api.post('/admin/search/solr/reindex');
return response.data;
},
recreateSolrIndices: async (): Promise<{
success: boolean;
message: string;
storiesCount?: number;
@@ -633,7 +719,34 @@ export const searchAdminApi = {
totalCount?: number;
error?: string;
}> => {
const response = await api.post('/admin/search/solr/recreate');
return response.data;
},
// Add libraryId field to schema
addLibraryField: async (): Promise<{
success: boolean;
message: string;
error?: string;
details?: string;
note?: string;
}> => {
const response = await api.post('/admin/search/solr/add-library-field');
return response.data;
},
// Migrate to library-aware schema
migrateLibrarySchema: async (): Promise<{
success: boolean;
message: string;
storiesCount?: number;
authorsCount?: number;
totalCount?: number;
error?: string;
details?: string;
note?: string;
}> => {
const response = await api.post('/admin/search/solr/migrate-library-schema');
return response.data;
},
};

frontend/src/lib/debug.ts

@@ -0,0 +1,90 @@
/**
* Debug logging utility
* Allows conditional logging based on environment or debug flags
*/
// Check if we're in development mode or debug is explicitly enabled
const isDebugEnabled = (): boolean => {
if (typeof window === 'undefined') {
// Server-side: check NODE_ENV
return process.env.NODE_ENV === 'development' || process.env.DEBUG === 'true';
}
// Client-side: check localStorage flag or development mode
try {
return (
process.env.NODE_ENV === 'development' ||
localStorage.getItem('debug') === 'true' ||
window.location.search.includes('debug=true')
);
} catch {
return process.env.NODE_ENV === 'development';
}
};
/**
* Debug logger that only outputs in development or when debug is enabled
*/
export const debug = {
log: (...args: any[]) => {
if (isDebugEnabled()) {
console.log('[DEBUG]', ...args);
}
},
warn: (...args: any[]) => {
if (isDebugEnabled()) {
console.warn('[DEBUG]', ...args);
}
},
error: (...args: any[]) => {
if (isDebugEnabled()) {
console.error('[DEBUG]', ...args);
}
},
group: (label: string) => {
if (isDebugEnabled()) {
console.group(`[DEBUG] ${label}`);
}
},
groupEnd: () => {
if (isDebugEnabled()) {
console.groupEnd();
}
},
time: (label: string) => {
if (isDebugEnabled()) {
console.time(`[DEBUG] ${label}`);
}
},
timeEnd: (label: string) => {
if (isDebugEnabled()) {
console.timeEnd(`[DEBUG] ${label}`);
}
}
};
/**
* Enable debug mode (persists in localStorage)
*/
export const enableDebug = () => {
if (typeof window !== 'undefined') {
localStorage.setItem('debug', 'true');
console.log('Debug mode enabled. Reload page to see debug output.');
}
};
/**
* Disable debug mode
*/
export const disableDebug = () => {
if (typeof window !== 'undefined') {
localStorage.removeItem('debug');
console.log('Debug mode disabled. Reload page to hide debug output.');
}
};
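The pattern behind this module is small enough to sketch on its own: each logger method consults an enabled-check before touching the console. This standalone version replaces the environment/localStorage checks with a plain flag so it runs anywhere; the real module derives the flag from `NODE_ENV`, `localStorage`, or a `?debug=true` query parameter:

```typescript
// Gated logger sketch: messages are dropped unless debugEnabled is true.
let debugEnabled = false;

const logged: string[] = [];
const debug = {
  log: (...args: unknown[]): void => {
    if (debugEnabled) {
      logged.push(args.join(' ')); // stand-in for console.log('[DEBUG]', ...)
    }
  },
};

debug.log('dropped: debug is off');
debugEnabled = true; // what enableDebug() plus a reload achieves in the real module
debug.log('kept: debug is on');

console.log(logged.length); // 1
```

Because the check runs on every call, flipping the flag (or the localStorage key) takes effect without re-wiring any call sites.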

View File

@@ -0,0 +1,32 @@
/**
* Progress tracking utilities for bulk operations
*/
export interface ProgressUpdate {
type: 'progress' | 'completed' | 'error';
current: number;
total: number;
message: string;
url?: string;
title?: string;
author?: string;
wordCount?: number;
totalWordCount?: number;
error?: string;
combinedStory?: any;
results?: any[];
summary?: any;
hasImages?: boolean;
imageWarnings?: string[];
}
// Global progress storage (in production, use Redis or database)
export const progressStore = new Map<string, ProgressUpdate[]>();
// Helper function for other routes to send progress updates
export function sendProgressUpdate(sessionId: string, update: ProgressUpdate) {
if (!progressStore.has(sessionId)) {
progressStore.set(sessionId, []);
}
progressStore.get(sessionId)!.push(update);
}
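A minimal sketch of how a producer (e.g. a bulk-import loop) and a polling consumer would share this store; the types and helper are re-declared in reduced form so the snippet is self-contained:

```typescript
// Reduced copy of the progress-store pattern above: producers append
// updates keyed by session, consumers read the accumulated list.
interface ProgressUpdate {
  type: 'progress' | 'completed' | 'error';
  current: number;
  total: number;
  message: string;
}

const progressStore = new Map<string, ProgressUpdate[]>();

function sendProgressUpdate(sessionId: string, update: ProgressUpdate): void {
  if (!progressStore.has(sessionId)) {
    progressStore.set(sessionId, []);
  }
  progressStore.get(sessionId)!.push(update);
}

// Producer side: push updates as work progresses.
sendProgressUpdate('session-1', { type: 'progress', current: 1, total: 2, message: 'Importing story 1' });
sendProgressUpdate('session-1', { type: 'completed', current: 2, total: 2, message: 'Done' });

// Consumer side: a status endpoint would read and return this list.
const updates = progressStore.get('session-1') ?? [];
console.log(updates.length); // 2
console.log(updates[updates.length - 1].type); // completed
```

As the comment in the module notes, a plain `Map` only works within one process; under multiple workers the same interface would need Redis or a database behind it.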

View File

@@ -1,5 +1,6 @@
import DOMPurify from 'dompurify';
import { configApi } from './api';
import { debug } from './debug';
interface SanitizationConfig {
allowedTags: string[];
@@ -28,7 +29,7 @@ function filterCssProperties(styleValue: string, allowedProperties: string[]): s
const isAllowed = allowedProperties.includes(property);
if (!isAllowed) {
debug.log(`CSS property "${property}" was filtered out (not in allowed list)`);
}
return isAllowed;
@@ -37,9 +38,9 @@ function filterCssProperties(styleValue: string, allowedProperties: string[]): s
const result = filteredDeclarations.join('; ');
if (declarations.length !== filteredDeclarations.length) {
debug.log(`CSS filtering: ${declarations.length} -> ${filteredDeclarations.length} properties`);
debug.log('Original:', styleValue);
debug.log('Filtered:', result);
}
return result;
return result; return result;
@@ -152,7 +153,8 @@ function createDOMPurifyConfig(config: SanitizationConfig) {
const domPurifyConfig: DOMPurify.Config = {
ALLOWED_TAGS: allowedTags,
ALLOWED_ATTR: uniqueAttributes,
// More permissive URL regex to allow complex query strings and tokens
ALLOWED_URI_REGEXP: /^(?:(?:https?|data|#|\/):)?[\s\S]*$/i,
ALLOW_UNKNOWN_PROTOCOLS: false,
SANITIZE_DOM: true,
KEEP_CONTENT: true,
@@ -179,6 +181,75 @@ function createDOMPurifyConfig(config: SanitizationConfig) {
return domPurifyConfig;
}
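The relaxed `ALLOWED_URI_REGEXP` deserves a closer look: because the scheme group is optional, the pattern on its own accepts nearly any string, including `javascript:` URIs, so which URLs ultimately survive still depends on the rest of the DOMPurify configuration. A quick comparison against the previous pattern:

```typescript
// Old pattern rejected unescaped characters such as '~' and spaces in
// query strings; the new one matches essentially any string.
const oldUriRegexp = /^(?:(?:https?|#|\/):?\/?)[\w.\-#/?=&%]+$/i;
const newUriRegexp = /^(?:(?:https?|data|#|\/):)?[\s\S]*$/i;

const tokenUrl = 'https://cdn.example.com/img.png?token=a~b c';
console.log(oldUriRegexp.test(tokenUrl)); // false: '~' and the space are rejected
console.log(newUriRegexp.test(tokenUrl)); // true
console.log(newUriRegexp.test('javascript:alert(1)')); // true: the regex alone does not block this
```

`cdn.example.com` is a placeholder host; the point is that embedded-image URLs with signed tokens no longer fail the URI check.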
/**
* Preprocess HTML to extract images from figure tags before sanitization
*/
function preprocessFigureTags(html: string): string {
if (!html || html.trim() === '') {
return html;
}
try {
const parser = new DOMParser();
const doc = parser.parseFromString(html, 'text/html');
const figures = doc.querySelectorAll('figure');
figures.forEach((figure) => {
// Find img tags anywhere within the figure (deep search)
const images = figure.querySelectorAll('img');
if (images.length > 0) {
// Extract the first image
const img = images[0];
// Get the src attribute - it might be in the src attribute or data-src
const imgSrc = img.getAttribute('src') || img.getAttribute('data-src') || img.src || '';
if (!imgSrc || imgSrc.trim() === '') {
figure.remove();
return;
}
// Create a clean img element with just the essential attributes
const cleanImg = doc.createElement('img');
cleanImg.setAttribute('src', imgSrc);
// Preserve alt text
const existingAlt = img.getAttribute('alt') || img.alt;
if (existingAlt) {
cleanImg.setAttribute('alt', existingAlt);
} else {
// Check if there's a figcaption to use as alt text
const figcaption = figure.querySelector('figcaption');
if (figcaption) {
const captionText = figcaption.textContent?.trim();
if (captionText) {
cleanImg.setAttribute('alt', captionText);
}
}
}
// Preserve other useful attributes if they exist
const width = img.getAttribute('width') || img.width;
const height = img.getAttribute('height') || img.height;
if (width) cleanImg.setAttribute('width', width.toString());
if (height) cleanImg.setAttribute('height', height.toString());
// Replace the figure element with just the clean img
figure.replaceWith(cleanImg);
} else {
// No images in figure, remove it entirely
figure.remove();
}
});
return doc.body.innerHTML;
} catch (error) {
console.warn('Failed to preprocess figure tags, returning original HTML:', error);
return html;
}
}
/**
 * Sanitize HTML content using shared configuration from backend
 */
@@ -188,11 +259,14 @@ export async function sanitizeHtml(html: string): Promise<string> {
}
try {
// Preprocess to extract images from figure tags
const preprocessed = preprocessFigureTags(html);
const config = await fetchSanitizationConfig();
const domPurifyConfig = createDOMPurifyConfig(config);
// Configure DOMPurify with our settings
const cleanHtml = DOMPurify.sanitize(preprocessed, domPurifyConfig as any);
return cleanHtml.toString();
} catch (error) {
@@ -211,15 +285,18 @@ export function sanitizeHtmlSync(html: string): string {
return '';
}
// Preprocess to extract images from figure tags
const preprocessed = preprocessFigureTags(html);
// If we have cached config, use it
if (cachedConfig) {
const domPurifyConfig = createDOMPurifyConfig(cachedConfig);
return DOMPurify.sanitize(preprocessed, domPurifyConfig as any).toString();
}
// If we don't have cached config but there's an ongoing request, wait for it
if (configPromise) {
debug.log('Sanitization config loading in progress, using fallback for now');
} else {
// No config and no ongoing request - try to load it for next time
console.warn('No cached sanitization config available, triggering load for future use');
@@ -229,7 +306,7 @@ export function sanitizeHtmlSync(html: string): string {
}
// Use comprehensive fallback configuration that preserves formatting // Use comprehensive fallback configuration that preserves formatting
console.log('Using fallback sanitization configuration with formatting support'); debug.log('Using fallback sanitization configuration with formatting support');
const fallbackAllowedCssProperties = [ const fallbackAllowedCssProperties = [
'color', 'font-size', 'font-weight', 'color', 'font-size', 'font-weight',
'font-style', 'text-align', 'text-decoration', 'margin', 'font-style', 'text-align', 'text-decoration', 'margin',
@@ -246,8 +323,10 @@ export function sanitizeHtmlSync(html: string): string {
'blockquote', 'cite', 'q', 'hr', 'details', 'summary'
],
ALLOWED_ATTR: [
'class', 'style', 'colspan', 'rowspan', 'src', 'alt', 'width', 'height', 'href', 'title'
],
// More permissive URL regex to allow complex query strings and tokens
ALLOWED_URI_REGEXP: /^(?:(?:https?|data|#|\/):)?[\s\S]*$/i,
ALLOW_UNKNOWN_PROTOCOLS: false,
SANITIZE_DOM: true,
KEEP_CONTENT: true,
@@ -270,7 +349,7 @@ export function sanitizeHtmlSync(html: string): string {
}
});
return DOMPurify.sanitize(preprocessed, fallbackConfig as any).toString();
}
/**


@@ -129,8 +129,7 @@ export async function cleanHtml(html: string): Promise<string> {
const cheerio = await import('cheerio');
const $ = cheerio.load(html, {
// Preserve self-closing tags like <br>
xmlMode: false
});
// Remove dangerous elements


@@ -182,7 +182,7 @@ export function extractLinkText(
$: cheerio.CheerioAPI,
config: LinkTextStrategy
): string {
let searchScope: any;
if (config.searchWithin) {
searchScope = $(config.searchWithin);
@@ -196,7 +196,7 @@ export function extractLinkText(
config.nearText.forEach(text => {
if (foundText) return; // Already found
searchScope.find('*').each((_: any, elem: any) => {
const $elem = $(elem);
const elemText = $elem.text().toLowerCase();


@@ -0,0 +1,246 @@
/**
* Utility for tracking image processing progress
*
* Usage example:
*
* // After saving a story, start polling for progress
* const progressTracker = new ImageProcessingProgressTracker(storyId);
*
* progressTracker.onProgress((progress) => {
* console.log(`Processing ${progress.processedImages}/${progress.totalImages} images`);
* console.log(`Current: ${progress.currentImageUrl}`);
* console.log(`Status: ${progress.status}`);
* });
*
* progressTracker.onComplete((finalProgress) => {
* console.log('Image processing completed!');
* });
*
* progressTracker.onError((error) => {
* console.error('Image processing failed:', error);
* });
*
* progressTracker.start();
*/
export interface ImageProcessingProgress {
isProcessing: boolean;
totalImages: number;
processedImages: number;
currentImageUrl: string;
status: string;
progressPercentage: number;
completed: boolean;
error: string;
message?: string;
}
export type ProgressCallback = (progress: ImageProcessingProgress) => void;
export type CompleteCallback = (finalProgress: ImageProcessingProgress) => void;
export type ErrorCallback = (error: string) => void;
export class ImageProcessingProgressTracker {
private storyId: string;
private pollInterval: number;
private timeoutMs: number;
private isPolling: boolean = false;
private pollTimer: NodeJS.Timeout | null = null;
private startTime: number = 0;
private progressCallbacks: ProgressCallback[] = [];
private completeCallbacks: CompleteCallback[] = [];
private errorCallbacks: ErrorCallback[] = [];
constructor(
storyId: string,
pollInterval: number = 1000, // Poll every 1 second
timeoutMs: number = 300000 // 5 minute timeout
) {
this.storyId = storyId;
this.pollInterval = pollInterval;
this.timeoutMs = timeoutMs;
}
public onProgress(callback: ProgressCallback): void {
this.progressCallbacks.push(callback);
}
public onComplete(callback: CompleteCallback): void {
this.completeCallbacks.push(callback);
}
public onError(callback: ErrorCallback): void {
this.errorCallbacks.push(callback);
}
public async start(): Promise<void> {
if (this.isPolling) {
console.warn('Progress tracking already started');
return;
}
this.isPolling = true;
this.startTime = Date.now();
console.log(`Starting image processing progress tracking for story ${this.storyId}`);
this.poll();
}
public stop(): void {
this.isPolling = false;
if (this.pollTimer) {
clearTimeout(this.pollTimer);
this.pollTimer = null;
}
console.log(`Stopped progress tracking for story ${this.storyId}`);
}
private async poll(): Promise<void> {
if (!this.isPolling) {
return;
}
// Check for timeout
const elapsed = Date.now() - this.startTime;
if (elapsed > this.timeoutMs) {
this.handleError('Image processing timed out');
return;
}
try {
const response = await fetch(`/api/stories/${this.storyId}/image-processing-progress`);
if (!response.ok) {
throw new Error(`HTTP ${response.status}: ${response.statusText}`);
}
const progress: ImageProcessingProgress = await response.json();
// Call progress callbacks
this.progressCallbacks.forEach(callback => {
try {
callback(progress);
} catch (error) {
console.error('Error in progress callback:', error);
}
});
// Check if processing is complete
if (progress.completed) {
this.handleComplete(progress);
return;
}
// Check for errors
if (progress.error) {
this.handleError(progress.error);
return;
}
// Continue polling if still processing
if (progress.isProcessing) {
this.pollTimer = setTimeout(() => this.poll(), this.pollInterval);
} else {
// No active processing - might have finished or never started
this.handleComplete(progress);
}
} catch (error) {
this.handleError(`Failed to fetch progress: ${error instanceof Error ? error.message : 'Unknown error'}`);
}
}
private handleComplete(finalProgress: ImageProcessingProgress): void {
this.stop();
console.log(`Image processing completed for story ${this.storyId}`);
this.completeCallbacks.forEach(callback => {
try {
callback(finalProgress);
} catch (error) {
console.error('Error in complete callback:', error);
}
});
}
private handleError(error: string): void {
this.stop();
console.error(`Image processing error for story ${this.storyId}:`, error);
this.errorCallbacks.forEach(callback => {
try {
callback(error);
} catch (callbackError) {
// Renamed to avoid shadowing the outer error string
console.error('Error in error callback:', callbackError);
}
});
}
}
/**
 * React hook for image processing progress
 *
 * Usage:
 * const { progress, isTracking, startTracking } = useImageProcessingProgress(storyId);
 */
import React from 'react';
export function useImageProcessingProgress(storyId: string) {
const [progress, setProgress] = React.useState<ImageProcessingProgress | null>(null);
const [isTracking, setIsTracking] = React.useState(false);
const [tracker, setTracker] = React.useState<ImageProcessingProgressTracker | null>(null);
const startTracking = React.useCallback(() => {
if (tracker) {
tracker.stop();
}
const newTracker = new ImageProcessingProgressTracker(storyId);
newTracker.onProgress((progress) => {
setProgress(progress);
});
newTracker.onComplete((finalProgress) => {
setProgress(finalProgress);
setIsTracking(false);
});
newTracker.onError((error) => {
console.error('Image processing error:', error);
setIsTracking(false);
});
setTracker(newTracker);
setIsTracking(true);
newTracker.start();
}, [storyId, tracker]);
const stopTracking = React.useCallback(() => {
if (tracker) {
tracker.stop();
setIsTracking(false);
}
}, [tracker]);
React.useEffect(() => {
return () => {
if (tracker) {
tracker.stop();
}
};
}, [tracker]);
return {
progress,
isTracking,
startTracking,
stopTracking
};
}

File diff suppressed because one or more lines are too long


@@ -13,7 +13,7 @@ http {
server {
listen 80;
client_max_body_size 600M;
# Frontend routes
location / {
@@ -55,6 +55,10 @@ http {
proxy_connect_timeout 900s;
proxy_send_timeout 900s;
proxy_read_timeout 900s;
# Large upload settings
client_max_body_size 600M;
proxy_request_buffering off;
proxy_max_temp_file_size 0;
}
# Static image serving

opensearch.Dockerfile

@@ -0,0 +1,94 @@
# Custom OpenSearch Dockerfile with Java 21 for compatibility
FROM amazoncorretto:21-alpine AS java-base
# Download and extract OpenSearch
FROM java-base AS opensearch-builder
WORKDIR /tmp
RUN apk add --no-cache curl tar && \
curl -L https://artifacts.opensearch.org/releases/bundle/opensearch/3.2.0/opensearch-3.2.0-linux-x64.tar.gz | \
tar -xz && \
mv opensearch-3.2.0 /usr/share/opensearch
# Final runtime image
FROM java-base
WORKDIR /usr/share/opensearch
# Create opensearch user
RUN addgroup -g 1000 opensearch && \
adduser -u 1000 -G opensearch -s /bin/sh -D opensearch
# Copy OpenSearch from builder stage
COPY --from=opensearch-builder --chown=opensearch:opensearch /usr/share/opensearch /usr/share/opensearch
# Install necessary packages
RUN apk add --no-cache bash curl
# Debug: Check Java installation and set correct paths
RUN which java && java -version && \
ls -la /usr/lib/jvm/ && \
ln -sf /usr/lib/jvm/java-21-amazon-corretto /usr/lib/jvm/default-jvm
# Set environment variables
ENV JAVA_HOME=/usr/lib/jvm/java-21-amazon-corretto
ENV OPENSEARCH_JAVA_HOME=/usr/lib/jvm/java-21-amazon-corretto
ENV PATH=$PATH:$JAVA_HOME/bin
# Create required directories and disable security plugin
RUN mkdir -p /usr/share/opensearch/data && \
mkdir -p /usr/share/opensearch/logs && \
echo "plugins.security.disabled: true" >> /usr/share/opensearch/config/opensearch.yml && \
echo "discovery.type: single-node" >> /usr/share/opensearch/config/opensearch.yml && \
echo "cluster.name: storycove-opensearch" >> /usr/share/opensearch/config/opensearch.yml && \
echo "node.name: opensearch-node" >> /usr/share/opensearch/config/opensearch.yml && \
echo "bootstrap.memory_lock: false" >> /usr/share/opensearch/config/opensearch.yml && \
echo "network.host: 0.0.0.0" >> /usr/share/opensearch/config/opensearch.yml && \
echo "logger.level: DEBUG" >> /usr/share/opensearch/config/opensearch.yml && \
echo "node.processors: 1" >> /usr/share/opensearch/config/opensearch.yml && \
rm -rf /usr/share/opensearch/plugins/opensearch-performance-analyzer && \
rm -rf /usr/share/opensearch/agent && \
echo "# Custom JVM options for Synology NAS compatibility" > /usr/share/opensearch/config/jvm.options.d/synology.options && \
echo "-Dlucene.useVectorAPI=false" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
echo "-Dorg.apache.lucene.store.MMapDirectory.enableMemorySegments=false" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
echo "-Djava.awt.headless=true" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
echo "-XX:+UseContainerSupport" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
echo "-Dorg.opensearch.bootstrap.start_timeout=300s" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
echo "-Dopensearch.logger.level=INFO" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
echo "--add-opens=jdk.unsupported/sun.misc=ALL-UNNAMED" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
echo "--add-opens=java.base/java.util=ALL-UNNAMED" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
echo "--add-opens=java.base/java.lang=ALL-UNNAMED" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
echo "--add-modules=jdk.unsupported" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
echo "-XX:+UnlockExperimentalVMOptions" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
echo "-XX:-UseVectorApi" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
echo "-Djdk.incubator.vector.VECTOR_ACCESS_OOB_CHECK=0" >> /usr/share/opensearch/config/jvm.options.d/synology.options && \
sed -i '/javaagent/d' /usr/share/opensearch/config/jvm.options && \
echo '#!/bin/bash' > /usr/share/opensearch/start-opensearch.sh && \
echo 'echo "Starting OpenSearch with Java 21..."' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'echo "Java version:"' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'java -version' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'echo "Memory info:"' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'free -h 2>/dev/null || echo "Memory info not available"' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'echo "Starting OpenSearch process..."' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'echo "Architecture info:"' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'uname -a' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'echo "CPU info:"' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'grep -E "^(processor|model name|flags)" /proc/cpuinfo | head -10 || echo "CPU info not available"' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'echo "Using JVM options file: /usr/share/opensearch/config/jvm.options.d/synology.options"' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'cat /usr/share/opensearch/config/jvm.options.d/synology.options' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'echo "Environment OPENSEARCH_JAVA_OPTS: $OPENSEARCH_JAVA_OPTS"' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'echo "Attempting to force disable vector operations..."' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'export OPENSEARCH_JAVA_OPTS="$OPENSEARCH_JAVA_OPTS -Dlucene.useVectorAPI=false -Dorg.apache.lucene.store.MMapDirectory.enableMemorySegments=false --limit-modules=java.base,java.logging,java.xml,java.management,java.naming,java.desktop,java.security.jgss,jdk.unsupported"' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'echo "Final OPENSEARCH_JAVA_OPTS: $OPENSEARCH_JAVA_OPTS"' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'echo "Starting OpenSearch binary..."' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'timeout 300s /usr/share/opensearch/bin/opensearch &' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'OPENSEARCH_PID=$!' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'echo "OpenSearch started with PID: $OPENSEARCH_PID"' >> /usr/share/opensearch/start-opensearch.sh && \
echo 'wait $OPENSEARCH_PID' >> /usr/share/opensearch/start-opensearch.sh && \
chmod +x /usr/share/opensearch/start-opensearch.sh && \
chown -R opensearch:opensearch /usr/share/opensearch
USER opensearch
EXPOSE 9200 9300
# Use startup script for better debugging
ENTRYPOINT ["/usr/share/opensearch/start-opensearch.sh"]

package-lock.json (generated): diff suppressed because it is too large

solr.Dockerfile

@@ -0,0 +1,31 @@
FROM solr:9.9.0
# Switch to root to set up configuration
USER root
# Copy Solr configurations into the image
COPY ./solr/stories /opt/solr-9.9.0/server/solr/configsets/storycove_stories
COPY ./solr/authors /opt/solr-9.9.0/server/solr/configsets/storycove_authors
# Create initialization script using the precreate-core pattern
COPY <<EOF /docker-entrypoint-initdb.d/init-cores.sh
#!/bin/bash
echo "StoryCove: Initializing cores..."
# Use solr's built-in precreate-core functionality
precreate-core storycove_stories /opt/solr-9.9.0/server/solr/configsets/storycove_stories
precreate-core storycove_authors /opt/solr-9.9.0/server/solr/configsets/storycove_authors
echo "StoryCove: Core initialization complete!"
EOF
# Ensure proper permissions and make script executable
RUN chown -R solr:solr /opt/solr-9.9.0/server/solr/configsets/ && \
chmod +x /docker-entrypoint-initdb.d/init-cores.sh && \
chown solr:solr /docker-entrypoint-initdb.d/init-cores.sh
# Switch back to solr user
USER solr
# Use the default Solr entrypoint
CMD ["solr-foreground"]

solr/authors/conf/managed-schema

@@ -0,0 +1,104 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!--
Solr Schema for StoryCove Authors Core
Based on AuthorSearchDto data model
-->
<schema name="storycove-authors" version="1.6">
<!-- Field Types -->
<!-- String field type for exact matching -->
<fieldType name="string" class="solr.StrField" sortMissingLast="true" />
<!-- Text field type for full-text search -->
<fieldType name="text_general" class="solr.TextField" positionIncrementGap="100">
<analyzer type="index">
<tokenizer class="solr.StandardTokenizerFactory"/>
<filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" />
<filter class="solr.LowerCaseFilterFactory"/>
</analyzer>
<analyzer type="query">
<tokenizer class="solr.StandardTokenizerFactory"/>
<filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" />
<filter class="solr.SynonymGraphFilterFactory" synonyms="synonyms.txt" ignoreCase="true" expand="true"/>
<filter class="solr.LowerCaseFilterFactory"/>
</analyzer>
</fieldType>
<!-- Enhanced text field for names -->
<fieldType name="text_enhanced" class="solr.TextField" positionIncrementGap="100">
<analyzer type="index">
<tokenizer class="solr.StandardTokenizerFactory"/>
<filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" />
<filter class="solr.LowerCaseFilterFactory"/>
<filter class="solr.EdgeNGramFilterFactory" minGramSize="2" maxGramSize="15"/>
</analyzer>
<analyzer type="query">
<tokenizer class="solr.StandardTokenizerFactory"/>
<filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" />
<filter class="solr.LowerCaseFilterFactory"/>
</analyzer>
</fieldType>
<!-- Integer field type -->
<fieldType name="pint" class="solr.IntPointField" docValues="true"/>
<!-- Long field type -->
<fieldType name="plong" class="solr.LongPointField" docValues="true"/>
<!-- Double field type -->
<fieldType name="pdouble" class="solr.DoublePointField" docValues="true"/>
<!-- Date field type -->
<fieldType name="pdate" class="solr.DatePointField" docValues="true"/>
<!-- Multi-valued string for URLs -->
<fieldType name="strings" class="solr.StrField" sortMissingLast="true" multiValued="true"/>
<!-- Random sort field type for random ordering -->
<fieldType name="random" class="solr.RandomSortField" indexed="true"/>
<!-- Fields -->
<!-- Required Fields -->
<field name="id" type="string" indexed="true" stored="true" required="true" multiValued="false" />
<field name="_version_" type="plong" indexed="false" stored="false"/>
<!-- Core Author Fields -->
<field name="name" type="text_enhanced" indexed="true" stored="true" required="true"/>
<field name="notes" type="text_general" indexed="true" stored="true"/>
<field name="authorRating" type="pint" indexed="true" stored="true"/>
<field name="averageStoryRating" type="pdouble" indexed="true" stored="true"/>
<field name="storyCount" type="pint" indexed="true" stored="true"/>
<field name="urls" type="strings" indexed="true" stored="true"/>
<field name="avatarImagePath" type="string" indexed="false" stored="true"/>
<!-- Multi-tenant Library Separation -->
<field name="libraryId" type="string" indexed="true" stored="true" required="false" default="default"/>
<!-- Timestamp Fields -->
<field name="createdAt" type="pdate" indexed="true" stored="true"/>
<field name="updatedAt" type="pdate" indexed="true" stored="true"/>
<!-- Search-specific Fields -->
<field name="searchScore" type="pdouble" indexed="false" stored="true"/>
<!-- Combined search field for general queries -->
<field name="text" type="text_general" indexed="true" stored="false" multiValued="true"/>
<!-- Copy Fields for comprehensive search -->
<copyField source="name" dest="text"/>
<copyField source="notes" dest="text"/>
<copyField source="urls" dest="text"/>
<!-- Default Search Field -->
<!-- Dynamic Fields -->
<!-- Random sort dynamic field for generating random orderings -->
<dynamicField name="random_*" type="random" indexed="true" stored="false"/>
<!-- UniqueKey -->
<uniqueKey>id</uniqueKey>
</schema>
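The `text_enhanced` type above is deliberately asymmetric: `EdgeNGramFilterFactory` runs only in the index-time analyzer, so every name is stored as its leading substrings (2 to 15 characters), while a query term is left whole and matches any of those prefixes. A minimal sketch simulating the index-time grams (the field name "Hemingway" is a hypothetical example):

```python
def edge_ngrams(token: str, min_size: int = 2, max_size: int = 15) -> list:
    """Simulate solr.EdgeNGramFilterFactory: leading substrings of a lowercased token."""
    token = token.lower()
    return [token[:n] for n in range(min_size, min(len(token), max_size) + 1)]

# Index-time grams for "Hemingway"; a query analyzer without the filter emits
# just "hem", which matches the third gram below — hence prefix search works.
print(edge_ngrams("Hemingway"))
```

Applying the filter at query time as well would make short queries match far too broadly, which is why the query analyzer omits it.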

solr/authors/conf/solrconfig.xml
@@ -0,0 +1,134 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!--
Solr Configuration for StoryCove Authors Core
Optimized for author search with highlighting and faceting
-->
<config>
<luceneMatchVersion>9.9.0</luceneMatchVersion>
<!-- DataDir configuration -->
<dataDir>${solr.data.dir:}</dataDir>
<!-- Directory Factory -->
<directoryFactory name="DirectoryFactory"
class="${solr.directoryFactory:solr.NRTCachingDirectoryFactory}"/>
<!-- CodecFactory -->
<codecFactory class="solr.SchemaCodecFactory"/>
<!-- Index Configuration -->
<indexConfig>
<lockType>${solr.lock.type:native}</lockType>
<infoStream>true</infoStream>
</indexConfig>
<!-- JMX Configuration -->
<jmx />
<!-- Update Handler -->
<updateHandler class="solr.DirectUpdateHandler2">
<updateLog>
<str name="dir">${solr.ulog.dir:}</str>
<int name="numVersionBuckets">${solr.ulog.numVersionBuckets:65536}</int>
</updateLog>
<autoCommit>
<maxTime>15000</maxTime>
<openSearcher>false</openSearcher>
</autoCommit>
<autoSoftCommit>
<maxTime>1000</maxTime>
</autoSoftCommit>
</updateHandler>
<!-- Query Configuration -->
<query>
<maxBooleanClauses>1024</maxBooleanClauses>
<filterCache class="solr.CaffeineCache"
size="512"
initialSize="512"
autowarmCount="0"/>
<queryResultCache class="solr.CaffeineCache"
size="512"
initialSize="512"
autowarmCount="0"/>
<documentCache class="solr.CaffeineCache"
size="512"
initialSize="512"
autowarmCount="0"/>
<enableLazyFieldLoading>true</enableLazyFieldLoading>
</query>
<!-- Request Dispatcher -->
<requestDispatcher handleSelect="false" >
<requestParsers enableRemoteStreaming="true"
multipartUploadLimitInKB="2048000"
formdataUploadLimitInKB="2048"
addHttpRequestToContext="false"/>
<httpCaching never304="true" />
</requestDispatcher>
<!-- Request Handlers -->
<!-- Standard Select Handler -->
<requestHandler name="/select" class="solr.SearchHandler">
<lst name="defaults">
<str name="echoParams">explicit</str>
<int name="rows">10</int>
<str name="df">text</str>
<str name="wt">json</str>
<str name="indent">true</str>
<str name="hl">true</str>
<str name="hl.fl">name,notes</str>
<str name="hl.simple.pre">&lt;em&gt;</str>
<str name="hl.simple.post">&lt;/em&gt;</str>
<str name="hl.fragsize">150</str>
<str name="hl.maxAnalyzedChars">51200</str>
</lst>
</requestHandler>
<!-- Update Handler -->
<requestHandler name="/update" class="solr.UpdateRequestHandler" />
<!-- Admin Handlers -->
<requestHandler name="/admin/ping" class="solr.PingRequestHandler">
<lst name="invariants">
<str name="q">*:*</str>
</lst>
<lst name="defaults">
<str name="echoParams">all</str>
</lst>
</requestHandler>
<!-- Suggester Handler -->
<requestHandler name="/suggest" class="solr.SearchHandler" startup="lazy">
<lst name="defaults">
<str name="suggest">true</str>
<str name="suggest.count">10</str>
</lst>
<arr name="components">
<str>suggest</str>
</arr>
</requestHandler>
<!-- Search Components -->
<searchComponent name="suggest" class="solr.SuggestComponent">
<lst name="suggester">
<str name="name">authorSuggester</str>
<str name="lookupImpl">AnalyzingInfixLookupFactory</str>
<str name="dictionaryImpl">DocumentDictionaryFactory</str>
<str name="field">name</str>
<str name="weightField">storyCount</str>
<str name="suggestAnalyzerFieldType">text_general</str>
<str name="buildOnStartup">false</str>
<str name="buildOnCommit">false</str>
</lst>
</searchComponent>
<!-- Response Writers -->
<queryResponseWriter name="json" class="solr.JSONResponseWriter">
<str name="content-type">application/json; charset=UTF-8</str>
</queryResponseWriter>
</config>
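Because the `/select` handler's defaults already set `df=text`, JSON output, and `<em>` highlighting on `name` and `notes`, a client only needs to supply `q` and any overrides. A minimal URL-building sketch, assuming the default host/port and the core name from the Dockerfile:

```python
from urllib.parse import urlencode

def author_search_url(query: str, rows: int = 10,
                      base: str = "http://localhost:8983/solr") -> str:
    """Build a /select URL for the storycove_authors core; the handler's
    defaults supply df=text, wt=json, and highlighting on name/notes."""
    params = {"q": query, "rows": rows}
    return f"{base}/storycove_authors/select?{urlencode(params)}"

print(author_search_url("hemingway"))
```

Per-request parameters such as `hl.fl` can be appended to override the handler defaults when needed.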

solr/authors/conf/stopwords.txt
@@ -0,0 +1,34 @@
# English stopwords for author search
a
an
and
are
as
at
be
but
by
for
if
in
into
is
it
no
not
of
on
or
such
that
the
their
then
there
these
they
this
to
was
will
with

solr/authors/conf/synonyms.txt
@@ -0,0 +1,9 @@
# Synonyms for author search
# Comma-separated terms on one line are treated as equivalent (expanded both
# ways at query time); an explicit mapping would use: word1,word2 => replacement
writer,author,novelist
pen name,pseudonym,alias
prolific,productive
acclaimed,famous,renowned
bestselling,popular
contemporary,modern
classic,traditional

solr/stories/conf/managed-schema
@@ -0,0 +1,143 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!--
Solr Schema for StoryCove Stories Core
Based on StorySearchDto data model
-->
<schema name="storycove-stories" version="1.6">
<!-- Field Types -->
<!-- String field type for exact matching -->
<fieldType name="string" class="solr.StrField" sortMissingLast="true" />
<!-- Text field type for full-text search -->
<fieldType name="text_general" class="solr.TextField" positionIncrementGap="100">
<analyzer type="index">
<tokenizer class="solr.StandardTokenizerFactory"/>
<filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" />
<filter class="solr.LowerCaseFilterFactory"/>
</analyzer>
<analyzer type="query">
<tokenizer class="solr.StandardTokenizerFactory"/>
<filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" />
<filter class="solr.SynonymGraphFilterFactory" synonyms="synonyms.txt" ignoreCase="true" expand="true"/>
<filter class="solr.LowerCaseFilterFactory"/>
</analyzer>
</fieldType>
<!-- Enhanced text field for titles and important content -->
<fieldType name="text_enhanced" class="solr.TextField" positionIncrementGap="100">
<analyzer type="index">
<tokenizer class="solr.StandardTokenizerFactory"/>
<filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" />
<filter class="solr.LowerCaseFilterFactory"/>
<filter class="solr.EdgeNGramFilterFactory" minGramSize="2" maxGramSize="15"/>
</analyzer>
<analyzer type="query">
<tokenizer class="solr.StandardTokenizerFactory"/>
<filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt" />
<filter class="solr.LowerCaseFilterFactory"/>
</analyzer>
</fieldType>
<!-- Integer field type -->
<fieldType name="pint" class="solr.IntPointField" docValues="true"/>
<!-- Long field type -->
<fieldType name="plong" class="solr.LongPointField" docValues="true"/>
<!-- Double field type -->
<fieldType name="pdouble" class="solr.DoublePointField" docValues="true"/>
<!-- Boolean field type -->
<fieldType name="boolean" class="solr.BoolField" sortMissingLast="true"/>
<!-- Date field type -->
<fieldType name="pdate" class="solr.DatePointField" docValues="true"/>
<!-- Multi-valued string for tags and faceting -->
<fieldType name="strings" class="solr.StrField" sortMissingLast="true" multiValued="true" docValues="true"/>
<!-- Single string for exact matching and faceting -->
<fieldType name="string_facet" class="solr.StrField" sortMissingLast="true" docValues="true"/>
<!-- Random sort field type for random ordering -->
<fieldType name="random" class="solr.RandomSortField" indexed="true"/>
<!-- Fields -->
<!-- Required Fields -->
<field name="id" type="string" indexed="true" stored="true" required="true" multiValued="false" />
<field name="_version_" type="plong" indexed="false" stored="false"/>
<!-- Core Story Fields -->
<field name="title" type="text_enhanced" indexed="true" stored="true" required="true"/>
<field name="description" type="text_general" indexed="true" stored="true"/>
<field name="sourceUrl" type="string" indexed="true" stored="true"/>
<field name="coverPath" type="string" indexed="false" stored="true"/>
<field name="wordCount" type="pint" indexed="true" stored="true"/>
<field name="rating" type="pint" indexed="true" stored="true"/>
<field name="averageStoryRating" type="pdouble" indexed="true" stored="true"/>
<field name="volume" type="pint" indexed="true" stored="true"/>
<!-- Multi-tenant Library Separation -->
<field name="libraryId" type="string" indexed="true" stored="true" required="false" default="default"/>
<!-- Reading Status Fields -->
<field name="isRead" type="boolean" indexed="true" stored="true"/>
<field name="readingPosition" type="pint" indexed="true" stored="true"/>
<field name="lastReadAt" type="pdate" indexed="true" stored="true"/>
<field name="lastRead" type="pdate" indexed="true" stored="true"/>
<!-- Author Fields -->
<field name="authorId" type="string" indexed="true" stored="true"/>
<field name="authorName" type="text_enhanced" indexed="true" stored="true"/>
<field name="authorName_facet" type="string_facet" indexed="true" stored="false"/>
<!-- Series Fields -->
<field name="seriesId" type="string" indexed="true" stored="true"/>
<field name="seriesName" type="text_enhanced" indexed="true" stored="true"/>
<field name="seriesName_facet" type="string_facet" indexed="true" stored="false"/>
<!-- Tag Fields -->
<field name="tagNames" type="strings" indexed="true" stored="true"/>
<field name="tagNames_facet" type="strings" indexed="true" stored="false"/>
<!-- Timestamp Fields -->
<field name="createdAt" type="pdate" indexed="true" stored="true"/>
<field name="updatedAt" type="pdate" indexed="true" stored="true"/>
<field name="dateAdded" type="pdate" indexed="true" stored="true"/>
<!-- Search-specific Fields -->
<field name="searchScore" type="pdouble" indexed="false" stored="true"/>
<field name="highlights" type="strings" indexed="false" stored="true"/>
<!-- Combined search field for general queries -->
<field name="text" type="text_general" indexed="true" stored="false" multiValued="true"/>
<!-- Copy Fields for comprehensive search -->
<copyField source="title" dest="text"/>
<copyField source="description" dest="text"/>
<copyField source="authorName" dest="text"/>
<copyField source="seriesName" dest="text"/>
<copyField source="tagNames" dest="text"/>
<!-- Copy Fields for faceting -->
<copyField source="authorName" dest="authorName_facet"/>
<copyField source="seriesName" dest="seriesName_facet"/>
<copyField source="tagNames" dest="tagNames_facet"/>
<!-- Copy field for lastRead sorting compatibility -->
<copyField source="lastReadAt" dest="lastRead"/>
<!-- Default Search Field -->
<!-- Dynamic Fields -->
<!-- Random sort dynamic field for generating random orderings -->
<dynamicField name="random_*" type="random" indexed="true" stored="false"/>
<!-- UniqueKey -->
<uniqueKey>id</uniqueKey>
</schema>
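When indexing into this core, only the source fields are sent: `text`, the `*_facet` copies, and `lastRead` are populated server-side by the `copyField` rules above. A sketch of an update payload matching the schema (all field values are hypothetical):

```python
import json

# Example document for the stories core; copyField fills text, *_facet and
# lastRead on the server, so the client omits them.
story_doc = {
    "id": "story-42",                      # uniqueKey, required
    "title": "The Lighthouse",             # required
    "description": "A keeper's last winter.",
    "authorId": "author-7",
    "authorName": "A. Example",
    "tagNames": ["drama", "historical"],   # multiValued strings
    "wordCount": 8200,
    "rating": 4,
    "isRead": False,
    "libraryId": "default",
    "lastReadAt": "2025-09-22T13:52:48Z",  # pdate expects ISO-8601 UTC
}
payload = json.dumps([story_doc])          # /update accepts a JSON array of docs
print(len(json.loads(payload)))            # 1
```

POSTing this array to `/solr/storycove_stories/update?commit=false` relies on the autoSoftCommit window in solrconfig.xml to make the document searchable within about a second.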

solr/stories/conf/solrconfig.xml
@@ -0,0 +1,153 @@
<?xml version="1.0" encoding="UTF-8" ?>
<!--
Solr Configuration for StoryCove Stories Core
Optimized for story search with highlighting and faceting
-->
<config>
<luceneMatchVersion>9.9.0</luceneMatchVersion>
<!-- DataDir configuration -->
<dataDir>${solr.data.dir:}</dataDir>
<!-- Directory Factory -->
<directoryFactory name="DirectoryFactory"
class="${solr.directoryFactory:solr.NRTCachingDirectoryFactory}"/>
<!-- CodecFactory -->
<codecFactory class="solr.SchemaCodecFactory"/>
<!-- Index Configuration -->
<indexConfig>
<lockType>${solr.lock.type:native}</lockType>
<infoStream>true</infoStream>
</indexConfig>
<!-- JMX Configuration -->
<jmx />
<!-- Update Handler -->
<updateHandler class="solr.DirectUpdateHandler2">
<updateLog>
<str name="dir">${solr.ulog.dir:}</str>
<int name="numVersionBuckets">${solr.ulog.numVersionBuckets:65536}</int>
</updateLog>
<autoCommit>
<maxTime>15000</maxTime>
<openSearcher>false</openSearcher>
</autoCommit>
<autoSoftCommit>
<maxTime>1000</maxTime>
</autoSoftCommit>
</updateHandler>
<!-- Query Configuration -->
<query>
<maxBooleanClauses>1024</maxBooleanClauses>
<filterCache class="solr.CaffeineCache"
size="512"
initialSize="512"
autowarmCount="0"/>
<queryResultCache class="solr.CaffeineCache"
size="512"
initialSize="512"
autowarmCount="0"/>
<documentCache class="solr.CaffeineCache"
size="512"
initialSize="512"
autowarmCount="0"/>
<enableLazyFieldLoading>true</enableLazyFieldLoading>
</query>
<!-- Request Dispatcher -->
<requestDispatcher handleSelect="false" >
<requestParsers enableRemoteStreaming="true"
multipartUploadLimitInKB="2048000"
formdataUploadLimitInKB="2048"
addHttpRequestToContext="false"/>
<httpCaching never304="true" />
</requestDispatcher>
<!-- Request Handlers -->
<!-- Standard Select Handler -->
<requestHandler name="/select" class="solr.SearchHandler">
<lst name="defaults">
<str name="echoParams">explicit</str>
<int name="rows">10</int>
<str name="df">text</str>
<str name="wt">json</str>
<str name="indent">true</str>
<str name="hl">true</str>
<str name="hl.fl">title,description</str>
<str name="hl.simple.pre">&lt;em&gt;</str>
<str name="hl.simple.post">&lt;/em&gt;</str>
<str name="hl.fragsize">150</str>
<str name="hl.maxAnalyzedChars">51200</str>
<str name="facet">true</str>
<str name="facet.field">authorName_facet</str>
<str name="facet.field">tagNames_facet</str>
<str name="facet.field">seriesName_facet</str>
<str name="facet.field">rating</str>
<str name="facet.field">isRead</str>
<str name="facet.mincount">1</str>
<str name="facet.sort">count</str>
</lst>
</requestHandler>
<!-- Update Handler -->
<requestHandler name="/update" class="solr.UpdateRequestHandler" />
<!-- Admin Handlers -->
<requestHandler name="/admin/ping" class="solr.PingRequestHandler">
<lst name="invariants">
<str name="q">*:*</str>
</lst>
<lst name="defaults">
<str name="echoParams">all</str>
</lst>
</requestHandler>
<!-- More Like This Handler -->
<requestHandler name="/mlt" class="solr.MoreLikeThisHandler">
<lst name="defaults">
<str name="mlt.fl">title,description</str>
<int name="mlt.mindf">2</int>
<int name="mlt.mintf">2</int>
<str name="mlt.qf">title^2.0 description^1.0</str>
<int name="rows">5</int>
</lst>
</requestHandler>
<!-- Suggester Handler -->
<requestHandler name="/suggest" class="solr.SearchHandler" startup="lazy">
<lst name="defaults">
<str name="suggest">true</str>
<str name="suggest.count">10</str>
</lst>
<arr name="components">
<str>suggest</str>
</arr>
</requestHandler>
<!-- Search Components -->
<searchComponent name="suggest" class="solr.SuggestComponent">
<lst name="suggester">
<str name="name">storySuggester</str>
<str name="lookupImpl">AnalyzingInfixLookupFactory</str>
<str name="dictionaryImpl">DocumentDictionaryFactory</str>
<str name="field">title</str>
<str name="weightField">rating</str>
<str name="suggestAnalyzerFieldType">text_general</str>
<str name="buildOnStartup">false</str>
<str name="buildOnCommit">false</str>
</lst>
</searchComponent>
<!-- Response Writers -->
<queryResponseWriter name="json" class="solr.JSONResponseWriter">
<str name="content-type">application/json; charset=UTF-8</str>
</queryResponseWriter>
</config>
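Drill-down works by turning a facet value the handler returns into an `fq` filter on the next request (faceting itself is already a handler default). A sketch of building such a filtered query string; `tagNames` is an exact-match `StrField`, so facet values can be quoted verbatim (the query values are hypothetical):

```python
from typing import Optional
from urllib.parse import urlencode

def story_drilldown_params(query: str, author: Optional[str] = None,
                           tag: Optional[str] = None) -> str:
    """Query string for /select on the stories core; fq narrows results by
    facet values returned from a previous response."""
    params = [("q", query)]
    if author:
        # authorName_facet holds the untokenized copy, suitable for exact filters
        params.append(("fq", f'authorName_facet:"{author}"'))
    if tag:
        params.append(("fq", f'tagNames:"{tag}"'))
    return urlencode(params)

print(story_drilldown_params("lighthouse", tag="drama"))
```

Multiple `fq` parameters AND together, and each is cached independently in the `filterCache` configured above.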

solr/stories/conf/stopwords.txt
@@ -0,0 +1,34 @@
# English stopwords for story search
a
an
and
are
as
at
be
but
by
for
if
in
into
is
it
no
not
of
on
or
such
that
the
their
then
there
these
they
this
to
was
will
with

solr/stories/conf/synonyms.txt
@@ -0,0 +1,16 @@
# Synonyms for story search
# Comma-separated terms on one line are treated as equivalent (expanded both
# ways at query time); an explicit mapping would use: word1,word2 => replacement
fantasy,magical,magic
sci-fi,science fiction,scifi
romance,romantic,love
mystery,detective,crime
adventure,action
horror,scary,frightening
drama,dramatic
comedy,funny,humor
thriller,suspense
historical,history
contemporary,modern
short,brief
novel,book
story,tale,narrative
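Each comma-separated line above is an equivalence class: with `expand="true"` in the schema's `SynonymGraphFilterFactory`, any member of a line expands at query time to all members, including multi-word entries like "science fiction". A minimal sketch of how one line parses:

```python
def synonym_set(line: str) -> set:
    """One comma-separated line of synonyms.txt is an equivalence class:
    with expand="true", each term expands to every term in the set."""
    return {term.strip() for term in line.split(",") if term.strip()}

print(sorted(synonym_set("story,tale,narrative")))
```

Note that multi-word synonyms are why the filter sits in the query analyzer only: `SynonymGraphFilterFactory` emits token graphs that are safe to query against but should not be flattened into the index.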