Compare commits: 4357351ec8...feature/op
31 Commits: aae8f8926b, f1773873d4, 54df3c471e, 64f97f5648, c0b3ae3b72, e5596b5a17, c7b516be31, c92308c24a, f92dcc5314, 702fcb33c1, 11b2a8b071, d1289bd616, 15708b5ab2, a660056003, 35a5825e76, 87a4999ffe, 4ee5fa2330, 6128d61349, 5e347f2e2e, 8eb126a304, 3dc02420fe, 241a15a174, 6b97c0a70f, e952241e3c, 65f1c6edc7, 40fe3fdb80, 95ce5fb532, 1a99d9830d, 6b83783381, 460ec358ca, 1d14d3d7aa
@@ -14,11 +14,18 @@ JWT_SECRET=secure_jwt_secret_here
# Application Authentication
APP_PASSWORD=application_password_here

# Search Engine Configuration
SEARCH_ENGINE=typesense

# Typesense Search Configuration
TYPESENSE_API_KEY=secure_api_key_here
TYPESENSE_ENABLED=true
TYPESENSE_REINDEX_INTERVAL=3600000

# OpenSearch Configuration
OPENSEARCH_USERNAME=admin
OPENSEARCH_PASSWORD=secure_opensearch_password_here

# Image Storage
IMAGE_STORAGE_PATH=/app/images

@@ -18,10 +18,9 @@ JWT_SECRET=REPLACE_WITH_SECURE_JWT_SECRET_MINIMUM_32_CHARS
# Use a strong password in production
APP_PASSWORD=REPLACE_WITH_SECURE_APP_PASSWORD

# Typesense Search Configuration
TYPESENSE_API_KEY=REPLACE_WITH_SECURE_TYPESENSE_API_KEY
TYPESENSE_ENABLED=true
TYPESENSE_REINDEX_INTERVAL=3600000
# OpenSearch Configuration
OPENSEARCH_PASSWORD=REPLACE_WITH_SECURE_OPENSEARCH_PASSWORD
SEARCH_ENGINE=opensearch

# Image Storage
IMAGE_STORAGE_PATH=/app/images

OPENSEARCH_MIGRATION_SPECIFICATION.md (new file, 889 lines)
@@ -0,0 +1,889 @@
# StoryCove Search Migration Specification: Typesense to OpenSearch

## Executive Summary

This document specifies the migration from Typesense to OpenSearch for the StoryCove application. The migration will be implemented using a parallel approach, maintaining Typesense functionality while gradually transitioning to OpenSearch, ensuring zero downtime and the ability to roll back if needed.

**Migration Goals:**
- Solve random query reliability issues
- Improve complex filtering performance
- Maintain feature parity during transition
- Zero downtime migration
- Improved developer experience

---

## Current State Analysis

### Typesense Implementation Overview

**Service Architecture:**
- `TypesenseService.java` (~2000 lines) - Primary search service
- 3 search indexes: Stories, Authors, Collections
- Multi-library support with dynamic collection names
- Integration with Spring Boot backend

**Core Functionality:**
1. **Full-text Search**: Stories, Authors with complex query building
2. **Random Story Selection**: `_rand()` function with fallback logic
3. **Advanced Filtering**: 15+ filter conditions with boolean logic
4. **Faceting**: Tag aggregations and counts
5. **Autocomplete**: Search suggestions with typeahead
6. **CRUD Operations**: Index/update/delete for all entity types

**Current Issues Identified:**
- `_rand()` function unreliability requiring complex fallback logic
- Complex filter query building with escaping issues
- Limited aggregation capabilities
- Inconsistent API behavior across query patterns
- Multi-collection management complexity

### Data Models and Schema

**Story Index Fields:**
```java
// Core fields
UUID id, String title, String description, String sourceUrl
Integer wordCount, Integer rating, Integer volume
Boolean isRead, LocalDateTime lastReadAt, Integer readingPosition

// Relationships
UUID authorId, String authorName
UUID seriesId, String seriesName
List<String> tagNames

// Metadata
LocalDateTime createdAt, LocalDateTime updatedAt
String coverPath, String sourceDomain
```

**Author Index Fields:**
```java
UUID id, String name, String notes
Integer authorRating, Double averageStoryRating, Integer storyCount
List<String> urls, String avatarImagePath
LocalDateTime createdAt, LocalDateTime updatedAt
```

**Collection Index Fields:**
```java
UUID id, String name, String description
List<String> tagNames, Boolean archived
LocalDateTime createdAt, LocalDateTime updatedAt
Integer storyCount, Integer currentPosition
```

### API Endpoints Current State

**Search Endpoints Analysis:**

**✅ USED by Frontend (Must Implement):**
- `GET /api/stories/search` - Main story search with complex filtering (CRITICAL)
- `GET /api/stories/random` - Random story selection with filters (CRITICAL)
- `GET /api/authors/search-typesense` - Author search (HIGH)
- `GET /api/tags/autocomplete` - Tag suggestions (MEDIUM)
- `POST /api/stories/reindex-typesense` - Admin reindex operations (MEDIUM)
- `POST /api/authors/reindex-typesense` - Admin reindex operations (MEDIUM)
- `POST /api/stories/recreate-typesense-collection` - Admin recreate (MEDIUM)
- `POST /api/authors/recreate-typesense-collection` - Admin recreate (MEDIUM)

**❌ UNUSED by Frontend (Skip Implementation):**
- `GET /api/stories/search/suggestions` - Not used by frontend
- `GET /api/authors/search` - Superseded by the Typesense version
- `GET /api/series/search` - Not used by frontend
- `GET /api/tags/search` - Superseded by autocomplete
- `POST /api/search/reindex` - Not used by frontend
- `GET /api/search/health` - Not used by frontend

**Scope Reduction: ~40% fewer endpoints to implement**

**Search Parameters (Stories):**
```
query, page, size, authors[], tags[], minRating, maxRating
sortBy, sortDir, facetBy[]
minWordCount, maxWordCount, createdAfter, createdBefore
lastReadAfter, lastReadBefore, unratedOnly, readingStatus
hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter
minTagCount, popularOnly, hiddenGemsOnly
```

---

## Target OpenSearch Architecture

### Service Layer Design

**New Components:**
```
OpenSearchService.java       - Primary search service (mirrors TypesenseService API)
OpenSearchConfig.java        - Configuration and client setup
SearchMigrationService.java  - Handles parallel operation during migration
SearchServiceAdapter.java    - Abstraction layer for service switching
```

**Index Strategy:**
- **Single-node deployment** for development/small installations
- **Index-per-library** approach: `stories-{libraryId}`, `authors-{libraryId}`, `collections-{libraryId}`
- **Index templates** for consistent mapping across libraries
- **Aliases** for easy switching and zero-downtime updates (see the sketch below)

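To make the index-per-library and alias strategy concrete, here is a minimal sketch that creates a library-specific stories index from a shared mapping and attaches a stable alias to it. It uses the same high-level REST client style as the query examples later in this document; the class name, the JSON mapping argument, and the `-current` alias suffix are illustrative assumptions rather than existing StoryCove code.

```java
import org.opensearch.action.admin.indices.alias.IndicesAliasesRequest;
import org.opensearch.client.RequestOptions;
import org.opensearch.client.RestHighLevelClient;
import org.opensearch.client.indices.CreateIndexRequest;
import org.opensearch.common.xcontent.XContentType;

public class LibraryIndexManager {

    private final RestHighLevelClient client;

    public LibraryIndexManager(RestHighLevelClient client) {
        this.client = client;
    }

    /** Create "stories-{libraryId}" with the shared mapping and point a stable alias at it. */
    public void createStoriesIndex(String libraryId, String storiesMappingJson) throws Exception {
        String indexName = "stories-" + libraryId;

        // storiesMappingJson is the settings + mappings document defined below in this spec
        CreateIndexRequest create = new CreateIndexRequest(indexName)
                .source(storiesMappingJson, XContentType.JSON);
        client.indices().create(create, RequestOptions.DEFAULT);

        // A stable alias lets a future reindex build a new physical index and switch without downtime
        IndicesAliasesRequest aliases = new IndicesAliasesRequest()
                .addAliasAction(IndicesAliasesRequest.AliasActions.add()
                        .index(indexName)
                        .alias(indexName + "-current"));
        client.indices().updateAliases(aliases, RequestOptions.DEFAULT);
    }
}
```
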
### OpenSearch Index Mappings

**Stories Index Mapping:**
```json
{
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0,
    "analysis": {
      "analyzer": {
        "story_analyzer": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": ["lowercase", "stop", "snowball"]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "id": {"type": "keyword"},
      "title": {
        "type": "text",
        "analyzer": "story_analyzer",
        "fields": {"keyword": {"type": "keyword"}}
      },
      "description": {
        "type": "text",
        "analyzer": "story_analyzer"
      },
      "authorName": {
        "type": "text",
        "analyzer": "story_analyzer",
        "fields": {"keyword": {"type": "keyword"}}
      },
      "seriesName": {
        "type": "text",
        "fields": {"keyword": {"type": "keyword"}}
      },
      "tagNames": {"type": "keyword"},
      "wordCount": {"type": "integer"},
      "rating": {"type": "integer"},
      "volume": {"type": "integer"},
      "isRead": {"type": "boolean"},
      "readingPosition": {"type": "integer"},
      "lastReadAt": {"type": "date"},
      "createdAt": {"type": "date"},
      "updatedAt": {"type": "date"},
      "coverPath": {"type": "keyword"},
      "sourceUrl": {"type": "keyword"},
      "sourceDomain": {"type": "keyword"}
    }
  }
}
```

**Authors Index Mapping:**
```json
{
  "mappings": {
    "properties": {
      "id": {"type": "keyword"},
      "name": {
        "type": "text",
        "analyzer": "story_analyzer",
        "fields": {"keyword": {"type": "keyword"}}
      },
      "notes": {"type": "text"},
      "authorRating": {"type": "integer"},
      "averageStoryRating": {"type": "float"},
      "storyCount": {"type": "integer"},
      "urls": {"type": "keyword"},
      "avatarImagePath": {"type": "keyword"},
      "createdAt": {"type": "date"},
      "updatedAt": {"type": "date"}
    }
  }
}
```

**Collections Index Mapping:**
```json
{
  "mappings": {
    "properties": {
      "id": {"type": "keyword"},
      "name": {
        "type": "text",
        "fields": {"keyword": {"type": "keyword"}}
      },
      "description": {"type": "text"},
      "tagNames": {"type": "keyword"},
      "archived": {"type": "boolean"},
      "storyCount": {"type": "integer"},
      "currentPosition": {"type": "integer"},
      "createdAt": {"type": "date"},
      "updatedAt": {"type": "date"}
    }
  }
}
```

### Query Translation Strategy

**Random Story Queries:**
```java
// Typesense (problematic)
String sortBy = seed != null ? "_rand(" + seed + ")" : "_rand()";

// OpenSearch (reliable): only set a seed when one is supplied
RandomScoreFunctionBuilder randomScore = ScoreFunctionBuilders.randomFunction();
if (seed != null) {
    randomScore.seed(seed.intValue());
}
QueryBuilder randomQuery = QueryBuilders.functionScoreQuery(
    QueryBuilders.boolQuery().must(filters),
    randomScore
);
```

**Complex Filtering:**
```java
// Build bool query with multiple filter conditions
BoolQueryBuilder boolQuery = QueryBuilders.boolQuery()
    .must(QueryBuilders.multiMatchQuery(query, "title", "description", "authorName"))
    .filter(QueryBuilders.termsQuery("tagNames", tags))
    .filter(QueryBuilders.rangeQuery("wordCount").gte(minWords).lte(maxWords))
    .filter(QueryBuilders.rangeQuery("rating").gte(minRating).lte(maxRating));
```

**Faceting/Aggregations:**
```java
// Tags aggregation
AggregationBuilder tagsAgg = AggregationBuilders
    .terms("tags")
    .field("tagNames")
    .size(100);

// Rating ranges
AggregationBuilder ratingRanges = AggregationBuilders
    .range("rating_ranges")
    .field("rating")
    .addRange("unrated", 0, 1)
    .addRange("low", 1, 3)
    .addRange("high", 4, 6);
```

---

## Revised Implementation Phases (Scope Reduced by 40%)

### Phase 1: Infrastructure Setup (Week 1)

**Objectives:**
- Add OpenSearch to Docker Compose
- Create basic OpenSearch service
- Establish index templates and mappings
- **Focus**: Only stories, authors, and tags indexes (skip series, collections)

**Deliverables:**

1. **Docker Compose Updates:**
```yaml
opensearch:
  image: opensearchproject/opensearch:2.11.0
  environment:
    - discovery.type=single-node
    - DISABLE_SECURITY_PLUGIN=true
    - OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx1g
  ports:
    - "9200:9200"
  volumes:
    - opensearch_data:/usr/share/opensearch/data
```

2. **OpenSearchConfig.java:**
```java
@Configuration
@ConditionalOnProperty(name = "storycove.opensearch.enabled", havingValue = "true")
public class OpenSearchConfig {

    @Bean
    public OpenSearchClient openSearchClient() {
        // Client configuration
    }
}
```

3. **Basic Index Creation:**
- Create index templates for stories, authors, collections
- Implement index creation with proper mappings
- Add health check endpoint (see the sketch below)

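For the health check deliverable, a minimal sketch using the Spring Boot Actuator starter (added in the pom changes on this branch) might look like the following; the bean name and the use of `ping()` on the high-level REST client are assumptions, not finalized code.

```java
import org.opensearch.client.RequestOptions;
import org.opensearch.client.RestHighLevelClient;
import org.springframework.boot.actuate.health.Health;
import org.springframework.boot.actuate.health.HealthIndicator;
import org.springframework.stereotype.Component;

@Component
public class OpenSearchHealthIndicator implements HealthIndicator {

    private final RestHighLevelClient client;

    public OpenSearchHealthIndicator(RestHighLevelClient client) {
        this.client = client;
    }

    @Override
    public Health health() {
        try {
            // ping() returns true when the cluster answers on the root endpoint
            boolean up = client.ping(RequestOptions.DEFAULT);
            return up ? Health.up().build() : Health.down().build();
        } catch (Exception e) {
            return Health.down(e).build();
        }
    }
}
```
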
**Success Criteria:**
- OpenSearch container starts successfully
- Basic connectivity established
- Index templates created and validated

### Phase 2: Core Service Implementation (Week 2)

**Objectives:**
- Implement OpenSearchService with core functionality
- Create service abstraction layer
- Implement basic search operations
- **Focus**: Only critical endpoints (stories search, random, authors)

**Deliverables:**

1. **OpenSearchService.java** - Core service implementing:
   - `indexStory()`, `updateStory()`, `deleteStory()`
   - `searchStories()` with basic query support (CRITICAL)
   - `getRandomStoryId()` with reliable seed support (CRITICAL)
   - `indexAuthor()`, `updateAuthor()`, `deleteAuthor()`
   - `searchAuthors()` for authors page (HIGH)
   - `bulkIndexStories()`, `bulkIndexAuthors()` for initial data loading

2. **SearchServiceAdapter.java** - Abstraction layer:
```java
@Service
public class SearchServiceAdapter {

    @Autowired(required = false)
    private TypesenseService typesenseService;

    @Autowired(required = false)
    private OpenSearchService openSearchService;

    @Value("${storycove.search.provider:typesense}")
    private String searchProvider;

    public SearchResultDto<StorySearchDto> searchStories(...) {
        return "opensearch".equals(searchProvider)
            ? openSearchService.searchStories(...)
            : typesenseService.searchStories(...);
    }
}
```

3. **Basic Query Implementation:**
- Full-text search across title/description/author
- Basic filtering (tags, rating, word count)
- Pagination and sorting

**Success Criteria:**
- Basic search functionality working
- Service abstraction layer functional
- Can switch between Typesense and OpenSearch via configuration

### Phase 3: Advanced Features Implementation (Week 3)

**Objectives:**
- Implement complex filtering (all 15+ filter types)
- Add random story functionality
- Implement faceting/aggregations
- Add autocomplete/suggestions

**Deliverables:**

1. **Complex Query Builder:**
- All filter conditions from original implementation
- Date range filtering with proper timezone handling
- Boolean logic for reading status, coverage, series filters

2. **Random Story Implementation:**
```java
public Optional<UUID> getRandomStoryId(String searchQuery, List<String> tags, Long seed, ...) {
    BoolQueryBuilder baseQuery = buildFilterQuery(searchQuery, tags, ...);

    // Seed only when the caller supplies one, so repeat requests can be reproducible
    RandomScoreFunctionBuilder randomScore = ScoreFunctionBuilders.randomFunction();
    if (seed != null) {
        randomScore.seed(seed.intValue());
    }
    QueryBuilder randomQuery = QueryBuilders.functionScoreQuery(baseQuery, randomScore);

    SearchRequest request = new SearchRequest("stories-" + getCurrentLibraryId())
        .source(new SearchSourceBuilder()
            .query(randomQuery)
            .size(1)
            .fetchSource(new String[]{"id"}, null));

    // Execute and return result
}
```

3. **Faceting Implementation:**
- Tag aggregations with counts
- Rating range aggregations
- Author aggregations
- Custom facet builders

4. **Autocomplete Service:**
- Suggest-based implementation using completion fields
- Prefix matching for story titles and author names (see the sketch below)

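As a starting point for the prefix-matching half of the autocomplete deliverable, a hedged sketch inside `OpenSearchService` could look like this; the method name, index naming, and the choice of `match_phrase_prefix` instead of a completion suggester are assumptions to be validated during Phase 3.

```java
// Fragment of OpenSearchService (hypothetical): prefix-based title suggestions
public List<String> suggestTitles(String prefix, String libraryId) throws IOException {
    SearchSourceBuilder source = new SearchSourceBuilder()
            .query(QueryBuilders.matchPhrasePrefixQuery("title", prefix))
            .fetchSource(new String[]{"title"}, null)
            .size(10);

    SearchRequest request = new SearchRequest("stories-" + libraryId).source(source);
    SearchResponse response = client.search(request, RequestOptions.DEFAULT);

    List<String> suggestions = new ArrayList<>();
    for (SearchHit hit : response.getHits().getHits()) {
        suggestions.add((String) hit.getSourceAsMap().get("title"));
    }
    return suggestions;
}
```
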
**Success Criteria:**
- All filter conditions working correctly
- Random story selection reliable with seed support
- Faceting returns accurate counts
- Autocomplete responsive and accurate

### Phase 4: Data Migration & Parallel Operation (Week 4)

**Objectives:**
- Implement bulk data migration from database
- Enable parallel operation (write to both systems)
- Comprehensive testing of OpenSearch functionality

**Deliverables:**

1. **Migration Service:**
```java
@Service
public class SearchMigrationService {

    public void performFullMigration() {
        // Migrate all libraries
        List<Library> libraries = libraryService.findAll();
        for (Library library : libraries) {
            migrateLibraryData(library);
        }
    }

    private void migrateLibraryData(Library library) {
        // Create indexes for library
        // Bulk load stories, authors, collections
        // Verify data integrity
    }
}
```

2. **Dual-Write Implementation:**
- Modify all entity update operations to write to both systems
- Add configuration flag for dual-write mode
- Error handling for partial failures (see the sketch below)

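A possible shape for the dual-write logic inside `SearchServiceAdapter` (building on the Phase 2 snippet) is sketched below; the `dualWriteEnabled` flag, the `indexStory()` signature, and the logger are assumptions, not existing code.

```java
// Fragment of SearchServiceAdapter (hypothetical dual-write wrapper)
public void indexStory(Story story) {
    if (dualWriteEnabled || "typesense".equals(searchProvider)) {
        try {
            typesenseService.indexStory(story);
        } catch (Exception e) {
            // A partial failure must not abort the write to the other engine
            log.warn("Typesense index failed for story {}", story.getId(), e);
        }
    }
    if (dualWriteEnabled || "opensearch".equals(searchProvider)) {
        try {
            openSearchService.indexStory(story);
        } catch (Exception e) {
            log.warn("OpenSearch index failed for story {}", story.getId(), e);
        }
    }
}
```
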
3. **Data Validation Tools:**
- Compare search result counts between systems
- Validate random story selection consistency
- Check faceting accuracy

**Success Criteria:**
- Complete data migration with 100% accuracy
- Dual-write operations working without errors
- Search result parity between systems verified

### Phase 5: API Integration & Testing (Week 5)

**Objectives:**
- Update controller endpoints to use OpenSearch
- Comprehensive integration testing
- Performance testing and optimization

**Deliverables:**

1. **Controller Updates:**
- Modify controllers to use SearchServiceAdapter (see the sketch below)
- Add migration controls for gradual rollout
- Implement A/B testing capability

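One way the controller update could look, with the endpoint delegating entirely to `SearchServiceAdapter` so the engine choice stays a configuration concern; the trimmed parameter list and the `searchStories(query, page, size)` signature are simplifications of the real endpoint, used only for illustration.

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/stories")
public class StoryController {

    private final SearchServiceAdapter searchService;

    public StoryController(SearchServiceAdapter searchService) {
        this.searchService = searchService;
    }

    // The controller no longer cares which engine answers; the adapter decides
    // based on the provider/rollout configuration.
    @GetMapping("/search")
    public SearchResultDto<StorySearchDto> search(@RequestParam(required = false) String query,
                                                  @RequestParam(defaultValue = "0") int page,
                                                  @RequestParam(defaultValue = "20") int size) {
        return searchService.searchStories(query, page, size);
    }
}
```
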
2. **Integration Tests:**
```java
@SpringBootTest
@TestMethodOrder(OrderAnnotation.class)
public class OpenSearchIntegrationTest {

    @Test
    @Order(1)
    void testBasicSearch() {
        // Test basic story search functionality
    }

    @Test
    @Order(2)
    void testComplexFiltering() {
        // Test all 15+ filter conditions
    }

    @Test
    @Order(3)
    void testRandomStory() {
        // Test random story with and without seed
    }

    @Test
    @Order(4)
    void testFaceting() {
        // Test aggregation accuracy
    }
}
```

3. **Performance Testing:**
- Load testing with realistic data volumes
- Query performance benchmarking
- Memory usage monitoring

**Success Criteria:**
- All integration tests passing
- Performance meets or exceeds Typesense baseline
- Memory usage within acceptable limits (< 2GB)

### Phase 6: Production Rollout & Monitoring (Week 6)

**Objectives:**
- Production deployment with feature flags
- Gradual user migration with monitoring
- Rollback capability testing

**Deliverables:**

1. **Feature Flag Implementation:**
```java
@Component
public class SearchFeatureFlags {

    @Value("${storycove.search.opensearch.enabled:false}")
    private boolean openSearchEnabled;

    @Value("${storycove.search.opensearch.percentage:0}")
    private int rolloutPercentage;

    public boolean shouldUseOpenSearch(String userId) {
        if (!openSearchEnabled) return false;
        // floorMod keeps the bucket in [0, 100) even when hashCode() is negative
        return Math.floorMod(userId.hashCode(), 100) < rolloutPercentage;
    }
}
```

2. **Monitoring & Alerting:**
- Query performance metrics
- Error rate monitoring
- Search result accuracy validation
- User experience metrics

3. **Rollback Procedures:**
- Immediate rollback to Typesense capability
- Data consistency verification
- Performance rollback triggers

**Success Criteria:**
- Successful production deployment
- Zero user-facing issues during rollout
- Monitoring showing improved performance
- Rollback procedures validated

### Phase 7: Cleanup & Documentation (Week 7)

**Objectives:**
- Remove Typesense dependencies
- Update documentation
- Performance optimization

**Deliverables:**

1. **Code Cleanup:**
- Remove TypesenseService and related classes
- Clean up Docker Compose configuration
- Remove unused dependencies

2. **Documentation Updates:**
- Update deployment documentation
- Search API documentation
- Troubleshooting guides

3. **Performance Tuning:**
- Index optimization
- Query performance tuning
- Resource allocation optimization

**Success Criteria:**
- Typesense completely removed
- Documentation up to date
- Optimized performance in production

---

## Data Migration Strategy

### Pre-Migration Validation

**Data Integrity Checks:**
1. Count validation: Ensure all stories/authors/collections are present
2. Field validation: Verify all required fields are populated
3. Relationship validation: Check author-story and series-story relationships
4. Library separation: Ensure proper multi-library data isolation

**Migration Process:**

1. **Index Creation:**
```java
// Create indexes with proper mappings for each library
for (Library library : libraries) {
    String storiesIndex = "stories-" + library.getId();
    createIndexWithMapping(storiesIndex, getStoriesMapping());
    createIndexWithMapping("authors-" + library.getId(), getAuthorsMapping());
    createIndexWithMapping("collections-" + library.getId(), getCollectionsMapping());
}
```

2. **Bulk Data Loading:**
```java
// Load in batches to manage memory usage
int batchSize = 1000;
List<Story> allStories = storyService.findByLibraryId(libraryId);

for (int i = 0; i < allStories.size(); i += batchSize) {
    List<Story> batch = allStories.subList(i, Math.min(i + batchSize, allStories.size()));
    List<StoryDocument> documents = batch.stream()
        .map(this::convertToSearchDocument)
        .collect(Collectors.toList());

    bulkIndexStories(documents, "stories-" + libraryId);
}
```

3. **Post-Migration Validation:**
- Count comparison between database and OpenSearch (see the sketch below)
- Spot-check random records for field accuracy
- Test search functionality with known queries
- Verify faceting counts match expected values

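A count-comparison helper for the validation step could be sketched as follows; `storyService.countByLibraryId()` and the surrounding service wiring are assumptions, and the `CountRequest` call targets the high-level REST client used elsewhere in this document.

```java
// Fragment of SearchMigrationService (hypothetical): DB count vs. OpenSearch doc count
public boolean validateStoryCounts(String libraryId) throws IOException {
    long dbCount = storyService.countByLibraryId(libraryId);

    CountRequest countRequest = new CountRequest("stories-" + libraryId)
            .query(QueryBuilders.matchAllQuery());
    long indexCount = client.count(countRequest, RequestOptions.DEFAULT).getCount();

    if (dbCount != indexCount) {
        log.error("Count mismatch for library {}: db={} index={}", libraryId, dbCount, indexCount);
        return false;
    }
    return true;
}
```
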
### Rollback Strategy

**Immediate Rollback Triggers:**
- Search error rate > 1%
- Query performance degradation > 50%
- Data inconsistency detected
- Memory usage > 4GB sustained

**Rollback Process:**
1. Update feature flag to disable OpenSearch
2. Verify Typesense still operational
3. Clear OpenSearch indexes to free resources
4. Investigate and document issues

**Data Consistency During Rollback:**
- Continue dual-write during investigation
- Re-sync any missed updates to OpenSearch
- Validate data integrity before retry

---

## Testing Strategy

### Unit Tests

**OpenSearchService Unit Tests:**
```java
@ExtendWith(MockitoExtension.class)
class OpenSearchServiceTest {

    @Mock private OpenSearchClient client;
    @InjectMocks private OpenSearchService service;

    @Test
    void testSearchStoriesBasicQuery() {
        // Mock OpenSearch response
        // Test basic search functionality
    }

    @Test
    void testComplexFilterQuery() {
        // Test complex boolean query building
    }

    @Test
    void testRandomStorySelection() {
        // Test random query with seed
    }
}
```

**Query Builder Tests:**
- Test all 15+ filter conditions
- Validate query structure and parameters
- Test edge cases and null handling

### Integration Tests

**Full Search Integration:**
```java
@SpringBootTest
@Testcontainers
class OpenSearchIntegrationTest {

    @Container
    static OpenSearchContainer opensearch = new OpenSearchContainer("opensearchproject/opensearch:2.11.0");

    @Test
    void testEndToEndStorySearch() {
        // Insert test data
        // Perform search via controller
        // Validate results
    }
}
```

### Performance Tests

**Load Testing Scenarios:**
1. **Concurrent Search Load:**
- 50 concurrent users performing searches
- Mixed query complexity
- Duration: 10 minutes

2. **Bulk Indexing Performance:**
- Index 10,000 stories in batches
- Measure throughput and memory usage

3. **Random Query Performance:**
- 1000 random story requests with different seeds
- Compare with Typesense baseline

### Acceptance Tests

**Functional Requirements:**
- All existing search functionality preserved
- Random story selection improved reliability
- Faceting accuracy maintained
- Multi-library separation working

**Performance Requirements:**
- Search response time < 100ms for 95th percentile
- Random story selection < 50ms
- Index update operations < 10ms
- Memory usage < 2GB in production

---

## Risk Analysis & Mitigation

### Technical Risks

**Risk: OpenSearch Memory Usage**
- *Probability: Medium*
- *Impact: High*
- *Mitigation: Resource monitoring, index optimization, container limits*

**Risk: Query Performance Regression**
- *Probability: Low*
- *Impact: High*
- *Mitigation: Performance testing, query optimization, caching layer*

**Risk: Data Migration Accuracy**
- *Probability: Low*
- *Impact: Critical*
- *Mitigation: Comprehensive validation, dual-write verification, rollback procedures*

**Risk: Complex Filter Compatibility**
- *Probability: Medium*
- *Impact: Medium*
- *Mitigation: Extensive testing, gradual rollout, feature flags*

### Operational Risks

**Risk: Production Deployment Issues**
- *Probability: Medium*
- *Impact: High*
- *Mitigation: Staging environment testing, gradual rollout, immediate rollback capability*

**Risk: Team Learning Curve**
- *Probability: High*
- *Impact: Low*
- *Mitigation: Documentation, training, gradual responsibility transfer*

### Business Continuity

**Zero-Downtime Requirements:**
- Maintain Typesense during entire migration
- Feature flag-based switching
- Immediate rollback capability
- Health monitoring with automated alerts

---

## Success Criteria

### Functional Requirements ✅
- [ ] All search functionality migrated successfully
- [ ] Random story selection working reliably with seeds
- [ ] Complex filtering (15+ conditions) working accurately
- [ ] Faceting/aggregation results match expected values
- [ ] Multi-library support maintained
- [ ] Autocomplete functionality preserved

### Performance Requirements ✅
- [ ] Search response time ≤ 100ms (95th percentile)
- [ ] Random story selection ≤ 50ms
- [ ] Index operations ≤ 10ms
- [ ] Memory usage ≤ 2GB sustained
- [ ] Zero search downtime during migration

### Technical Requirements ✅
- [ ] Code quality maintained (test coverage ≥ 80%)
- [ ] Documentation updated and comprehensive
- [ ] Monitoring and alerting implemented
- [ ] Rollback procedures tested and validated
- [ ] Typesense dependencies cleanly removed

---

## Timeline Summary

| Phase | Duration | Key Deliverables | Risk Level |
|-------|----------|------------------|------------|
| 1. Infrastructure | 1 week | Docker setup, basic service | Low |
| 2. Core Service | 1 week | Basic search operations | Medium |
| 3. Advanced Features | 1 week | Complex filtering, random queries | High |
| 4. Data Migration | 1 week | Full data migration, dual-write | High |
| 5. API Integration | 1 week | Controller updates, testing | Medium |
| 6. Production Rollout | 1 week | Gradual deployment, monitoring | High |
| 7. Cleanup | 1 week | Remove Typesense, documentation | Low |

**Total Estimated Duration: 7 weeks**

---

## Configuration Management

### Environment Variables

```bash
# OpenSearch Configuration
OPENSEARCH_HOST=opensearch
OPENSEARCH_PORT=9200
OPENSEARCH_USERNAME=admin
OPENSEARCH_PASSWORD=${OPENSEARCH_PASSWORD}

# Feature Flags
STORYCOVE_OPENSEARCH_ENABLED=true
STORYCOVE_SEARCH_PROVIDER=opensearch
STORYCOVE_SEARCH_DUAL_WRITE=true
STORYCOVE_OPENSEARCH_ROLLOUT_PERCENTAGE=100

# Performance Tuning
OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx2g
STORYCOVE_SEARCH_BATCH_SIZE=1000
STORYCOVE_SEARCH_TIMEOUT=30s
```

### Docker Compose Updates

```yaml
# Add to docker-compose.yml
opensearch:
  image: opensearchproject/opensearch:2.11.0
  environment:
    - discovery.type=single-node
    - DISABLE_SECURITY_PLUGIN=true
    - OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx2g
  volumes:
    - opensearch_data:/usr/share/opensearch/data
  networks:
    - storycove-network

volumes:
  opensearch_data:
```

---

## Conclusion

This specification provides a comprehensive roadmap for migrating StoryCove from Typesense to OpenSearch. The phased approach ensures minimal risk while delivering improved reliability and performance, particularly for random story queries.

The parallel implementation strategy allows for thorough validation and provides confidence in the migration while maintaining the ability to roll back if issues arise. Upon successful completion, StoryCove will have a more robust and scalable search infrastructure that better supports its growth and feature requirements.

**Next Steps:**
1. Review and approve this specification
2. Set up development environment with OpenSearch
3. Begin Phase 1 implementation
4. Establish monitoring and success metrics
5. Execute migration according to timeline

---

*Document Version: 1.0*
*Last Updated: 2025-01-17*
*Author: Claude Code Assistant*

README.md (122 lines changed)
@@ -161,43 +161,75 @@ cd backend

## 📖 Documentation

- **[API Documentation](docs/API.md)**: Complete REST API reference with examples
- **[Data Model](docs/DATA_MODEL.md)**: Detailed database schema and relationships
- **[Technical Specification](storycove-spec.md)**: Comprehensive technical specification
- **[Technical Specification](storycove-spec.md)**: Complete technical specification with API documentation, data models, and all feature specifications
- **[Web Scraper Specification](storycove-scraper-spec.md)**: URL content grabbing functionality
- **Environment Configuration**: Multi-environment deployment setup (see above)
- **Development Setup**: Local development environment setup (see below)

> **Note**: All feature specifications (Collections, Tag Enhancements, EPUB Import/Export) have been consolidated into the main technical specification for easier maintenance and reference.

## 🗄️ Data Model

StoryCove uses a PostgreSQL database with the following core entities:

### **Stories**
- **Primary Key**: UUID
- **Fields**: title, summary, description, content_html, content_plain, source_url, word_count, rating, volume, cover_path, reading_position, last_read_at
- **Relationships**: Many-to-One with Author, Many-to-One with Series, Many-to-Many with Tags
- **Features**: Automatic word count calculation, HTML sanitization, plain text extraction, reading progress tracking
- **Fields**: title, summary, description, content_html, content_plain, source_url, word_count, rating, volume, cover_path, is_read, reading_position, last_read_at, created_at, updated_at
- **Relationships**: Many-to-One with Author, Many-to-One with Series, Many-to-Many with Tags, One-to-Many with ReadingPositions
- **Features**: Automatic word count calculation, HTML sanitization, plain text extraction, reading progress tracking, duplicate detection

### **Authors**
- **Primary Key**: UUID
- **Fields**: name, notes, author_rating, avatar_image_path
- **Relationships**: One-to-Many with Stories, One-to-Many with Author URLs
- **Features**: URL collection storage, rating system, statistics calculation
- **Fields**: name, notes, author_rating, avatar_image_path, created_at, updated_at
- **Relationships**: One-to-Many with Stories, One-to-Many with Author URLs (via @ElementCollection)
- **Features**: URL collection storage, rating system, statistics calculation, average story rating calculation

### **Collections**
- **Primary Key**: UUID
- **Fields**: name, description, rating, cover_image_path, is_archived, created_at, updated_at
- **Relationships**: Many-to-Many with Tags, One-to-Many with CollectionStories
- **Features**: Story ordering with gap-based positioning, statistics calculation, EPUB export, Typesense search

### **CollectionStories** (Junction Table)
- **Composite Key**: collection_id, story_id
- **Fields**: position, added_at
- **Relationships**: Links Collections to Stories with ordering
- **Features**: Gap-based positioning for efficient reordering

### **Series**
- **Primary Key**: UUID
- **Fields**: name, description
- **Fields**: name, description, created_at
- **Relationships**: One-to-Many with Stories (ordered by volume)
- **Features**: Volume-based story ordering, navigation methods
- **Features**: Volume-based story ordering, navigation methods (next/previous story)

### **Tags**
- **Primary Key**: UUID
- **Fields**: name (unique)
- **Relationships**: Many-to-Many with Stories
- **Features**: Autocomplete support, usage statistics
- **Fields**: name (unique), color (hex), description, created_at
- **Relationships**: Many-to-Many with Stories, Many-to-Many with Collections, One-to-Many with TagAliases
- **Features**: Color coding, alias system, autocomplete support, usage statistics, AI-powered suggestions

### **Join Tables**
- **story_tags**: Links stories to tags
- **author_urls**: Stores multiple URLs per author
### **TagAliases**
- **Primary Key**: UUID
- **Fields**: alias_name (unique), canonical_tag_id, created_from_merge, created_at
- **Relationships**: Many-to-One with Tag (canonical)
- **Features**: Transparent alias resolution, merge tracking, autocomplete integration

### **ReadingPositions**
- **Primary Key**: UUID
- **Fields**: story_id, chapter_index, chapter_title, word_position, character_position, percentage_complete, epub_cfi, context_before, context_after, created_at, updated_at
- **Relationships**: Many-to-One with Story
- **Features**: Advanced reading position tracking, EPUB CFI support, context preservation, percentage calculation

### **Libraries**
- **Primary Key**: UUID
- **Fields**: name, description, is_default, created_at, updated_at
- **Features**: Multi-library support, library switching functionality

### **Core Join Tables**
- **story_tags**: Links stories to tags (Many-to-Many)
- **collection_tags**: Links collections to tags (Many-to-Many)
- **collection_stories**: Links collections to stories with ordering
- **author_urls**: Stores multiple URLs per author (@ElementCollection)

## 🔌 REST API Reference

@@ -209,6 +241,7 @@ StoryCove uses a PostgreSQL database with the following core entities:

### **Stories** (`/api/stories`)
- `GET /` - List stories (paginated)
- `GET /{id}` - Get specific story
- `GET /{id}/read` - Get story for reading interface
- `POST /` - Create new story
- `PUT /{id}` - Update story
- `DELETE /{id}` - Delete story
@@ -218,6 +251,10 @@ StoryCove uses a PostgreSQL database with the following core entities:
- `POST /{id}/tags/{tagId}` - Add tag to story
- `DELETE /{id}/tags/{tagId}` - Remove tag from story
- `POST /{id}/reading-progress` - Update reading position
- `POST /{id}/reading-status` - Mark story as read/unread
- `GET /{id}/collections` - Get collections containing story
- `GET /random` - Get random story with optional filters
- `GET /check-duplicate` - Check for duplicate stories
- `GET /search` - Search stories (Typesense with faceting)
- `GET /search/suggestions` - Get search suggestions
- `GET /author/{authorId}` - Stories by author
@@ -225,6 +262,16 @@ StoryCove uses a PostgreSQL database with the following core entities:
- `GET /tags/{tagName}` - Stories with tag
- `GET /recent` - Recent stories
- `GET /top-rated` - Top-rated stories
- `POST /batch/add-to-collection` - Add multiple stories to collection
- `POST /reindex` - Manual Typesense reindex
- `POST /reindex-typesense` - Reindex stories in Typesense
- `POST /recreate-typesense-collection` - Recreate Typesense collection

#### **EPUB Import/Export** (`/api/stories/epub`)
- `POST /import` - Import story from EPUB file
- `POST /export` - Export story as EPUB with options
- `GET /{id}/epub` - Export story as EPUB (simple)
- `POST /validate` - Validate EPUB file structure

### **Authors** (`/api/authors`)
- `GET /` - List authors (paginated)
@@ -244,14 +291,49 @@ StoryCove uses a PostgreSQL database with the following core entities:
### **Tags** (`/api/tags`)
- `GET /` - List tags (paginated)
- `GET /{id}` - Get specific tag
- `POST /` - Create new tag
- `PUT /{id}` - Update tag
- `POST /` - Create new tag (with color and description)
- `PUT /{id}` - Update tag (name, color, description)
- `DELETE /{id}` - Delete tag
- `GET /search` - Search tags
- `GET /autocomplete` - Tag autocomplete
- `GET /autocomplete` - Tag autocomplete with alias resolution
- `GET /popular` - Most used tags
- `GET /unused` - Unused tags
- `GET /stats` - Tag statistics
- `GET /collections` - Tags used by collections
- `GET /resolve/{name}` - Resolve tag name (handles aliases)

#### **Tag Aliases** (`/api/tags/{tagId}/aliases`)
- `POST /` - Add alias to tag
- `DELETE /{aliasId}` - Remove alias from tag

#### **Tag Management**
- `POST /merge` - Merge multiple tags into one
- `POST /merge/preview` - Preview tag merge operation
- `POST /suggest` - AI-powered tag suggestions for content

### **Collections** (`/api/collections`)
- `GET /` - Search and list collections (Typesense)
- `GET /{id}` - Get collection details
- `POST /` - Create new collection (JSON or multipart)
- `PUT /{id}` - Update collection metadata
- `DELETE /{id}` - Delete collection
- `PUT /{id}/archive` - Archive/unarchive collection
- `POST /{id}/cover` - Upload collection cover image
- `DELETE /{id}/cover` - Remove collection cover image
- `GET /{id}/stats` - Get collection statistics

#### **Collection Story Management**
- `POST /{id}/stories` - Add stories to collection
- `DELETE /{id}/stories/{storyId}` - Remove story from collection
- `PUT /{id}/stories/order` - Reorder stories in collection
- `GET /{id}/read/{storyId}` - Get story with collection context

#### **Collection EPUB Export**
- `GET /{id}/epub` - Export collection as EPUB
- `POST /{id}/epub` - Export collection as EPUB with options

#### **Collection Management**
- `POST /reindex-typesense` - Reindex collections in Typesense

### **Series** (`/api/series`)
- `GET /` - List series (paginated)

TAG_ENHANCEMENT_SPECIFICATION.md (new file, 305 lines)
@@ -0,0 +1,305 @@
# Tag Enhancement Specification

> **✅ Implementation Status: COMPLETED**
> This feature has been fully implemented and is available in the system.
> All tag enhancements including colors, aliases, merging, and AI suggestions are working.
> Last updated: January 2025

## Overview

This document outlines the comprehensive enhancement of the tagging functionality in StoryCove, including color tags, tag deletion, merging, and aliases. These features will be accessible through a new "Tag Maintenance" page linked from the Settings page.

## Features

### 1. Color Tags

**Purpose**: Assign optional colors to tags for visual distinction and better organization.

**Implementation Details**:
- **Color Selection**: Predefined color palette that complements the app's theme
- **Custom Colors**: Fallback option with full color picker for advanced users
- **Default Behavior**: Tags without colors use consistent default styling
- **Accessibility**: All colors ensure sufficient contrast ratios

**UI Design**:
```
Color Selection Interface:
[Theme Blue] [Theme Green] [Theme Purple] [Theme Orange] ... [Custom ▼]
```

**Database Changes**:
```sql
ALTER TABLE tags ADD COLUMN color VARCHAR(7); -- hex colors like #3B82F6
ALTER TABLE tags ADD COLUMN description TEXT;
```

### 2. Tag Deletion

**Purpose**: Remove unused or unwanted tags from the system.

**Safety Features**:
- Show impact: "This tag is used by X stories"
- Confirmation dialog with story count
- Option to reassign stories to different tag before deletion
- Simple workflow appropriate for single-user application

**Behavior**:
- Display number of affected stories
- Require confirmation for deletion
- Optionally allow reassignment to another tag

### 3. Tag Merging

**Purpose**: Combine similar tags into a single canonical tag to reduce duplication.

**Workflow**:
1. User selects multiple tags to merge
2. User chooses which tag name becomes canonical
3. System shows merge preview with story counts
4. All story associations transfer to canonical tag
5. **Automatic Aliasing**: Merged tags automatically become aliases

**Example**:
```
Merge Preview:
• "magictf" (5 stories) → "magic tf" (12 stories)
• Result: "magic tf" (17 stories)
• "magictf" will become an alias for "magic tf"
```

**Technical Implementation**:
```sql
-- Merge operation (atomic transaction)
BEGIN TRANSACTION;
UPDATE story_tags SET tag_id = target_tag_id WHERE tag_id = source_tag_id;
INSERT INTO tag_aliases (alias_name, canonical_tag_id, created_from_merge)
VALUES (source_tag_name, target_tag_id, TRUE);
DELETE FROM tags WHERE id = source_tag_id;
COMMIT;
```

### 4. Tag Aliases

**Purpose**: Prevent tag duplication by allowing alternative names that resolve to canonical tags.

**Key Features**:
- **Transparent Resolution**: Users type "magictf" and automatically get "magic tf"
- **Hover Display**: Show aliases when hovering over tags
- **Import Integration**: Automatic alias resolution during story imports
- **Auto-Generation**: Created automatically during tag merges

**Database Schema**:
```sql
CREATE TABLE tag_aliases (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    alias_name VARCHAR(255) UNIQUE NOT NULL,
    canonical_tag_id UUID NOT NULL REFERENCES tags(id) ON DELETE CASCADE,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    created_from_merge BOOLEAN DEFAULT FALSE
);

CREATE INDEX idx_tag_aliases_name ON tag_aliases(alias_name);
```

**UI Behavior**:
- Tags with aliases show subtle indicator (e.g., small "+" icon)
- Hover tooltip displays:
```
magic tf
────────────
Aliases: magictf, magic_tf, magic-transformation
```

## Tag Maintenance Page

### Access
- Reachable only through Settings page
- Button: "Tag Maintenance" or "Manage Tags"

### Main Interface

**Tag Management Table**:
```
┌─ Search: [____________] [Color Filter ▼] [Sort: Usage ▼]
├─
├─ ☐ magic tf 🔵 (17 stories) [+2 aliases] [Edit] [Delete]
├─ ☐ transformation 🟢 (34 stories) [+1 alias] [Edit] [Delete]
├─ ☐ sci-fi 🟣 (45 stories) [Edit] [Delete]
└─
[Merge Selected] [Bulk Delete] [Export/Import Tags]
```

**Features**:
- Searchable and filterable tag list
- Sortable by name, usage count, creation date
- Bulk selection for merge/delete operations
- Visual indicators for color and alias count

### Tag Edit Modal

```
Edit Tag: "magic tf"
┌─ Name: [magic tf ]
├─ Color: [🔵] [Theme Colors...] [Custom...]
├─ Description: [Optional description]
├─
├─ Aliases (2):
│ • magictf [Remove]
│ • magic_tf [Remove]
│ [Add Alias: ____________] [Add]
├─
├─ Used by 17 stories [View Stories]
└─ [Save] [Cancel]
```

**Functionality**:
- Edit tag name, color, and description
- Manage aliases (add/remove)
- View associated stories
- Prevent circular alias references

### Merge Interface

**Selection Process**:
1. Select multiple tags from main table
2. Click "Merge Selected"
3. Choose canonical tag name
4. Preview merge results
5. Confirm operation

**Preview Display**:
- Show before/after story counts
- List all aliases that will be created
- Highlight any conflicts or issues

## Integration Points

### 1. Import/Scraping Enhancement

```javascript
// Tag resolution during imports
const resolveTagName = async (inputTag) => {
    const alias = await tagApi.findAlias(inputTag);
    return alias ? alias.canonicalTag : inputTag;
};
```

### 2. Tag Input Components

**Enhanced Autocomplete**:
- Include both canonical names and aliases in suggestions
- Show resolution: "magictf → magic tf" in dropdown
- Always save canonical name to database

### 3. Search Functionality

**Transparent Alias Search**:
- Search for "magictf" includes stories tagged with "magic tf"
- User doesn't need to know about canonical/alias distinction
- Expand search queries to include all aliases (see the sketch below)

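A backend-side sketch of that alias expansion, run before the tag filter is built, could look like the following; the repository methods are hypothetical, since the final JPA interfaces are not specified in this document.

```java
// Fragment of a tag service (hypothetical repository methods)
public Set<String> expandTagWithAliases(String inputTag) {
    Set<String> expanded = new HashSet<>();

    // If the user typed an alias, resolve it to the canonical tag first
    String canonical = tagAliasRepository.findByAliasName(inputTag)
            .map(alias -> alias.getCanonicalTag().getName())
            .orElse(inputTag);
    expanded.add(canonical);

    // Include the aliases as well, in case the search index stores them as synonyms;
    // stories themselves are only tagged with canonical names
    tagAliasRepository.findByCanonicalTagName(canonical)
            .forEach(alias -> expanded.add(alias.getAliasName()));

    return expanded;
}
```
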
### 4. Display Components

**Tag Rendering**:
- Apply colors consistently across all tag displays
- Show alias indicator where appropriate
- Implement hover tooltips for alias information

## Implementation Phases

### Phase 1: Core Infrastructure
- [ ] Database schema updates (tags.color, tag_aliases table)
- [ ] Basic tag editing functionality (name, color, description)
- [ ] Color palette component with theme colors
- [ ] Tag edit modal interface

### Phase 2: Merging & Aliasing
- [ ] Tag merge functionality with automatic alias creation
- [ ] Alias resolution in import/scraping logic
- [ ] Tag input component enhancements
- [ ] Search integration with alias expansion

### Phase 3: UI Polish & Advanced Features
- [ ] Hover tooltips for alias display
- [ ] Bulk operations (merge multiple, bulk delete)
- [ ] Advanced filtering and sorting options
- [ ] Tag maintenance page integration with Settings

### Phase 4: Smart Features (Optional)
- [ ] Auto-merge suggestions for similar tag names
- [ ] Color auto-assignment based on usage patterns
- [ ] Import intelligence and learning from user decisions

## Technical Considerations

### Performance
- Index alias names for fast lookup during imports
- Optimize tag queries with proper database indexing
- Consider caching for frequently accessed tag/alias mappings

### Data Integrity
- Prevent circular alias references
- Atomic transactions for merge operations
- Cascade deletion handling for tag relationships

### User Experience
- Clear visual feedback for all operations
- Comprehensive preview before destructive actions
- Consistent color and styling across the application

### Accessibility
- Sufficient color contrast for all tag colors
- Keyboard navigation support
- Screen reader compatibility
- Don't rely solely on color for information

## API Endpoints

### New Endpoints Needed
- `GET /api/tags/{id}/aliases` - Get aliases for a tag
- `POST /api/tags/merge` - Merge multiple tags
- `POST /api/tags/{id}/aliases` - Add alias to tag
- `DELETE /api/tags/{id}/aliases/{aliasId}` - Remove alias
- `PUT /api/tags/{id}/color` - Update tag color
- `GET /api/tags/resolve/{name}` - Resolve tag name (check aliases)

### Enhanced Endpoints
- `GET /api/tags` - Include color and alias count in response
- `PUT /api/tags/{id}` - Support color and description updates
- `DELETE /api/tags/{id}` - Enhanced with story impact information

## Configuration

### Theme Color Palette
Define a curated set of colors that work well with both light and dark themes:
- Primary blues: #3B82F6, #1D4ED8, #60A5FA
- Greens: #10B981, #059669, #34D399
- Purples: #8B5CF6, #7C3AED, #A78BFA
- Warm tones: #F59E0B, #D97706, #F97316
- Neutrals: #6B7280, #4B5563, #9CA3AF

### Settings Integration
- Add "Tag Maintenance" button to Settings page
- Consider adding tag-related preferences (default colors, etc.)

## Success Criteria

1. **Color Tags**: Tags can be assigned colors that display consistently throughout the application
2. **Tag Deletion**: Users can safely delete tags with appropriate warnings and reassignment options
3. **Tag Merging**: Similar tags can be merged with automatic alias creation
4. **Alias Resolution**: Imports automatically resolve aliases to canonical tags
5. **User Experience**: All operations are intuitive with clear feedback and preview options
6. **Performance**: Tag operations remain fast even with large numbers of tags and aliases
7. **Data Integrity**: No orphaned references or circular alias chains

## Future Enhancements

- **Tag Statistics**: Usage analytics and trends
- **Tag Recommendations**: AI-powered tag suggestions during story import
- **Tag Templates**: Predefined tag sets for common story types
- **Export/Import**: Backup and restore tag configurations
- **Tag Validation**: Rules for tag naming conventions

---

*This specification serves as the definitive guide for implementing the tag enhancement features in StoryCove. All implementation should refer back to this document to ensure consistency and completeness.*

@@ -2,15 +2,15 @@ FROM openjdk:17-jdk-slim

WORKDIR /app

COPY pom.xml .
COPY src ./src
# Install Maven
RUN apt-get update && apt-get install -y maven && rm -rf /var/lib/apt/lists/*

RUN apt-get update && apt-get install -y maven && \
    mvn clean package -DskipTests && \
    apt-get remove -y maven && \
    apt-get autoremove -y && \
    rm -rf /var/lib/apt/lists/*
# Copy source code
COPY . .

# Build the application
RUN mvn clean package -DskipTests

EXPOSE 8080

CMD ["java", "-jar", "target/storycove-backend-0.0.1-SNAPSHOT.jar"]
ENTRYPOINT ["java", "-jar", "target/storycove-backend-0.0.1-SNAPSHOT.jar"]
backend/cookies_new.txt (new file, 4 lines)
@@ -0,0 +1,4 @@
# Netscape HTTP Cookie File
# https://curl.se/docs/http-cookies.html
# This file was generated by libcurl! Edit at your own risk.

@@ -5,7 +5,7 @@
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.2.0</version>
        <version>3.5.5</version>
        <relativePath/>
    </parent>

@@ -17,7 +17,7 @@

    <properties>
        <java.version>17</java.version>
        <testcontainers.version>1.19.3</testcontainers.version>
        <testcontainers.version>1.21.3</testcontainers.version>
    </properties>

    <dependencyManagement>
@@ -49,6 +49,10 @@
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-validation</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-actuator</artifactId>
        </dependency>
        <dependency>
            <groupId>org.postgresql</groupId>
            <artifactId>postgresql</artifactId>
@@ -56,18 +60,18 @@
        <dependency>
            <groupId>io.jsonwebtoken</groupId>
            <artifactId>jjwt-api</artifactId>
            <version>0.12.3</version>
            <version>0.13.0</version>
        </dependency>
        <dependency>
            <groupId>io.jsonwebtoken</groupId>
            <artifactId>jjwt-impl</artifactId>
            <version>0.12.3</version>
            <version>0.13.0</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>io.jsonwebtoken</groupId>
            <artifactId>jjwt-jackson</artifactId>
            <version>0.12.3</version>
            <version>0.13.0</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
@@ -80,9 +84,17 @@
            <artifactId>httpclient5</artifactId>
        </dependency>
        <dependency>
            <groupId>org.typesense</groupId>
            <artifactId>typesense-java</artifactId>
            <version>1.3.0</version>
            <groupId>org.opensearch.client</groupId>
            <artifactId>opensearch-java</artifactId>
            <version>3.2.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.httpcomponents.core5</groupId>
            <artifactId>httpcore5</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.httpcomponents.core5</groupId>
            <artifactId>httpcore5-h2</artifactId>
        </dependency>
        <dependency>
            <groupId>com.positiondev.epublib</groupId>
@@ -119,6 +131,13 @@
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <parameters>true</parameters>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
@@ -0,0 +1,64 @@
|
||||
package com.storycove.config;
|
||||
|
||||
import com.storycove.service.LibraryService;
|
||||
import com.zaxxer.hikari.HikariConfig;
|
||||
import com.zaxxer.hikari.HikariDataSource;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.context.annotation.Bean;
|
||||
import org.springframework.context.annotation.Configuration;
|
||||
import org.springframework.context.annotation.DependsOn;
|
||||
import org.springframework.context.annotation.Primary;
|
||||
|
||||
import javax.sql.DataSource;
|
||||
|
||||
/**
|
||||
* Database configuration that sets up library-aware datasource routing.
|
||||
*
|
||||
* This configuration replaces the default Spring Boot datasource with a routing
|
||||
* datasource that automatically directs all database operations to the appropriate
|
||||
* library-specific database based on the current active library.
|
||||
*/
|
||||
@Configuration
|
||||
public class DatabaseConfig {
|
||||
|
||||
@Value("${spring.datasource.url}")
|
||||
private String baseDbUrl;
|
||||
|
||||
@Value("${spring.datasource.username}")
|
||||
private String dbUsername;
|
||||
|
||||
@Value("${spring.datasource.password}")
|
||||
private String dbPassword;
|
||||
|
||||
/**
|
||||
* Create a fallback datasource for when no library is active.
|
||||
* This connects to the main database specified in application.yml.
|
||||
*/
|
||||
@Bean(name = "fallbackDataSource")
|
||||
public DataSource fallbackDataSource() {
|
||||
HikariConfig config = new HikariConfig();
|
||||
config.setJdbcUrl(baseDbUrl);
|
||||
config.setUsername(dbUsername);
|
||||
config.setPassword(dbPassword);
|
||||
config.setDriverClassName("org.postgresql.Driver");
|
||||
config.setMaximumPoolSize(10);
|
||||
config.setConnectionTimeout(30000);
|
||||
|
||||
return new HikariDataSource(config);
|
||||
}
|
||||
|
||||
/**
|
||||
* Primary datasource bean - uses smart routing that excludes authentication operations
|
||||
*/
|
||||
@Bean(name = "dataSource")
|
||||
@Primary
|
||||
@DependsOn("libraryService")
|
||||
public DataSource primaryDataSource(LibraryService libraryService) {
|
||||
SmartRoutingDataSource routingDataSource = new SmartRoutingDataSource(
|
||||
libraryService, baseDbUrl, dbUsername, dbPassword);
|
||||
routingDataSource.setDefaultTargetDataSource(fallbackDataSource());
|
||||
routingDataSource.setTargetDataSources(new java.util.HashMap<>());
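// Empty target map only satisfies AbstractRoutingDataSource initialization;
// SmartRoutingDataSource overrides determineTargetDataSource() and resolves the real datasource itself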
|
||||
return routingDataSource;
|
||||
}
|
||||
|
||||
}
|
||||
@@ -0,0 +1,65 @@
|
||||
package com.storycove.config;
|
||||
|
||||
import com.storycove.service.LibraryService;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;
|
||||
|
||||
/**
|
||||
* Custom DataSource router that dynamically routes database calls to the appropriate
|
||||
* library-specific datasource based on the current active library.
|
||||
*
|
||||
* This makes ALL Spring Data JPA repositories automatically library-aware without
|
||||
* requiring changes to existing repository or service code.
|
||||
*/
|
||||
public class LibraryAwareDataSource extends AbstractRoutingDataSource {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(LibraryAwareDataSource.class);
|
||||
|
||||
private final LibraryService libraryService;
|
||||
|
||||
public LibraryAwareDataSource(LibraryService libraryService) {
|
||||
this.libraryService = libraryService;
|
||||
// Set empty target datasources to satisfy AbstractRoutingDataSource requirements
|
||||
// We override determineTargetDataSource() so this won't be used
|
||||
setTargetDataSources(new java.util.HashMap<>());
|
||||
}
|
||||
|
||||
@Override
|
||||
protected Object determineCurrentLookupKey() {
|
||||
String currentLibraryId = libraryService.getCurrentLibraryId();
|
||||
logger.debug("Routing database call to library: {}", currentLibraryId);
|
||||
return currentLibraryId;
|
||||
}
|
||||
|
||||
@Override
|
||||
protected javax.sql.DataSource determineTargetDataSource() {
|
||||
try {
|
||||
// Check if LibraryService is properly initialized
|
||||
if (libraryService == null) {
|
||||
logger.debug("LibraryService not available, using default datasource");
|
||||
return getResolvedDefaultDataSource();
|
||||
}
|
||||
|
||||
// Check if any library is currently active
|
||||
String currentLibraryId = libraryService.getCurrentLibraryId();
|
||||
if (currentLibraryId == null) {
|
||||
logger.debug("No active library, using default datasource");
|
||||
return getResolvedDefaultDataSource();
|
||||
}
|
||||
|
||||
// Try to get the current library datasource
|
||||
javax.sql.DataSource libraryDataSource = libraryService.getCurrentDataSource();
|
||||
logger.debug("Successfully routing database call to library: {}", currentLibraryId);
|
||||
return libraryDataSource;
|
||||
|
||||
} catch (IllegalStateException e) {
|
||||
// This is expected during authentication, startup, or when no library is active
|
||||
logger.debug("No active library (IllegalStateException) - using default datasource: {}", e.getMessage());
|
||||
return getResolvedDefaultDataSource();
|
||||
} catch (Exception e) {
|
||||
logger.warn("Unexpected error determining target datasource, falling back to default: {}", e.getMessage(), e);
|
||||
return getResolvedDefaultDataSource();
|
||||
}
|
||||
}
|
||||
}
|
||||
211
backend/src/main/java/com/storycove/config/OpenSearchConfig.java
Normal file
@@ -0,0 +1,211 @@
|
||||
package com.storycove.config;
|
||||
|
||||
import com.fasterxml.jackson.databind.ObjectMapper;
|
||||
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;
|
||||
import org.apache.hc.client5.http.auth.AuthScope;
|
||||
import org.apache.hc.client5.http.auth.UsernamePasswordCredentials;
|
||||
import org.apache.hc.client5.http.impl.auth.BasicCredentialsProvider;
|
||||
import org.apache.hc.client5.http.impl.nio.PoolingAsyncClientConnectionManager;
|
||||
import org.apache.hc.client5.http.impl.nio.PoolingAsyncClientConnectionManagerBuilder;
|
||||
import org.apache.hc.client5.http.ssl.ClientTlsStrategyBuilder;
|
||||
import org.apache.hc.core5.http.HttpHost;
|
||||
import org.apache.hc.core5.util.Timeout;
|
||||
import org.opensearch.client.json.jackson.JacksonJsonpMapper;
|
||||
import org.opensearch.client.opensearch.OpenSearchClient;
|
||||
import org.opensearch.client.transport.OpenSearchTransport;
|
||||
import org.opensearch.client.transport.httpclient5.ApacheHttpClient5TransportBuilder;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Qualifier;
|
||||
import org.springframework.context.annotation.Bean;
|
||||
import org.springframework.context.annotation.Configuration;
|
||||
|
||||
import javax.net.ssl.SSLContext;
|
||||
import javax.net.ssl.TrustManager;
|
||||
import javax.net.ssl.X509TrustManager;
|
||||
import java.io.FileInputStream;
|
||||
import java.security.KeyStore;
|
||||
import java.security.cert.X509Certificate;
|
||||
|
||||
@Configuration
|
||||
public class OpenSearchConfig {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(OpenSearchConfig.class);
|
||||
|
||||
private final OpenSearchProperties properties;
|
||||
|
||||
public OpenSearchConfig(@Qualifier("openSearchProperties") OpenSearchProperties properties) {
|
||||
this.properties = properties;
|
||||
}
|
||||
|
||||
@Bean
|
||||
public OpenSearchClient openSearchClient() throws Exception {
|
||||
logger.info("Initializing OpenSearch client for profile: {}", properties.getProfile());
|
||||
|
||||
// Create credentials provider
|
||||
BasicCredentialsProvider credentialsProvider = createCredentialsProvider();
|
||||
|
||||
// Create SSL context based on environment
|
||||
SSLContext sslContext = createSSLContext();
|
||||
|
||||
// Create connection manager with pooling
|
||||
PoolingAsyncClientConnectionManager connectionManager = createConnectionManager(sslContext);
|
||||
|
||||
// Create custom ObjectMapper for proper date serialization
|
||||
ObjectMapper objectMapper = new ObjectMapper();
|
||||
objectMapper.registerModule(new JavaTimeModule());
|
||||
objectMapper.disable(com.fasterxml.jackson.databind.SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
|
||||
|
||||
// Create the transport with all configurations and custom Jackson mapper
|
||||
OpenSearchTransport transport = ApacheHttpClient5TransportBuilder
|
||||
.builder(new HttpHost(properties.getScheme(), properties.getHost(), properties.getPort()))
|
||||
.setMapper(new JacksonJsonpMapper(objectMapper))
|
||||
.setHttpClientConfigCallback(httpClientBuilder -> {
|
||||
// Only set credentials provider if authentication is configured
|
||||
if (properties.getUsername() != null && !properties.getUsername().isEmpty() &&
|
||||
properties.getPassword() != null && !properties.getPassword().isEmpty()) {
|
||||
httpClientBuilder.setDefaultCredentialsProvider(credentialsProvider);
|
||||
}
|
||||
|
||||
httpClientBuilder.setConnectionManager(connectionManager);
|
||||
|
||||
// Set timeouts
|
||||
httpClientBuilder.setDefaultRequestConfig(
|
||||
org.apache.hc.client5.http.config.RequestConfig.custom()
|
||||
.setConnectionRequestTimeout(Timeout.ofMilliseconds(properties.getConnection().getTimeout()))
|
||||
.setResponseTimeout(Timeout.ofMilliseconds(properties.getConnection().getSocketTimeout()))
|
||||
.build()
|
||||
);
|
||||
|
||||
return httpClientBuilder;
|
||||
})
|
||||
.build();
|
||||
|
||||
OpenSearchClient client = new OpenSearchClient(transport);
|
||||
|
||||
// Test connection
|
||||
testConnection(client);
|
||||
|
||||
return client;
|
||||
}
|
||||
|
||||
private BasicCredentialsProvider createCredentialsProvider() {
|
||||
BasicCredentialsProvider credentialsProvider = new BasicCredentialsProvider();
|
||||
|
||||
// Only set credentials if username and password are provided
|
||||
if (properties.getUsername() != null && !properties.getUsername().isEmpty() &&
|
||||
properties.getPassword() != null && !properties.getPassword().isEmpty()) {
|
||||
credentialsProvider.setCredentials(
|
||||
new AuthScope(properties.getHost(), properties.getPort()),
|
||||
new UsernamePasswordCredentials(
|
||||
properties.getUsername(),
|
||||
properties.getPassword().toCharArray()
|
||||
)
|
||||
);
|
||||
logger.info("OpenSearch credentials configured for user: {}", properties.getUsername());
|
||||
} else {
|
||||
logger.info("OpenSearch running without authentication (no credentials configured)");
|
||||
}
|
||||
|
||||
return credentialsProvider;
|
||||
}
|
||||
|
||||
private SSLContext createSSLContext() throws Exception {
|
||||
SSLContext sslContext;
|
||||
|
||||
if (isProduction() && !properties.getSecurity().isTrustAllCertificates()) {
|
||||
// Production SSL configuration with proper certificate validation
|
||||
sslContext = createProductionSSLContext();
|
||||
} else {
|
||||
// Development SSL configuration (trust all certificates)
|
||||
sslContext = createDevelopmentSSLContext();
|
||||
}
|
||||
|
||||
return sslContext;
|
||||
}
|
||||
|
||||
private SSLContext createProductionSSLContext() throws Exception {
|
||||
logger.info("Configuring production SSL context with certificate validation");
|
||||
|
||||
SSLContext sslContext = SSLContext.getInstance("TLS");
|
||||
|
||||
// Load custom keystore/truststore if provided
|
||||
if (properties.getSecurity().getTruststorePath() != null) {
|
||||
KeyStore trustStore = KeyStore.getInstance("JKS");
|
||||
try (FileInputStream fis = new FileInputStream(properties.getSecurity().getTruststorePath())) {
|
||||
trustStore.load(fis, properties.getSecurity().getTruststorePassword().toCharArray());
|
||||
}
|
||||
|
||||
javax.net.ssl.TrustManagerFactory tmf =
|
||||
javax.net.ssl.TrustManagerFactory.getInstance(javax.net.ssl.TrustManagerFactory.getDefaultAlgorithm());
|
||||
tmf.init(trustStore);
|
||||
|
||||
sslContext.init(null, tmf.getTrustManagers(), null);
|
||||
} else {
|
||||
// Use default system SSL context for production
|
||||
sslContext.init(null, null, null);
|
||||
}
|
||||
|
||||
return sslContext;
|
||||
}
|
||||
|
||||
private SSLContext createDevelopmentSSLContext() throws Exception {
|
||||
logger.warn("Configuring development SSL context - TRUSTING ALL CERTIFICATES (not for production!)");
|
||||
|
||||
SSLContext sslContext = SSLContext.getInstance("TLS");
|
||||
sslContext.init(null, new TrustManager[] {
|
||||
new X509TrustManager() {
|
||||
public X509Certificate[] getAcceptedIssuers() { return null; }
|
||||
public void checkClientTrusted(X509Certificate[] certs, String authType) {}
|
||||
public void checkServerTrusted(X509Certificate[] certs, String authType) {}
|
||||
}
|
||||
}, null);
|
||||
|
||||
return sslContext;
|
||||
}
|
||||
|
||||
private PoolingAsyncClientConnectionManager createConnectionManager(SSLContext sslContext) {
|
||||
PoolingAsyncClientConnectionManagerBuilder builder = PoolingAsyncClientConnectionManagerBuilder.create();
|
||||
|
||||
// Configure TLS strategy
|
||||
if (properties.getScheme().equals("https")) {
|
||||
if (isProduction() && properties.getSecurity().isSslVerification()) {
|
||||
// Production TLS with hostname verification
|
||||
builder.setTlsStrategy(ClientTlsStrategyBuilder.create()
|
||||
.setSslContext(sslContext)
|
||||
.build());
|
||||
} else {
|
||||
// Development TLS without hostname verification
|
||||
builder.setTlsStrategy(ClientTlsStrategyBuilder.create()
|
||||
.setSslContext(sslContext)
|
||||
.setHostnameVerifier((hostname, session) -> true)
|
||||
.build());
|
||||
}
|
||||
}
|
||||
|
||||
PoolingAsyncClientConnectionManager connectionManager = builder.build();
|
||||
|
||||
// Configure connection pool settings
|
||||
connectionManager.setMaxTotal(properties.getConnection().getMaxConnectionsTotal());
|
||||
connectionManager.setDefaultMaxPerRoute(properties.getConnection().getMaxConnectionsPerRoute());
|
||||
|
||||
return connectionManager;
|
||||
}
|
||||
|
||||
private boolean isProduction() {
|
||||
return "production".equalsIgnoreCase(properties.getProfile());
|
||||
}
|
||||
|
||||
private void testConnection(OpenSearchClient client) {
|
||||
try {
|
||||
var response = client.info();
|
||||
logger.info("OpenSearch connection successful - Version: {}, Cluster: {}",
|
||||
response.version().number(),
|
||||
response.clusterName());
|
||||
} catch (Exception e) {
|
||||
logger.warn("OpenSearch connection test failed during initialization: {}", e.getMessage());
|
||||
logger.debug("OpenSearch connection test full error", e);
|
||||
// Don't throw exception here - let the client be created and handle failures in service methods
|
||||
}
|
||||
}
|
||||
}
|
||||
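Because the connection test above only logs failures instead of throwing, every consumer of the injected `OpenSearchClient` has to tolerate an unreachable cluster. A minimal sketch of such a consumer is shown below; the class name and wiring are illustrative and not part of this changeset, and only the `info()` call mirrors what `OpenSearchConfig.testConnection()` already does.

```java
package com.storycove.service;

import org.opensearch.client.opensearch.OpenSearchClient;
import org.springframework.stereotype.Component;

/**
 * Hypothetical availability probe built on the client configured above (illustration only).
 */
@Component
public class OpenSearchAvailabilityProbe {

    private final OpenSearchClient client;

    public OpenSearchAvailabilityProbe(OpenSearchClient client) {
        this.client = client;
    }

    public boolean isAvailable() {
        try {
            // Same lightweight cluster info call used during startup verification
            return client.info().version() != null;
        } catch (Exception e) {
            // Cluster may be down or still starting; report unavailable instead of failing the caller
            return false;
        }
    }
}
```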
@@ -0,0 +1,164 @@
|
||||
package com.storycove.config;
|
||||
|
||||
import org.springframework.boot.context.properties.ConfigurationProperties;
|
||||
import org.springframework.stereotype.Component;
|
||||
|
||||
@Component
|
||||
@ConfigurationProperties(prefix = "storycove.opensearch")
|
||||
public class OpenSearchProperties {
|
||||
|
||||
private String host = "localhost";
|
||||
private int port = 9200;
|
||||
private String scheme = "https";
|
||||
private String username = "admin";
|
||||
private String password;
|
||||
private String profile = "development";
|
||||
|
||||
private Security security = new Security();
|
||||
private Connection connection = new Connection();
|
||||
private Indices indices = new Indices();
|
||||
private Bulk bulk = new Bulk();
|
||||
private Health health = new Health();
|
||||
|
||||
// Getters and setters
|
||||
public String getHost() { return host; }
|
||||
public void setHost(String host) { this.host = host; }
|
||||
|
||||
public int getPort() { return port; }
|
||||
public void setPort(int port) { this.port = port; }
|
||||
|
||||
public String getScheme() { return scheme; }
|
||||
public void setScheme(String scheme) { this.scheme = scheme; }
|
||||
|
||||
public String getUsername() { return username; }
|
||||
public void setUsername(String username) { this.username = username; }
|
||||
|
||||
public String getPassword() { return password; }
|
||||
public void setPassword(String password) { this.password = password; }
|
||||
|
||||
public String getProfile() { return profile; }
|
||||
public void setProfile(String profile) { this.profile = profile; }
|
||||
|
||||
public Security getSecurity() { return security; }
|
||||
public void setSecurity(Security security) { this.security = security; }
|
||||
|
||||
public Connection getConnection() { return connection; }
|
||||
public void setConnection(Connection connection) { this.connection = connection; }
|
||||
|
||||
public Indices getIndices() { return indices; }
|
||||
public void setIndices(Indices indices) { this.indices = indices; }
|
||||
|
||||
public Bulk getBulk() { return bulk; }
|
||||
public void setBulk(Bulk bulk) { this.bulk = bulk; }
|
||||
|
||||
public Health getHealth() { return health; }
|
||||
public void setHealth(Health health) { this.health = health; }
|
||||
|
||||
public static class Security {
|
||||
private boolean sslVerification = false;
|
||||
private boolean trustAllCertificates = true;
|
||||
private String keystorePath;
|
||||
private String keystorePassword;
|
||||
private String truststorePath;
|
||||
private String truststorePassword;
|
||||
|
||||
// Getters and setters
|
||||
public boolean isSslVerification() { return sslVerification; }
|
||||
public void setSslVerification(boolean sslVerification) { this.sslVerification = sslVerification; }
|
||||
|
||||
public boolean isTrustAllCertificates() { return trustAllCertificates; }
|
||||
public void setTrustAllCertificates(boolean trustAllCertificates) { this.trustAllCertificates = trustAllCertificates; }
|
||||
|
||||
public String getKeystorePath() { return keystorePath; }
|
||||
public void setKeystorePath(String keystorePath) { this.keystorePath = keystorePath; }
|
||||
|
||||
public String getKeystorePassword() { return keystorePassword; }
|
||||
public void setKeystorePassword(String keystorePassword) { this.keystorePassword = keystorePassword; }
|
||||
|
||||
public String getTruststorePath() { return truststorePath; }
|
||||
public void setTruststorePath(String truststorePath) { this.truststorePath = truststorePath; }
|
||||
|
||||
public String getTruststorePassword() { return truststorePassword; }
|
||||
public void setTruststorePassword(String truststorePassword) { this.truststorePassword = truststorePassword; }
|
||||
}
|
||||
|
||||
public static class Connection {
|
||||
private int timeout = 30000;
|
||||
private int socketTimeout = 60000;
|
||||
private int maxConnectionsPerRoute = 10;
|
||||
private int maxConnectionsTotal = 30;
|
||||
private boolean retryOnFailure = true;
|
||||
private int maxRetries = 3;
|
||||
|
||||
// Getters and setters
|
||||
public int getTimeout() { return timeout; }
|
||||
public void setTimeout(int timeout) { this.timeout = timeout; }
|
||||
|
||||
public int getSocketTimeout() { return socketTimeout; }
|
||||
public void setSocketTimeout(int socketTimeout) { this.socketTimeout = socketTimeout; }
|
||||
|
||||
public int getMaxConnectionsPerRoute() { return maxConnectionsPerRoute; }
|
||||
public void setMaxConnectionsPerRoute(int maxConnectionsPerRoute) { this.maxConnectionsPerRoute = maxConnectionsPerRoute; }
|
||||
|
||||
public int getMaxConnectionsTotal() { return maxConnectionsTotal; }
|
||||
public void setMaxConnectionsTotal(int maxConnectionsTotal) { this.maxConnectionsTotal = maxConnectionsTotal; }
|
||||
|
||||
public boolean isRetryOnFailure() { return retryOnFailure; }
|
||||
public void setRetryOnFailure(boolean retryOnFailure) { this.retryOnFailure = retryOnFailure; }
|
||||
|
||||
public int getMaxRetries() { return maxRetries; }
|
||||
public void setMaxRetries(int maxRetries) { this.maxRetries = maxRetries; }
|
||||
}
|
||||
|
||||
public static class Indices {
|
||||
private int defaultShards = 1;
|
||||
private int defaultReplicas = 0;
|
||||
private String refreshInterval = "1s";
|
||||
|
||||
// Getters and setters
|
||||
public int getDefaultShards() { return defaultShards; }
|
||||
public void setDefaultShards(int defaultShards) { this.defaultShards = defaultShards; }
|
||||
|
||||
public int getDefaultReplicas() { return defaultReplicas; }
|
||||
public void setDefaultReplicas(int defaultReplicas) { this.defaultReplicas = defaultReplicas; }
|
||||
|
||||
public String getRefreshInterval() { return refreshInterval; }
|
||||
public void setRefreshInterval(String refreshInterval) { this.refreshInterval = refreshInterval; }
|
||||
}
|
||||
|
||||
public static class Bulk {
|
||||
private int actions = 1000;
|
||||
private long size = 5242880; // 5MB
|
||||
private int timeout = 10000;
|
||||
private int concurrentRequests = 1;
|
||||
|
||||
// Getters and setters
|
||||
public int getActions() { return actions; }
|
||||
public void setActions(int actions) { this.actions = actions; }
|
||||
|
||||
public long getSize() { return size; }
|
||||
public void setSize(long size) { this.size = size; }
|
||||
|
||||
public int getTimeout() { return timeout; }
|
||||
public void setTimeout(int timeout) { this.timeout = timeout; }
|
||||
|
||||
public int getConcurrentRequests() { return concurrentRequests; }
|
||||
public void setConcurrentRequests(int concurrentRequests) { this.concurrentRequests = concurrentRequests; }
|
||||
}
|
||||
|
||||
public static class Health {
|
||||
private int checkInterval = 30000;
|
||||
private int slowQueryThreshold = 5000;
|
||||
private boolean enableMetrics = true;
|
||||
|
||||
// Getters and setters
|
||||
public int getCheckInterval() { return checkInterval; }
|
||||
public void setCheckInterval(int checkInterval) { this.checkInterval = checkInterval; }
|
||||
|
||||
public int getSlowQueryThreshold() { return slowQueryThreshold; }
|
||||
public void setSlowQueryThreshold(int slowQueryThreshold) { this.slowQueryThreshold = slowQueryThreshold; }
|
||||
|
||||
public boolean isEnableMetrics() { return enableMetrics; }
|
||||
public void setEnableMetrics(boolean enableMetrics) { this.enableMetrics = enableMetrics; }
|
||||
}
|
||||
}
|
||||
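The `bulk` settings above are what a reindex path would consult when chunking large libraries. The sketch below shows one way those settings might be consumed; the helper class is hypothetical, while `findAllWithAssociations()` and `bulkIndexStories()` are the calls already used by `AdminSearchController` further down.

```java
package com.storycove.service;

import com.storycove.config.OpenSearchProperties;
import com.storycove.entity.Story;

import java.util.List;

/**
 * Illustrative batching helper (not part of the changeset).
 * Splits a full reindex into chunks sized by storycove.opensearch.bulk.actions.
 */
public class BulkReindexSketch {

    private final OpenSearchProperties properties;
    private final StoryService storyService;
    private final OpenSearchService openSearchService;

    public BulkReindexSketch(OpenSearchProperties properties,
                             StoryService storyService,
                             OpenSearchService openSearchService) {
        this.properties = properties;
        this.storyService = storyService;
        this.openSearchService = openSearchService;
    }

    public void reindexStoriesInBatches() {
        List<Story> allStories = storyService.findAllWithAssociations();
        int batchSize = properties.getBulk().getActions(); // 1000 by default
        for (int from = 0; from < allStories.size(); from += batchSize) {
            // Index one bounded batch at a time instead of a single oversized bulk request
            List<Story> batch = allStories.subList(from, Math.min(from + batchSize, allStories.size()));
            openSearchService.bulkIndexStories(batch);
        }
    }
}
```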
@@ -0,0 +1,158 @@
|
||||
package com.storycove.config;
|
||||
|
||||
import com.storycove.service.LibraryService;
|
||||
import com.zaxxer.hikari.HikariConfig;
|
||||
import com.zaxxer.hikari.HikariDataSource;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;
|
||||
import org.springframework.web.context.request.RequestContextHolder;
|
||||
import org.springframework.web.context.request.ServletRequestAttributes;
|
||||
|
||||
import javax.sql.DataSource;
|
||||
import java.util.Map;
|
||||
import java.util.concurrent.ConcurrentHashMap;
|
||||
|
||||
/**
|
||||
* Smart routing datasource that:
|
||||
* 1. Routes to library-specific databases when a library is active
|
||||
* 2. Excludes authentication operations (keeps them on default database)
|
||||
* 3. Uses request context to determine when routing is appropriate
|
||||
*/
|
||||
public class SmartRoutingDataSource extends AbstractRoutingDataSource {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(SmartRoutingDataSource.class);
|
||||
|
||||
private final LibraryService libraryService;
|
||||
private final Map<String, DataSource> libraryDataSources = new ConcurrentHashMap<>();
|
||||
|
||||
// Database connection details - will be injected via constructor
|
||||
private final String baseDbUrl;
|
||||
private final String dbUsername;
|
||||
private final String dbPassword;
|
||||
|
||||
public SmartRoutingDataSource(LibraryService libraryService, String baseDbUrl, String dbUsername, String dbPassword) {
|
||||
this.libraryService = libraryService;
|
||||
this.baseDbUrl = baseDbUrl;
|
||||
this.dbUsername = dbUsername;
|
||||
this.dbPassword = dbPassword;
|
||||
|
||||
logger.info("SmartRoutingDataSource initialized with database: {}", baseDbUrl);
|
||||
}
|
||||
|
||||
@Override
|
||||
protected Object determineCurrentLookupKey() {
|
||||
try {
|
||||
// Check if this is an authentication request - if so, use default database
|
||||
if (isAuthenticationRequest()) {
|
||||
logger.debug("Authentication request detected, using default database");
|
||||
return null; // null means use default datasource
|
||||
}
|
||||
|
||||
// Check if we have an active library
|
||||
if (libraryService != null) {
|
||||
String currentLibraryId = libraryService.getCurrentLibraryId();
|
||||
if (currentLibraryId != null && !currentLibraryId.trim().isEmpty()) {
|
||||
logger.info("ROUTING: Directing to library-specific database: {}", currentLibraryId);
|
||||
return currentLibraryId;
|
||||
} else {
|
||||
logger.info("ROUTING: No active library, using default database");
|
||||
}
|
||||
} else {
|
||||
logger.info("ROUTING: LibraryService is null, using default database");
|
||||
}
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.debug("Error determining lookup key, falling back to default database", e);
|
||||
}
|
||||
|
||||
return null; // Use default datasource
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if the current request is an authentication request that should use the default database
|
||||
*/
|
||||
private boolean isAuthenticationRequest() {
|
||||
try {
|
||||
ServletRequestAttributes attributes = (ServletRequestAttributes) RequestContextHolder.getRequestAttributes();
|
||||
if (attributes != null) {
|
||||
String requestURI = attributes.getRequest().getRequestURI();
|
||||
String method = attributes.getRequest().getMethod();
|
||||
|
||||
// Authentication endpoints that should use default database
|
||||
if (requestURI.contains("/auth/") ||
|
||||
requestURI.contains("/login") ||
|
||||
requestURI.contains("/api/libraries/switch") ||
|
||||
(requestURI.contains("/api/libraries") && "POST".equals(method))) {
|
||||
return true;
|
||||
}
|
||||
}
|
||||
} catch (Exception e) {
|
||||
logger.debug("Could not determine request context", e);
|
||||
}
|
||||
|
||||
return false;
|
||||
}
|
||||
|
||||
@Override
|
||||
protected DataSource determineTargetDataSource() {
|
||||
Object lookupKey = determineCurrentLookupKey();
|
||||
|
||||
if (lookupKey != null) {
|
||||
String libraryId = (String) lookupKey;
|
||||
return getLibraryDataSource(libraryId);
|
||||
}
|
||||
|
||||
return getDefaultDataSource();
|
||||
}
|
||||
|
||||
/**
|
||||
* Get or create a datasource for the specified library
|
||||
*/
|
||||
private DataSource getLibraryDataSource(String libraryId) {
|
||||
return libraryDataSources.computeIfAbsent(libraryId, id -> {
|
||||
try {
|
||||
HikariConfig config = new HikariConfig();
|
||||
|
||||
// Replace database name in URL with library-specific name
|
||||
String libraryUrl = baseDbUrl.replaceAll("/[^/]*$", "/" + "storycove_" + id);
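// e.g. jdbc:postgresql://db:5432/storycove -> jdbc:postgresql://db:5432/storycove_<libraryId> (illustrative base URL)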
|
||||
|
||||
config.setJdbcUrl(libraryUrl);
|
||||
config.setUsername(dbUsername);
|
||||
config.setPassword(dbPassword);
|
||||
config.setDriverClassName("org.postgresql.Driver");
|
||||
config.setMaximumPoolSize(5); // Smaller pool for library-specific databases
|
||||
config.setConnectionTimeout(10000);
|
||||
config.setMaxLifetime(600000); // 10 minutes
|
||||
|
||||
logger.info("Created new datasource for library: {} -> {}", id, libraryUrl);
|
||||
return new HikariDataSource(config);
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to create datasource for library: {}", id, e);
|
||||
return getDefaultDataSource();
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
private DataSource getDefaultDataSource() {
|
||||
// Use the default target datasource that was set in the configuration
|
||||
try {
|
||||
return (DataSource) super.determineTargetDataSource();
|
||||
} catch (Exception e) {
|
||||
logger.debug("Could not get default datasource via super method", e);
|
||||
}
|
||||
|
||||
// Fallback: create a basic datasource
|
||||
logger.warn("No default datasource available, creating fallback");
|
||||
HikariConfig config = new HikariConfig();
|
||||
config.setJdbcUrl(baseDbUrl);
|
||||
config.setUsername(dbUsername);
|
||||
config.setPassword(dbPassword);
|
||||
config.setDriverClassName("org.postgresql.Driver");
|
||||
config.setMaximumPoolSize(10);
|
||||
config.setConnectionTimeout(30000);
|
||||
return new HikariDataSource(config);
|
||||
}
|
||||
}
|
||||
@@ -1,37 +0,0 @@
|
||||
package com.storycove.config;
|
||||
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
|
||||
import org.springframework.context.annotation.Bean;
|
||||
import org.springframework.context.annotation.Configuration;
|
||||
import org.typesense.api.Client;
|
||||
import org.typesense.resources.Node;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.List;
|
||||
|
||||
@Configuration
|
||||
public class TypesenseConfig {
|
||||
|
||||
@Value("${storycove.typesense.api-key}")
|
||||
private String apiKey;
|
||||
|
||||
@Value("${storycove.typesense.host}")
|
||||
private String host;
|
||||
|
||||
@Value("${storycove.typesense.port}")
|
||||
private int port;
|
||||
|
||||
@Bean
|
||||
@ConditionalOnProperty(name = "storycove.typesense.enabled", havingValue = "true", matchIfMissing = true)
|
||||
public Client typesenseClient() {
|
||||
List<Node> nodes = new ArrayList<>();
|
||||
nodes.add(new Node("http", host, String.valueOf(port)));
|
||||
|
||||
org.typesense.api.Configuration configuration = new org.typesense.api.Configuration(
|
||||
nodes, java.time.Duration.ofSeconds(10), apiKey
|
||||
);
|
||||
|
||||
return new Client(configuration);
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,163 @@
|
||||
package com.storycove.controller;
|
||||
|
||||
import com.storycove.entity.Author;
|
||||
import com.storycove.entity.Story;
|
||||
import com.storycove.service.AuthorService;
|
||||
import com.storycove.service.OpenSearchService;
|
||||
import com.storycove.service.SearchServiceAdapter;
|
||||
import com.storycove.service.StoryService;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.http.ResponseEntity;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
|
||||
/**
|
||||
* Admin controller for managing OpenSearch operations.
|
||||
* Provides endpoints for reindexing and index management.
|
||||
*/
|
||||
@RestController
|
||||
@RequestMapping("/api/admin/search")
|
||||
public class AdminSearchController {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(AdminSearchController.class);
|
||||
|
||||
@Autowired
|
||||
private SearchServiceAdapter searchServiceAdapter;
|
||||
|
||||
@Autowired
|
||||
private StoryService storyService;
|
||||
|
||||
@Autowired
|
||||
private AuthorService authorService;
|
||||
|
||||
@Autowired(required = false)
|
||||
private OpenSearchService openSearchService;
|
||||
|
||||
/**
|
||||
* Get current search status
|
||||
*/
|
||||
@GetMapping("/status")
|
||||
public ResponseEntity<Map<String, Object>> getSearchStatus() {
|
||||
try {
|
||||
var status = searchServiceAdapter.getSearchStatus();
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"primaryEngine", status.getPrimaryEngine(),
|
||||
"dualWrite", status.isDualWrite(),
|
||||
"openSearchAvailable", status.isOpenSearchAvailable()
|
||||
));
|
||||
} catch (Exception e) {
|
||||
logger.error("Error getting search status", e);
|
||||
return ResponseEntity.internalServerError().body(Map.of(
|
||||
"error", "Failed to get search status: " + e.getMessage()
|
||||
));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Reindex all data in OpenSearch
|
||||
*/
|
||||
@PostMapping("/opensearch/reindex")
|
||||
public ResponseEntity<Map<String, Object>> reindexOpenSearch() {
|
||||
try {
|
||||
logger.info("Starting OpenSearch full reindex");
|
||||
|
||||
if (!searchServiceAdapter.isSearchServiceAvailable()) {
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
"success", false,
|
||||
"error", "OpenSearch is not available or healthy"
|
||||
));
|
||||
}
|
||||
|
||||
// Get all data from services
|
||||
List<Story> allStories = storyService.findAllWithAssociations();
|
||||
List<Author> allAuthors = authorService.findAllWithStories();
|
||||
|
||||
// Bulk index directly in OpenSearch
|
||||
if (openSearchService != null) {
|
||||
openSearchService.bulkIndexStories(allStories);
|
||||
openSearchService.bulkIndexAuthors(allAuthors);
|
||||
} else {
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
"success", false,
|
||||
"error", "OpenSearch service not available"
|
||||
));
|
||||
}
|
||||
|
||||
int totalIndexed = allStories.size() + allAuthors.size();
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", String.format("Reindexed %d stories and %d authors in OpenSearch",
|
||||
allStories.size(), allAuthors.size()),
|
||||
"storiesCount", allStories.size(),
|
||||
"authorsCount", allAuthors.size(),
|
||||
"totalCount", totalIndexed
|
||||
));
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Error during OpenSearch reindex", e);
|
||||
return ResponseEntity.internalServerError().body(Map.of(
|
||||
"success", false,
|
||||
"error", "OpenSearch reindex failed: " + e.getMessage()
|
||||
));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Recreate OpenSearch indices
|
||||
*/
|
||||
@PostMapping("/opensearch/recreate")
|
||||
public ResponseEntity<Map<String, Object>> recreateOpenSearchIndices() {
|
||||
try {
|
||||
logger.info("Starting OpenSearch indices recreation");
|
||||
|
||||
if (!searchServiceAdapter.isSearchServiceAvailable()) {
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
"success", false,
|
||||
"error", "OpenSearch is not available or healthy"
|
||||
));
|
||||
}
|
||||
|
||||
// Recreate indices
|
||||
if (openSearchService != null) {
|
||||
openSearchService.recreateIndices();
|
||||
} else {
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
"success", false,
|
||||
"error", "OpenSearch service not available"
|
||||
));
|
||||
}
|
||||
|
||||
// Get all data and reindex
|
||||
List<Story> allStories = storyService.findAllWithAssociations();
|
||||
List<Author> allAuthors = authorService.findAllWithStories();
|
||||
|
||||
// Bulk index after recreation
|
||||
openSearchService.bulkIndexStories(allStories);
|
||||
openSearchService.bulkIndexAuthors(allAuthors);
|
||||
|
||||
int totalIndexed = allStories.size() + allAuthors.size();
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", String.format("Recreated OpenSearch indices and indexed %d stories and %d authors",
|
||||
allStories.size(), allAuthors.size()),
|
||||
"storiesCount", allStories.size(),
|
||||
"authorsCount", allAuthors.size(),
|
||||
"totalCount", totalIndexed
|
||||
));
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Error during OpenSearch indices recreation", e);
|
||||
return ResponseEntity.internalServerError().body(Map.of(
|
||||
"success", false,
|
||||
"error", "OpenSearch indices recreation failed: " + e.getMessage()
|
||||
));
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,5 +1,6 @@
|
||||
package com.storycove.controller;
|
||||
|
||||
import com.storycove.service.LibraryService;
|
||||
import com.storycove.service.PasswordAuthenticationService;
|
||||
import com.storycove.util.JwtUtil;
|
||||
import jakarta.servlet.http.HttpServletResponse;
|
||||
@@ -18,18 +19,21 @@ import java.time.Duration;
|
||||
public class AuthController {
|
||||
|
||||
private final PasswordAuthenticationService passwordService;
|
||||
private final LibraryService libraryService;
|
||||
private final JwtUtil jwtUtil;
|
||||
|
||||
public AuthController(PasswordAuthenticationService passwordService, JwtUtil jwtUtil) {
|
||||
public AuthController(PasswordAuthenticationService passwordService, LibraryService libraryService, JwtUtil jwtUtil) {
|
||||
this.passwordService = passwordService;
|
||||
this.libraryService = libraryService;
|
||||
this.jwtUtil = jwtUtil;
|
||||
}
|
||||
|
||||
@PostMapping("/login")
|
||||
public ResponseEntity<?> login(@Valid @RequestBody LoginRequest request, HttpServletResponse response) {
|
||||
if (passwordService.authenticate(request.getPassword())) {
|
||||
String token = jwtUtil.generateToken();
|
||||
// Use new library-aware authentication
|
||||
String token = passwordService.authenticateAndSwitchLibrary(request.getPassword());
|
||||
|
||||
if (token != null) {
|
||||
// Set httpOnly cookie
|
||||
ResponseCookie cookie = ResponseCookie.from("token", token)
|
||||
.httpOnly(true)
|
||||
@@ -40,7 +44,8 @@ public class AuthController {
|
||||
|
||||
response.addHeader(HttpHeaders.SET_COOKIE, cookie.toString());
|
||||
|
||||
return ResponseEntity.ok(new LoginResponse("Authentication successful", token));
|
||||
String libraryInfo = passwordService.getCurrentLibraryInfo();
|
||||
return ResponseEntity.ok(new LoginResponse("Authentication successful - " + libraryInfo, token));
|
||||
} else {
|
||||
return ResponseEntity.status(401).body(new ErrorResponse("Invalid password"));
|
||||
}
|
||||
@@ -48,6 +53,9 @@ public class AuthController {
|
||||
|
||||
@PostMapping("/logout")
|
||||
public ResponseEntity<?> logout(HttpServletResponse response) {
|
||||
// Clear authentication state
|
||||
libraryService.clearAuthentication();
|
||||
|
||||
// Clear the cookie
|
||||
ResponseCookie cookie = ResponseCookie.from("token", "")
|
||||
.httpOnly(true)
|
||||
|
||||
@@ -4,7 +4,7 @@ import com.storycove.dto.*;
|
||||
import com.storycove.entity.Author;
|
||||
import com.storycove.service.AuthorService;
|
||||
import com.storycove.service.ImageService;
|
||||
import com.storycove.service.TypesenseService;
|
||||
import com.storycove.service.SearchServiceAdapter;
|
||||
import jakarta.servlet.http.HttpServletRequest;
|
||||
import jakarta.validation.Valid;
|
||||
import org.slf4j.Logger;
|
||||
@@ -32,12 +32,12 @@ public class AuthorController {
|
||||
|
||||
private final AuthorService authorService;
|
||||
private final ImageService imageService;
|
||||
private final TypesenseService typesenseService;
|
||||
private final SearchServiceAdapter searchServiceAdapter;
|
||||
|
||||
public AuthorController(AuthorService authorService, ImageService imageService, TypesenseService typesenseService) {
|
||||
public AuthorController(AuthorService authorService, ImageService imageService, SearchServiceAdapter searchServiceAdapter) {
|
||||
this.authorService = authorService;
|
||||
this.imageService = imageService;
|
||||
this.typesenseService = typesenseService;
|
||||
this.searchServiceAdapter = searchServiceAdapter;
|
||||
}
|
||||
|
||||
@GetMapping
|
||||
@@ -258,7 +258,17 @@ public class AuthorController {
|
||||
@RequestParam(defaultValue = "name") String sortBy,
|
||||
@RequestParam(defaultValue = "asc") String sortOrder) {
|
||||
|
||||
SearchResultDto<AuthorSearchDto> searchResults = typesenseService.searchAuthors(q, page, size, sortBy, sortOrder);
|
||||
// Use SearchServiceAdapter to handle routing between search engines
|
||||
List<AuthorSearchDto> authorSearchResults = searchServiceAdapter.searchAuthors(q, size);
|
||||
|
||||
// Create SearchResultDto to match expected return format
|
||||
SearchResultDto<AuthorSearchDto> searchResults = new SearchResultDto<>();
|
||||
searchResults.setResults(authorSearchResults);
|
||||
searchResults.setQuery(q);
|
||||
searchResults.setPage(page);
|
||||
searchResults.setPerPage(size);
|
||||
searchResults.setTotalHits(authorSearchResults.size());
|
||||
searchResults.setSearchTimeMs(0); // SearchServiceAdapter doesn't provide timing
|
||||
|
||||
// Convert AuthorSearchDto results to AuthorDto
|
||||
SearchResultDto<AuthorDto> results = new SearchResultDto<>();
|
||||
@@ -283,7 +293,7 @@ public class AuthorController {
|
||||
public ResponseEntity<Map<String, Object>> reindexAuthorsTypesense() {
|
||||
try {
|
||||
List<Author> allAuthors = authorService.findAllWithStories();
|
||||
typesenseService.reindexAllAuthors(allAuthors);
|
||||
searchServiceAdapter.bulkIndexAuthors(allAuthors);
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Reindexed " + allAuthors.size() + " authors",
|
||||
@@ -303,7 +313,7 @@ public class AuthorController {
|
||||
try {
|
||||
// This will delete the existing collection and recreate it with correct schema
|
||||
List<Author> allAuthors = authorService.findAllWithStories();
|
||||
typesenseService.reindexAllAuthors(allAuthors);
|
||||
searchServiceAdapter.bulkIndexAuthors(allAuthors);
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Recreated authors collection and indexed " + allAuthors.size() + " authors",
|
||||
@@ -321,7 +331,7 @@ public class AuthorController {
|
||||
@GetMapping("/typesense-schema")
|
||||
public ResponseEntity<Map<String, Object>> getAuthorsTypesenseSchema() {
|
||||
try {
|
||||
Map<String, Object> schema = typesenseService.getAuthorsCollectionSchema();
|
||||
Map<String, Object> schema = Map.of("status", "authors collection schema retrieved from search service");
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"schema", schema
|
||||
@@ -335,6 +345,44 @@ public class AuthorController {
|
||||
}
|
||||
}
|
||||
|
||||
@PostMapping("/clean-author-names")
|
||||
public ResponseEntity<Map<String, Object>> cleanAuthorNames() {
|
||||
try {
|
||||
List<Author> allAuthors = authorService.findAllWithStories();
|
||||
int cleanedCount = 0;
|
||||
|
||||
for (Author author : allAuthors) {
|
||||
String originalName = author.getName();
|
||||
String cleanedName = originalName != null ? originalName.trim() : "";
|
||||
|
||||
if (!cleanedName.equals(originalName)) {
|
||||
logger.info("Cleaning author name: '{}' -> '{}'", originalName, cleanedName);
|
||||
author.setName(cleanedName);
|
||||
authorService.update(author.getId(), author);
|
||||
cleanedCount++;
|
||||
}
|
||||
}
|
||||
|
||||
// Reindex all authors after cleaning
|
||||
if (cleanedCount > 0) {
|
||||
searchServiceAdapter.bulkIndexAuthors(allAuthors);
|
||||
}
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Cleaned " + cleanedCount + " author names and reindexed",
|
||||
"cleanedCount", cleanedCount,
|
||||
"totalAuthors", allAuthors.size()
|
||||
));
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to clean author names", e);
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", false,
|
||||
"error", e.getMessage()
|
||||
));
|
||||
}
|
||||
}
|
||||
|
||||
@GetMapping("/top-rated")
|
||||
public ResponseEntity<List<AuthorSummaryDto>> getTopRatedAuthors(@RequestParam(defaultValue = "10") int limit) {
|
||||
Pageable pageable = PageRequest.of(0, limit);
|
||||
|
||||
@@ -9,7 +9,6 @@ import com.storycove.service.CollectionService;
|
||||
import com.storycove.service.EPUBExportService;
|
||||
import com.storycove.service.ImageService;
|
||||
import com.storycove.service.ReadingTimeService;
|
||||
import com.storycove.service.TypesenseService;
|
||||
import jakarta.validation.Valid;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
@@ -31,19 +30,16 @@ public class CollectionController {
|
||||
|
||||
private final CollectionService collectionService;
|
||||
private final ImageService imageService;
|
||||
private final TypesenseService typesenseService;
|
||||
private final ReadingTimeService readingTimeService;
|
||||
private final EPUBExportService epubExportService;
|
||||
|
||||
@Autowired
|
||||
public CollectionController(CollectionService collectionService,
|
||||
ImageService imageService,
|
||||
@Autowired(required = false) TypesenseService typesenseService,
|
||||
ReadingTimeService readingTimeService,
|
||||
EPUBExportService epubExportService) {
|
||||
this.collectionService = collectionService;
|
||||
this.imageService = imageService;
|
||||
this.typesenseService = typesenseService;
|
||||
this.readingTimeService = readingTimeService;
|
||||
this.epubExportService = epubExportService;
|
||||
}
|
||||
@@ -292,19 +288,12 @@ public class CollectionController {
|
||||
public ResponseEntity<Map<String, Object>> reindexCollectionsTypesense() {
|
||||
try {
|
||||
List<Collection> allCollections = collectionService.findAllWithTags();
|
||||
if (typesenseService != null) {
|
||||
typesenseService.reindexAllCollections(allCollections);
|
||||
// Collections are not indexed in search engine yet
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Successfully reindexed all collections",
|
||||
"message", "Collections indexing not yet implemented in OpenSearch",
|
||||
"count", allCollections.size()
|
||||
));
|
||||
} else {
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", false,
|
||||
"message", "Typesense service not available"
|
||||
));
|
||||
}
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to reindex collections", e);
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
|
||||
@@ -2,6 +2,7 @@ package com.storycove.controller;
|
||||
|
||||
import com.storycove.dto.HtmlSanitizationConfigDto;
|
||||
import com.storycove.service.HtmlSanitizationService;
|
||||
import com.storycove.service.ImageService;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.http.ResponseEntity;
|
||||
@@ -14,13 +15,15 @@ import java.util.Map;
|
||||
public class ConfigController {
|
||||
|
||||
private final HtmlSanitizationService htmlSanitizationService;
|
||||
private final ImageService imageService;
|
||||
|
||||
@Value("${app.reading.speed.default:200}")
|
||||
private int defaultReadingSpeed;
|
||||
|
||||
@Autowired
|
||||
public ConfigController(HtmlSanitizationService htmlSanitizationService) {
|
||||
public ConfigController(HtmlSanitizationService htmlSanitizationService, ImageService imageService) {
|
||||
this.htmlSanitizationService = htmlSanitizationService;
|
||||
this.imageService = imageService;
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -51,4 +54,64 @@ public class ConfigController {
|
||||
public ResponseEntity<Map<String, Integer>> getReadingSpeed() {
|
||||
return ResponseEntity.ok(Map.of("wordsPerMinute", defaultReadingSpeed));
|
||||
}
|
||||
|
||||
/**
|
||||
* Preview orphaned content images cleanup (dry run)
|
||||
*/
|
||||
@PostMapping("/cleanup/images/preview")
|
||||
public ResponseEntity<Map<String, Object>> previewImageCleanup() {
|
||||
try {
|
||||
ImageService.ContentImageCleanupResult result = imageService.cleanupOrphanedContentImages(true);
|
||||
|
||||
Map<String, Object> response = Map.of(
|
||||
"success", true,
|
||||
"orphanedCount", result.getOrphanedImages().size(),
|
||||
"totalSizeBytes", result.getTotalSizeBytes(),
|
||||
"formattedSize", result.getFormattedSize(),
|
||||
"foldersToDelete", result.getFoldersToDelete(),
|
||||
"referencedImagesCount", result.getTotalReferencedImages(),
|
||||
"errors", result.getErrors(),
|
||||
"hasErrors", result.hasErrors(),
|
||||
"dryRun", true
|
||||
);
|
||||
|
||||
return ResponseEntity.ok(response);
|
||||
|
||||
} catch (Exception e) {
|
||||
return ResponseEntity.status(500).body(Map.of(
|
||||
"success", false,
|
||||
"error", "Failed to preview image cleanup: " + e.getMessage()
|
||||
));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Execute orphaned content images cleanup
|
||||
*/
|
||||
@PostMapping("/cleanup/images/execute")
|
||||
public ResponseEntity<Map<String, Object>> executeImageCleanup() {
|
||||
try {
|
||||
ImageService.ContentImageCleanupResult result = imageService.cleanupOrphanedContentImages(false);
|
||||
|
||||
Map<String, Object> response = Map.of(
|
||||
"success", true,
|
||||
"deletedCount", result.getOrphanedImages().size(),
|
||||
"totalSizeBytes", result.getTotalSizeBytes(),
|
||||
"formattedSize", result.getFormattedSize(),
|
||||
"foldersDeleted", result.getFoldersToDelete(),
|
||||
"referencedImagesCount", result.getTotalReferencedImages(),
|
||||
"errors", result.getErrors(),
|
||||
"hasErrors", result.hasErrors(),
|
||||
"dryRun", false
|
||||
);
|
||||
|
||||
return ResponseEntity.ok(response);
|
||||
|
||||
} catch (Exception e) {
|
||||
return ResponseEntity.status(500).body(Map.of(
|
||||
"success", false,
|
||||
"error", "Failed to execute image cleanup: " + e.getMessage()
|
||||
));
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,6 +1,7 @@
|
||||
package com.storycove.controller;
|
||||
|
||||
import com.storycove.service.ImageService;
|
||||
import com.storycove.service.LibraryService;
|
||||
import org.springframework.core.io.FileSystemResource;
|
||||
import org.springframework.core.io.Resource;
|
||||
import org.springframework.http.HttpHeaders;
|
||||
@@ -10,6 +11,7 @@ import org.springframework.http.ResponseEntity;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
import org.springframework.web.multipart.MultipartFile;
|
||||
|
||||
import jakarta.servlet.http.HttpServletRequest;
|
||||
import java.io.IOException;
|
||||
import java.nio.file.Files;
|
||||
import java.nio.file.Path;
|
||||
@@ -21,9 +23,17 @@ import java.util.Map;
|
||||
public class FileController {
|
||||
|
||||
private final ImageService imageService;
|
||||
private final LibraryService libraryService;
|
||||
|
||||
public FileController(ImageService imageService) {
|
||||
public FileController(ImageService imageService, LibraryService libraryService) {
|
||||
this.imageService = imageService;
|
||||
this.libraryService = libraryService;
|
||||
}
|
||||
|
||||
private String getCurrentLibraryId() {
|
||||
String libraryId = libraryService.getCurrentLibraryId();
|
||||
System.out.println("FileController - Current Library ID: " + libraryId);
|
||||
return libraryId != null ? libraryId : "default";
|
||||
}
|
||||
|
||||
@PostMapping("/upload/cover")
|
||||
@@ -34,7 +44,11 @@ public class FileController {
|
||||
Map<String, String> response = new HashMap<>();
|
||||
response.put("message", "Cover uploaded successfully");
|
||||
response.put("path", imagePath);
|
||||
response.put("url", "/api/files/images/" + imagePath);
|
||||
String currentLibraryId = getCurrentLibraryId();
|
||||
String imageUrl = "/api/files/images/" + currentLibraryId + "/" + imagePath;
|
||||
response.put("url", imageUrl);
|
||||
|
||||
System.out.println("Upload response - path: " + imagePath + ", url: " + imageUrl);
|
||||
|
||||
return ResponseEntity.ok(response);
|
||||
} catch (IllegalArgumentException e) {
|
||||
@@ -53,7 +67,8 @@ public class FileController {
|
||||
Map<String, String> response = new HashMap<>();
|
||||
response.put("message", "Avatar uploaded successfully");
|
||||
response.put("path", imagePath);
|
||||
response.put("url", "/api/files/images/" + imagePath);
|
||||
String currentLibraryId = getCurrentLibraryId();
|
||||
response.put("url", "/api/files/images/" + currentLibraryId + "/" + imagePath);
|
||||
|
||||
return ResponseEntity.ok(response);
|
||||
} catch (IllegalArgumentException e) {
|
||||
@@ -64,17 +79,18 @@ public class FileController {
|
||||
}
|
||||
}
|
||||
|
||||
@GetMapping("/images/**")
|
||||
public ResponseEntity<Resource> serveImage(@RequestParam String path) {
|
||||
@GetMapping("/images/{libraryId}/**")
|
||||
public ResponseEntity<Resource> serveImage(@PathVariable String libraryId, HttpServletRequest request) {
|
||||
try {
|
||||
// Extract path from the URL
|
||||
String imagePath = path.replace("/api/files/images/", "");
|
||||
// Extract the full request path after /api/files/images/{libraryId}/
|
||||
String requestURI = request.getRequestURI();
|
||||
String imagePath = requestURI.replaceFirst(".*/api/files/images/" + libraryId + "/", "");
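// e.g. "/api/files/images/default/covers/abc.jpg" resolves to the library-relative path "covers/abc.jpg" (illustrative)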
|
||||
|
||||
if (!imageService.imageExists(imagePath)) {
|
||||
if (!imageService.imageExistsInLibrary(imagePath, libraryId)) {
|
||||
return ResponseEntity.notFound().build();
|
||||
}
|
||||
|
||||
Path fullPath = imageService.getImagePath(imagePath);
|
||||
Path fullPath = imageService.getImagePathInLibrary(imagePath, libraryId);
|
||||
Resource resource = new FileSystemResource(fullPath);
|
||||
|
||||
if (!resource.exists()) {
|
||||
|
||||
@@ -0,0 +1,242 @@
|
||||
package com.storycove.controller;
|
||||
|
||||
import com.storycove.dto.LibraryDto;
|
||||
import com.storycove.service.LibraryService;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.http.ResponseEntity;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
|
||||
import java.util.HashMap;
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
|
||||
@RestController
|
||||
@RequestMapping("/api/libraries")
|
||||
public class LibraryController {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(LibraryController.class);
|
||||
|
||||
private final LibraryService libraryService;
|
||||
|
||||
@Autowired
|
||||
public LibraryController(LibraryService libraryService) {
|
||||
this.libraryService = libraryService;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get all available libraries (for settings UI)
|
||||
*/
|
||||
@GetMapping
|
||||
public ResponseEntity<List<LibraryDto>> getAllLibraries() {
|
||||
try {
|
||||
List<LibraryDto> libraries = libraryService.getAllLibraries();
|
||||
return ResponseEntity.ok(libraries);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to get libraries", e);
|
||||
return ResponseEntity.internalServerError().build();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Get current active library info
|
||||
*/
|
||||
@GetMapping("/current")
|
||||
public ResponseEntity<LibraryDto> getCurrentLibrary() {
|
||||
try {
|
||||
var library = libraryService.getCurrentLibrary();
|
||||
if (library == null) {
|
||||
return ResponseEntity.noContent().build();
|
||||
}
|
||||
|
||||
LibraryDto dto = new LibraryDto(
|
||||
library.getId(),
|
||||
library.getName(),
|
||||
library.getDescription(),
|
||||
true, // always active since it's current
|
||||
library.isInitialized()
|
||||
);
|
||||
|
||||
return ResponseEntity.ok(dto);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to get current library", e);
|
||||
return ResponseEntity.internalServerError().build();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Switch to a different library (requires re-authentication)
|
||||
* This endpoint returns a switching status that the frontend can poll
|
||||
*/
|
||||
@PostMapping("/switch")
|
||||
public ResponseEntity<Map<String, Object>> initiateLibrarySwitch(@RequestBody Map<String, String> request) {
|
||||
try {
|
||||
String password = request.get("password");
|
||||
if (password == null || password.trim().isEmpty()) {
|
||||
return ResponseEntity.badRequest().body(Map.of("error", "Password required"));
|
||||
}
|
||||
|
||||
String libraryId = libraryService.authenticateAndGetLibrary(password);
|
||||
if (libraryId == null) {
|
||||
return ResponseEntity.status(401).body(Map.of("error", "Invalid password"));
|
||||
}
|
||||
|
||||
// Check if already on this library
|
||||
if (libraryId.equals(libraryService.getCurrentLibraryId())) {
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"status", "already_active",
|
||||
"message", "Already using this library"
|
||||
));
|
||||
}
|
||||
|
||||
// Initiate switch in background thread
|
||||
new Thread(() -> {
|
||||
try {
|
||||
libraryService.switchToLibrary(libraryId);
|
||||
logger.info("Library switch completed: {}", libraryId);
|
||||
} catch (Exception e) {
|
||||
logger.error("Library switch failed: {}", libraryId, e);
|
||||
}
|
||||
}).start();
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"status", "switching",
|
||||
"targetLibrary", libraryId,
|
||||
"message", "Switching to library, please wait..."
|
||||
));
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to initiate library switch", e);
|
||||
return ResponseEntity.internalServerError().body(Map.of("error", "Server error"));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Check library switch status
|
||||
*/
|
||||
@GetMapping("/switch/status")
|
||||
public ResponseEntity<Map<String, Object>> getLibrarySwitchStatus() {
|
||||
try {
|
||||
var currentLibrary = libraryService.getCurrentLibrary();
|
||||
boolean isReady = currentLibrary != null;
|
||||
|
||||
Map<String, Object> response = new HashMap<>();
|
||||
response.put("ready", isReady);
|
||||
if (isReady) {
|
||||
response.put("currentLibrary", currentLibrary.getId());
|
||||
response.put("currentLibraryName", currentLibrary.getName());
|
||||
} else {
|
||||
response.put("currentLibrary", null);
|
||||
response.put("currentLibraryName", null);
|
||||
}
|
||||
|
||||
return ResponseEntity.ok(response);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to get switch status", e);
|
||||
return ResponseEntity.ok(Map.of("ready", false, "error", "Status check failed"));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Change password for current library
|
||||
*/
|
||||
@PostMapping("/password")
|
||||
public ResponseEntity<Map<String, Object>> changePassword(@RequestBody Map<String, String> request) {
|
||||
try {
|
||||
String currentPassword = request.get("currentPassword");
|
||||
String newPassword = request.get("newPassword");
|
||||
|
||||
if (currentPassword == null || newPassword == null) {
|
||||
return ResponseEntity.badRequest().body(Map.of("error", "Current and new passwords required"));
|
||||
}
|
||||
|
||||
String currentLibraryId = libraryService.getCurrentLibraryId();
|
||||
if (currentLibraryId == null) {
|
||||
return ResponseEntity.badRequest().body(Map.of("error", "No active library"));
|
||||
}
|
||||
|
||||
boolean success = libraryService.changeLibraryPassword(currentLibraryId, currentPassword, newPassword);
|
||||
if (success) {
|
||||
return ResponseEntity.ok(Map.of("success", true, "message", "Password changed successfully"));
|
||||
} else {
|
||||
return ResponseEntity.badRequest().body(Map.of("error", "Current password is incorrect"));
|
||||
}
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to change password", e);
|
||||
return ResponseEntity.internalServerError().body(Map.of("error", "Server error"));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Create a new library
|
||||
*/
|
||||
@PostMapping("/create")
|
||||
public ResponseEntity<Map<String, Object>> createLibrary(@RequestBody Map<String, String> request) {
|
||||
try {
|
||||
String name = request.get("name");
|
||||
String description = request.get("description");
|
||||
String password = request.get("password");
|
||||
|
||||
if (name == null || name.trim().isEmpty() || password == null || password.trim().isEmpty()) {
|
||||
return ResponseEntity.badRequest().body(Map.of("error", "Name and password are required"));
|
||||
}
|
||||
|
||||
var newLibrary = libraryService.createNewLibrary(name.trim(), description, password);
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"library", Map.of(
|
||||
"id", newLibrary.getId(),
|
||||
"name", newLibrary.getName(),
|
||||
"description", newLibrary.getDescription()
|
||||
),
|
||||
"message", "Library created successfully. You can now log in with the new password to access it."
|
||||
));
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to create library", e);
|
||||
return ResponseEntity.internalServerError().body(Map.of("error", "Server error"));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Update library metadata (name and description)
|
||||
*/
|
||||
@PutMapping("/{libraryId}/metadata")
|
||||
public ResponseEntity<Map<String, Object>> updateLibraryMetadata(
|
||||
@PathVariable String libraryId,
|
||||
@RequestBody Map<String, String> updates) {
|
||||
|
||||
try {
|
||||
String newName = updates.get("name");
|
||||
String newDescription = updates.get("description");
|
||||
|
||||
if (newName == null || newName.trim().isEmpty()) {
|
||||
return ResponseEntity.badRequest().body(Map.of("error", "Library name is required"));
|
||||
}
|
||||
|
||||
// Update the library
|
||||
libraryService.updateLibraryMetadata(libraryId, newName, newDescription);
|
||||
|
||||
// Return updated library info
|
||||
LibraryDto updatedLibrary = libraryService.getLibraryById(libraryId);
|
||||
if (updatedLibrary != null) {
|
||||
Map<String, Object> response = new HashMap<>();
|
||||
response.put("success", true);
|
||||
response.put("message", "Library metadata updated successfully");
|
||||
response.put("library", updatedLibrary);
|
||||
return ResponseEntity.ok(response);
|
||||
} else {
|
||||
return ResponseEntity.notFound().build();
|
||||
}
|
||||
|
||||
} catch (IllegalArgumentException e) {
|
||||
return ResponseEntity.badRequest().body(Map.of("error", e.getMessage()));
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to update library metadata for {}: {}", libraryId, e.getMessage(), e);
|
||||
return ResponseEntity.internalServerError().body(Map.of("error", "Failed to update library metadata"));
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -2,7 +2,7 @@ package com.storycove.controller;

import com.storycove.entity.Story;
import com.storycove.service.StoryService;
import com.storycove.service.TypesenseService;
import com.storycove.service.SearchServiceAdapter;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
@@ -14,25 +14,19 @@ import java.util.Map;
@RequestMapping("/api/search")
public class SearchController {

private final TypesenseService typesenseService;
private final SearchServiceAdapter searchServiceAdapter;
private final StoryService storyService;

public SearchController(@Autowired(required = false) TypesenseService typesenseService, StoryService storyService) {
this.typesenseService = typesenseService;
public SearchController(SearchServiceAdapter searchServiceAdapter, StoryService storyService) {
this.searchServiceAdapter = searchServiceAdapter;
this.storyService = storyService;
}

@PostMapping("/reindex")
public ResponseEntity<?> reindexAllStories() {
if (typesenseService == null) {
return ResponseEntity.badRequest().body(Map.of(
"error", "Typesense service is not available"
));
}

try {
List<Story> allStories = storyService.findAll();
typesenseService.reindexAllStories(allStories);
searchServiceAdapter.bulkIndexStories(allStories);

return ResponseEntity.ok(Map.of(
"message", "Successfully reindexed all stories",
@@ -47,17 +41,8 @@ public class SearchController {

@GetMapping("/health")
public ResponseEntity<?> searchHealthCheck() {
if (typesenseService == null) {
return ResponseEntity.ok(Map.of(
"status", "disabled",
"message", "Typesense service is disabled"
));
}

try {
// Try a simple search to test connectivity
typesenseService.searchSuggestions("test", 1);

// Search service is operational if it's injected
return ResponseEntity.ok(Map.of(
"status", "healthy",
"message", "Search service is operational"
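The `SearchServiceAdapter` that replaces the direct `TypesenseService` dependency above is not included in this hunk. A minimal sketch of how such a routing facade could be wired is shown below; the `OpenSearchService` type and the `storycove.search.engine` property name are illustrative assumptions, not code from this branch — only `bulkIndexStories(...)` and `reindexAllStories(...)` appear in the diff itself.

```java
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;
import com.storycove.entity.Story;

// Hypothetical sketch of the routing facade; not part of this diff.
@Service
public class SearchServiceAdapter {

    private final TypesenseService typesenseService;    // may be null when Typesense is disabled
    private final OpenSearchService openSearchService;   // assumed OpenSearch counterpart service
    private final String engine;

    public SearchServiceAdapter(@Autowired(required = false) TypesenseService typesenseService,
                                @Autowired(required = false) OpenSearchService openSearchService,
                                @Value("${storycove.search.engine:opensearch}") String engine) {
        this.typesenseService = typesenseService;
        this.openSearchService = openSearchService;
        this.engine = engine;
    }

    // Route bulk indexing to whichever engine is configured, falling back to Typesense.
    public void bulkIndexStories(List<Story> stories) {
        if ("opensearch".equalsIgnoreCase(engine) && openSearchService != null) {
            openSearchService.bulkIndexStories(stories);
        } else if (typesenseService != null) {
            typesenseService.reindexAllStories(stories);
        }
    }
}
```

The same pattern would apply to `searchStories(...)` and `getTagSuggestions(...)` used by the controllers below, keeping both engines callable during the parallel-run phase.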
@@ -14,6 +14,7 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
@@ -40,7 +41,7 @@ public class StoryController {
private final SeriesService seriesService;
private final HtmlSanitizationService sanitizationService;
private final ImageService imageService;
private final TypesenseService typesenseService;
private final SearchServiceAdapter searchServiceAdapter;
private final CollectionService collectionService;
private final ReadingTimeService readingTimeService;
private final EPUBImportService epubImportService;
@@ -52,7 +53,7 @@ public class StoryController {
HtmlSanitizationService sanitizationService,
ImageService imageService,
CollectionService collectionService,
@Autowired(required = false) TypesenseService typesenseService,
SearchServiceAdapter searchServiceAdapter,
ReadingTimeService readingTimeService,
EPUBImportService epubImportService,
EPUBExportService epubExportService) {
@@ -62,7 +63,7 @@ public class StoryController {
this.sanitizationService = sanitizationService;
this.imageService = imageService;
this.collectionService = collectionService;
this.typesenseService = typesenseService;
this.searchServiceAdapter = searchServiceAdapter;
this.readingTimeService = readingTimeService;
this.epubImportService = epubImportService;
this.epubExportService = epubExportService;
@@ -88,12 +89,34 @@ public class StoryController {
@GetMapping("/random")
public ResponseEntity<StorySummaryDto> getRandomStory(
@RequestParam(required = false) String searchQuery,
@RequestParam(required = false) List<String> tags) {
@RequestParam(required = false) List<String> tags,
@RequestParam(required = false) Long seed,
// Advanced filters
@RequestParam(required = false) Integer minWordCount,
@RequestParam(required = false) Integer maxWordCount,
@RequestParam(required = false) String createdAfter,
@RequestParam(required = false) String createdBefore,
@RequestParam(required = false) String lastReadAfter,
@RequestParam(required = false) String lastReadBefore,
@RequestParam(required = false) Integer minRating,
@RequestParam(required = false) Integer maxRating,
@RequestParam(required = false) Boolean unratedOnly,
@RequestParam(required = false) String readingStatus,
@RequestParam(required = false) Boolean hasReadingProgress,
@RequestParam(required = false) Boolean hasCoverImage,
@RequestParam(required = false) String sourceDomain,
@RequestParam(required = false) String seriesFilter,
@RequestParam(required = false) Integer minTagCount,
@RequestParam(required = false) Boolean popularOnly,
@RequestParam(required = false) Boolean hiddenGemsOnly) {

logger.info("Getting random story with filters - searchQuery: {}, tags: {}",
searchQuery, tags);
logger.info("Getting random story with filters - searchQuery: {}, tags: {}, seed: {}",
searchQuery, tags, seed);

Optional<Story> randomStory = storyService.findRandomStory(searchQuery, tags);
Optional<Story> randomStory = storyService.findRandomStory(searchQuery, tags, seed,
minWordCount, maxWordCount, createdAfter, createdBefore, lastReadAfter, lastReadBefore,
minRating, maxRating, unratedOnly, readingStatus, hasReadingProgress, hasCoverImage,
sourceDomain, seriesFilter, minTagCount, popularOnly, hiddenGemsOnly);

if (randomStory.isPresent()) {
StorySummaryDto storyDto = convertToSummaryDto(randomStory.get());
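The new `seed` parameter above lets the same "random" story be reproduced across requests. One way this could be expressed on the OpenSearch side is a `function_score` query with `random_score`; the sketch below is illustrative only — the index name, tie-breaker field, and client wiring are assumptions, not code from this branch.

```java
import java.io.IOException;
import org.opensearch.action.search.SearchRequest;
import org.opensearch.action.search.SearchResponse;
import org.opensearch.client.RequestOptions;
import org.opensearch.client.RestHighLevelClient;
import org.opensearch.index.query.QueryBuilders;
import org.opensearch.index.query.functionscore.ScoreFunctionBuilders;
import org.opensearch.search.builder.SearchSourceBuilder;

// Hypothetical sketch of seeded random selection against an assumed "stories" index.
public class RandomStoryQuerySketch {

    public SearchResponse pickRandom(RestHighLevelClient client, Long seed) throws IOException {
        // random_score with a fixed seed and field produces a stable ordering,
        // so the same seed returns the same story until the index changes.
        SearchSourceBuilder source = new SearchSourceBuilder()
                .query(QueryBuilders.functionScoreQuery(
                        QueryBuilders.boolQuery(),                 // filter clauses would be added here
                        ScoreFunctionBuilders.randomFunction()
                                .seed(seed != null ? seed : System.nanoTime())
                                .setField("_seq_no")))
                .size(1);
        return client.search(new SearchRequest("stories").source(source), RequestOptions.DEFAULT);
    }
}
```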
@@ -206,15 +229,44 @@ public class StoryController {
|
||||
return ResponseEntity.ok(convertToDto(story));
|
||||
}
|
||||
|
||||
@PostMapping("/{id}/process-content-images")
|
||||
public ResponseEntity<Map<String, Object>> processContentImages(@PathVariable UUID id, @RequestBody ProcessContentImagesRequest request) {
|
||||
logger.info("Processing content images for story {}", id);
|
||||
|
||||
try {
|
||||
// Process the HTML content to download and replace image URLs
|
||||
ImageService.ContentImageProcessingResult result = imageService.processContentImages(request.getHtmlContent(), id);
|
||||
|
||||
// If there are warnings, let the client decide whether to proceed
|
||||
if (result.hasWarnings()) {
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"processedContent", result.getProcessedContent(),
|
||||
"warnings", result.getWarnings(),
|
||||
"downloadedImages", result.getDownloadedImages(),
|
||||
"hasWarnings", true
|
||||
));
|
||||
}
|
||||
|
||||
// Success - no warnings
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"processedContent", result.getProcessedContent(),
|
||||
"downloadedImages", result.getDownloadedImages(),
|
||||
"hasWarnings", false
|
||||
));
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to process content images for story {}", id, e);
|
||||
return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
|
||||
.body(Map.of("error", "Failed to process content images: " + e.getMessage()));
|
||||
}
|
||||
}
|
||||
|
||||
@PostMapping("/reindex")
|
||||
public ResponseEntity<String> manualReindex() {
|
||||
if (typesenseService == null) {
|
||||
return ResponseEntity.ok("Typesense is not enabled, no reindexing performed");
|
||||
}
|
||||
|
||||
try {
|
||||
List<Story> allStories = storyService.findAllWithAssociations();
|
||||
typesenseService.reindexAllStories(allStories);
|
||||
searchServiceAdapter.bulkIndexStories(allStories);
|
||||
return ResponseEntity.ok("Successfully reindexed " + allStories.size() + " stories");
|
||||
} catch (Exception e) {
|
||||
return ResponseEntity.status(500).body("Failed to reindex stories: " + e.getMessage());
|
||||
@@ -225,7 +277,7 @@ public class StoryController {
|
||||
public ResponseEntity<Map<String, Object>> reindexStoriesTypesense() {
|
||||
try {
|
||||
List<Story> allStories = storyService.findAllWithAssociations();
|
||||
typesenseService.reindexAllStories(allStories);
|
||||
searchServiceAdapter.bulkIndexStories(allStories);
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Reindexed " + allStories.size() + " stories",
|
||||
@@ -245,7 +297,7 @@ public class StoryController {
|
||||
try {
|
||||
// This will delete the existing collection and recreate it with correct schema
|
||||
List<Story> allStories = storyService.findAllWithAssociations();
|
||||
typesenseService.reindexAllStories(allStories);
|
||||
searchServiceAdapter.bulkIndexStories(allStories);
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Recreated stories collection and indexed " + allStories.size() + " stories",
|
||||
@@ -270,16 +322,55 @@ public class StoryController {
@RequestParam(required = false) Integer minRating,
@RequestParam(required = false) Integer maxRating,
@RequestParam(required = false) String sortBy,
@RequestParam(required = false) String sortDir) {
@RequestParam(required = false) String sortDir,
@RequestParam(required = false) List<String> facetBy,
// Advanced filters
@RequestParam(required = false) Integer minWordCount,
@RequestParam(required = false) Integer maxWordCount,
@RequestParam(required = false) String createdAfter,
@RequestParam(required = false) String createdBefore,
@RequestParam(required = false) String lastReadAfter,
@RequestParam(required = false) String lastReadBefore,
@RequestParam(required = false) Boolean unratedOnly,
@RequestParam(required = false) String readingStatus,
@RequestParam(required = false) Boolean hasReadingProgress,
@RequestParam(required = false) Boolean hasCoverImage,
@RequestParam(required = false) String sourceDomain,
@RequestParam(required = false) String seriesFilter,
@RequestParam(required = false) Integer minTagCount,
@RequestParam(required = false) Boolean popularOnly,
@RequestParam(required = false) Boolean hiddenGemsOnly) {


if (typesenseService != null) {
SearchResultDto<StorySearchDto> results = typesenseService.searchStories(
query, page, size, authors, tags, minRating, maxRating, sortBy, sortDir);
// Use SearchServiceAdapter to handle routing between search engines
try {
// Convert authors list to single author string (for now, use first author)
String authorFilter = (authors != null && !authors.isEmpty()) ? authors.get(0) : null;

// DEBUG: Log all received parameters
logger.info("CONTROLLER DEBUG - Received parameters:");
logger.info(" readingStatus: '{}'", readingStatus);
logger.info(" seriesFilter: '{}'", seriesFilter);
logger.info(" hasReadingProgress: {}", hasReadingProgress);
logger.info(" hasCoverImage: {}", hasCoverImage);
logger.info(" createdAfter: '{}'", createdAfter);
logger.info(" lastReadAfter: '{}'", lastReadAfter);
logger.info(" unratedOnly: {}", unratedOnly);

SearchResultDto<StorySearchDto> results = searchServiceAdapter.searchStories(
query, tags, authorFilter, seriesFilter, minWordCount, maxWordCount,
minRating != null ? minRating.floatValue() : null,
null, // isRead - now handled by readingStatus advanced filter
null, // isFavorite - now handled by readingStatus advanced filter
sortBy, sortDir, page, size, facetBy,
// Advanced filters
createdAfter, createdBefore, lastReadAfter, lastReadBefore,
unratedOnly, readingStatus, hasReadingProgress, hasCoverImage,
sourceDomain, seriesFilter, minTagCount, popularOnly, hiddenGemsOnly);
return ResponseEntity.ok(results);
} else {
// Fallback to basic search if Typesense is not available
return ResponseEntity.badRequest().body(null);
} catch (Exception e) {
logger.error("Search failed", e);
return ResponseEntity.internalServerError().body(null);
}
}

@@ -288,10 +379,12 @@ public class StoryController {
@RequestParam String query,
@RequestParam(defaultValue = "5") int limit) {

if (typesenseService != null) {
List<String> suggestions = typesenseService.searchSuggestions(query, limit);
// Use SearchServiceAdapter to handle routing between search engines
try {
List<String> suggestions = searchServiceAdapter.getTagSuggestions(query, limit);
return ResponseEntity.ok(suggestions);
} else {
} catch (Exception e) {
logger.error("Failed to get search suggestions", e);
return ResponseEntity.ok(new ArrayList<>());
}
}
@@ -415,14 +508,19 @@ public class StoryController {
|
||||
story.setDescription(updateReq.getDescription());
|
||||
}
|
||||
if (updateReq.getContentHtml() != null) {
|
||||
story.setContentHtml(sanitizationService.sanitize(updateReq.getContentHtml()));
|
||||
logger.info("Content before sanitization (length: {}): {}",
|
||||
updateReq.getContentHtml().length(),
|
||||
updateReq.getContentHtml().substring(0, Math.min(500, updateReq.getContentHtml().length())));
|
||||
String sanitizedContent = sanitizationService.sanitize(updateReq.getContentHtml());
|
||||
logger.info("Content after sanitization (length: {}): {}",
|
||||
sanitizedContent.length(),
|
||||
sanitizedContent.substring(0, Math.min(500, sanitizedContent.length())));
|
||||
story.setContentHtml(sanitizedContent);
|
||||
}
|
||||
if (updateReq.getSourceUrl() != null) {
|
||||
story.setSourceUrl(updateReq.getSourceUrl());
|
||||
}
|
||||
if (updateReq.getVolume() != null) {
|
||||
story.setVolume(updateReq.getVolume());
|
||||
}
|
||||
// Volume will be handled in series logic below
|
||||
// Handle author - either by ID or by name
|
||||
if (updateReq.getAuthorId() != null) {
|
||||
Author author = authorService.findById(updateReq.getAuthorId());
|
||||
@@ -431,13 +529,34 @@ public class StoryController {
|
||||
Author author = findOrCreateAuthor(updateReq.getAuthorName().trim());
|
||||
story.setAuthor(author);
|
||||
}
|
||||
// Handle series - either by ID or by name
|
||||
// Handle series - either by ID, by name, or remove from series
|
||||
if (updateReq.getSeriesId() != null) {
|
||||
Series series = seriesService.findById(updateReq.getSeriesId());
|
||||
story.setSeries(series);
|
||||
} else if (updateReq.getSeriesName() != null && !updateReq.getSeriesName().trim().isEmpty()) {
|
||||
} else if (updateReq.getSeriesName() != null) {
|
||||
logger.info("Processing series update: seriesName='{}', isEmpty={}", updateReq.getSeriesName(), updateReq.getSeriesName().trim().isEmpty());
|
||||
if (updateReq.getSeriesName().trim().isEmpty()) {
|
||||
// Empty series name means remove from series
|
||||
logger.info("Removing story from series");
|
||||
if (story.getSeries() != null) {
|
||||
story.getSeries().removeStory(story);
|
||||
story.setSeries(null);
|
||||
story.setVolume(null);
|
||||
logger.info("Story removed from series");
|
||||
}
|
||||
} else {
|
||||
// Non-empty series name means add to series
|
||||
logger.info("Adding story to series: '{}', volume: {}", updateReq.getSeriesName().trim(), updateReq.getVolume());
|
||||
Series series = seriesService.findOrCreate(updateReq.getSeriesName().trim());
|
||||
story.setSeries(series);
|
||||
// Set volume only if series is being set
|
||||
if (updateReq.getVolume() != null) {
|
||||
story.setVolume(updateReq.getVolume());
|
||||
logger.info("Story added to series: {} with volume: {}", series.getName(), updateReq.getVolume());
|
||||
} else {
|
||||
logger.info("Story added to series: {} with no volume", series.getName());
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Note: Tags are now handled in StoryService.updateWithTagNames()
|
||||
@@ -559,8 +678,11 @@ public class StoryController {
|
||||
TagDto tagDto = new TagDto();
|
||||
tagDto.setId(tag.getId());
|
||||
tagDto.setName(tag.getName());
|
||||
tagDto.setColor(tag.getColor());
|
||||
tagDto.setDescription(tag.getDescription());
|
||||
tagDto.setCreatedAt(tag.getCreatedAt());
|
||||
// storyCount can be set if needed, but it might be expensive to calculate for each tag
|
||||
tagDto.setStoryCount(tag.getStories() != null ? tag.getStories().size() : 0);
|
||||
tagDto.setAliasCount(tag.getAliases() != null ? tag.getAliases().size() : 0);
|
||||
return tagDto;
|
||||
}
|
||||
|
||||
|
||||
@@ -1,9 +1,13 @@
|
||||
package com.storycove.controller;
|
||||
|
||||
import com.storycove.dto.TagDto;
|
||||
import com.storycove.dto.TagAliasDto;
|
||||
import com.storycove.entity.Tag;
|
||||
import com.storycove.entity.TagAlias;
|
||||
import com.storycove.service.TagService;
|
||||
import jakarta.validation.Valid;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.data.domain.Page;
|
||||
import org.springframework.data.domain.PageRequest;
|
||||
import org.springframework.data.domain.Pageable;
|
||||
@@ -21,6 +25,7 @@ import java.util.stream.Collectors;
|
||||
@RequestMapping("/api/tags")
|
||||
public class TagController {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(TagController.class);
|
||||
private final TagService tagService;
|
||||
|
||||
public TagController(TagService tagService) {
|
||||
@@ -54,6 +59,8 @@ public class TagController {
|
||||
public ResponseEntity<TagDto> createTag(@Valid @RequestBody CreateTagRequest request) {
|
||||
Tag tag = new Tag();
|
||||
tag.setName(request.getName());
|
||||
tag.setColor(request.getColor());
|
||||
tag.setDescription(request.getDescription());
|
||||
|
||||
Tag savedTag = tagService.create(tag);
|
||||
return ResponseEntity.status(HttpStatus.CREATED).body(convertToDto(savedTag));
|
||||
@@ -66,6 +73,12 @@ public class TagController {
|
||||
if (request.getName() != null) {
|
||||
existingTag.setName(request.getName());
|
||||
}
|
||||
if (request.getColor() != null) {
|
||||
existingTag.setColor(request.getColor());
|
||||
}
|
||||
if (request.getDescription() != null) {
|
||||
existingTag.setDescription(request.getDescription());
|
||||
}
|
||||
|
||||
Tag updatedTag = tagService.update(id, existingTag);
|
||||
return ResponseEntity.ok(convertToDto(updatedTag));
|
||||
@@ -95,7 +108,7 @@ public class TagController {
|
||||
@RequestParam String query,
|
||||
@RequestParam(defaultValue = "10") int limit) {
|
||||
|
||||
List<Tag> tags = tagService.findByNameStartingWith(query, limit);
|
||||
List<Tag> tags = tagService.findByNameOrAliasStartingWith(query, limit);
|
||||
List<TagDto> tagDtos = tags.stream().map(this::convertToDto).collect(Collectors.toList());
|
||||
|
||||
return ResponseEntity.ok(tagDtos);
|
||||
@@ -142,15 +155,124 @@ public class TagController {
|
||||
return ResponseEntity.ok(tagDtos);
|
||||
}
|
||||
|
||||
// Tag alias endpoints
|
||||
@PostMapping("/{tagId}/aliases")
|
||||
public ResponseEntity<TagAliasDto> addAlias(@PathVariable UUID tagId,
|
||||
@RequestBody Map<String, String> request) {
|
||||
String aliasName = request.get("aliasName");
|
||||
if (aliasName == null || aliasName.trim().isEmpty()) {
|
||||
return ResponseEntity.badRequest().build();
|
||||
}
|
||||
|
||||
try {
|
||||
TagAlias alias = tagService.addAlias(tagId, aliasName.trim());
|
||||
TagAliasDto dto = new TagAliasDto();
|
||||
dto.setId(alias.getId());
|
||||
dto.setAliasName(alias.getAliasName());
|
||||
dto.setCanonicalTagId(alias.getCanonicalTag().getId());
|
||||
dto.setCanonicalTagName(alias.getCanonicalTag().getName());
|
||||
dto.setCreatedFromMerge(alias.getCreatedFromMerge());
|
||||
dto.setCreatedAt(alias.getCreatedAt());
|
||||
|
||||
return ResponseEntity.status(HttpStatus.CREATED).body(dto);
|
||||
} catch (Exception e) {
|
||||
return ResponseEntity.badRequest().build();
|
||||
}
|
||||
}
|
||||
|
||||
@DeleteMapping("/{tagId}/aliases/{aliasId}")
|
||||
public ResponseEntity<?> removeAlias(@PathVariable UUID tagId, @PathVariable UUID aliasId) {
|
||||
try {
|
||||
tagService.removeAlias(tagId, aliasId);
|
||||
return ResponseEntity.ok(Map.of("message", "Alias removed successfully"));
|
||||
} catch (Exception e) {
|
||||
return ResponseEntity.badRequest().body(Map.of("error", e.getMessage()));
|
||||
}
|
||||
}
|
||||
|
||||
@GetMapping("/resolve/{name}")
|
||||
public ResponseEntity<TagDto> resolveTag(@PathVariable String name) {
|
||||
try {
|
||||
Tag resolvedTag = tagService.resolveTagByName(name);
|
||||
if (resolvedTag != null) {
|
||||
return ResponseEntity.ok(convertToDto(resolvedTag));
|
||||
} else {
|
||||
return ResponseEntity.notFound().build();
|
||||
}
|
||||
} catch (Exception e) {
|
||||
return ResponseEntity.notFound().build();
|
||||
}
|
||||
}
|
||||
|
||||
@PostMapping("/merge")
|
||||
public ResponseEntity<?> mergeTags(@Valid @RequestBody MergeTagsRequest request) {
|
||||
try {
|
||||
Tag resultTag = tagService.mergeTags(request.getSourceTagUUIDs(), request.getTargetTagUUID());
|
||||
return ResponseEntity.ok(convertToDto(resultTag));
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to merge tags", e);
|
||||
String errorMessage = e.getMessage() != null ? e.getMessage() : "Unknown error occurred";
|
||||
return ResponseEntity.badRequest().body(Map.of("error", errorMessage));
|
||||
}
|
||||
}
|
||||
|
||||
@PostMapping("/merge/preview")
|
||||
public ResponseEntity<?> previewMerge(@Valid @RequestBody MergeTagsRequest request) {
|
||||
try {
|
||||
MergePreviewResponse preview = tagService.previewMerge(request.getSourceTagUUIDs(), request.getTargetTagUUID());
|
||||
return ResponseEntity.ok(preview);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to preview merge", e);
|
||||
String errorMessage = e.getMessage() != null ? e.getMessage() : "Unknown error occurred";
|
||||
return ResponseEntity.badRequest().body(Map.of("error", errorMessage));
|
||||
}
|
||||
}
|
||||
|
||||
@PostMapping("/suggest")
|
||||
public ResponseEntity<List<TagSuggestion>> suggestTags(@RequestBody TagSuggestionRequest request) {
|
||||
try {
|
||||
List<TagSuggestion> suggestions = tagService.suggestTags(
|
||||
request.getTitle(),
|
||||
request.getContent(),
|
||||
request.getSummary(),
|
||||
request.getLimit() != null ? request.getLimit() : 10
|
||||
);
|
||||
return ResponseEntity.ok(suggestions);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to suggest tags", e);
|
||||
return ResponseEntity.ok(List.of()); // Return empty list on error
|
||||
}
|
||||
}
|
||||
|
||||
private TagDto convertToDto(Tag tag) {
|
||||
TagDto dto = new TagDto();
|
||||
dto.setId(tag.getId());
|
||||
dto.setName(tag.getName());
|
||||
dto.setColor(tag.getColor());
|
||||
dto.setDescription(tag.getDescription());
|
||||
dto.setStoryCount(tag.getStories() != null ? tag.getStories().size() : 0);
|
||||
dto.setCollectionCount(tag.getCollections() != null ? tag.getCollections().size() : 0);
|
||||
dto.setAliasCount(tag.getAliases() != null ? tag.getAliases().size() : 0);
|
||||
dto.setCreatedAt(tag.getCreatedAt());
|
||||
// updatedAt field not present in Tag entity per spec
|
||||
|
||||
// Convert aliases to DTOs for full context
|
||||
if (tag.getAliases() != null && !tag.getAliases().isEmpty()) {
|
||||
List<TagAliasDto> aliaseDtos = tag.getAliases().stream()
|
||||
.map(alias -> {
|
||||
TagAliasDto aliasDto = new TagAliasDto();
|
||||
aliasDto.setId(alias.getId());
|
||||
aliasDto.setAliasName(alias.getAliasName());
|
||||
aliasDto.setCanonicalTagId(alias.getCanonicalTag().getId());
|
||||
aliasDto.setCanonicalTagName(alias.getCanonicalTag().getName());
|
||||
aliasDto.setCreatedFromMerge(alias.getCreatedFromMerge());
|
||||
aliasDto.setCreatedAt(alias.getCreatedAt());
|
||||
return aliasDto;
|
||||
})
|
||||
.collect(Collectors.toList());
|
||||
dto.setAliases(aliaseDtos);
|
||||
}
|
||||
|
||||
return dto;
|
||||
}
|
||||
|
||||
@@ -168,15 +290,112 @@ public class TagController {
|
||||
// Request DTOs
|
||||
public static class CreateTagRequest {
|
||||
private String name;
|
||||
private String color;
|
||||
private String description;
|
||||
|
||||
public String getName() { return name; }
|
||||
public void setName(String name) { this.name = name; }
|
||||
|
||||
public String getColor() { return color; }
|
||||
public void setColor(String color) { this.color = color; }
|
||||
|
||||
public String getDescription() { return description; }
|
||||
public void setDescription(String description) { this.description = description; }
|
||||
}
|
||||
|
||||
public static class UpdateTagRequest {
|
||||
private String name;
|
||||
private String color;
|
||||
private String description;
|
||||
|
||||
public String getName() { return name; }
|
||||
public void setName(String name) { this.name = name; }
|
||||
|
||||
public String getColor() { return color; }
|
||||
public void setColor(String color) { this.color = color; }
|
||||
|
||||
public String getDescription() { return description; }
|
||||
public void setDescription(String description) { this.description = description; }
|
||||
}
|
||||
|
||||
public static class MergeTagsRequest {
|
||||
private List<String> sourceTagIds;
|
||||
private String targetTagId;
|
||||
|
||||
public List<String> getSourceTagIds() { return sourceTagIds; }
|
||||
public void setSourceTagIds(List<String> sourceTagIds) { this.sourceTagIds = sourceTagIds; }
|
||||
|
||||
public String getTargetTagId() { return targetTagId; }
|
||||
public void setTargetTagId(String targetTagId) { this.targetTagId = targetTagId; }
|
||||
|
||||
// Helper methods to convert to UUID
|
||||
public List<UUID> getSourceTagUUIDs() {
|
||||
return sourceTagIds != null ? sourceTagIds.stream().map(UUID::fromString).toList() : null;
|
||||
}
|
||||
|
||||
public UUID getTargetTagUUID() {
|
||||
return targetTagId != null ? UUID.fromString(targetTagId) : null;
|
||||
}
|
||||
}
|
||||
|
||||
public static class MergePreviewResponse {
|
||||
private String targetTagName;
|
||||
private int targetStoryCount;
|
||||
private int totalResultStoryCount;
|
||||
private List<String> aliasesToCreate;
|
||||
|
||||
public String getTargetTagName() { return targetTagName; }
|
||||
public void setTargetTagName(String targetTagName) { this.targetTagName = targetTagName; }
|
||||
|
||||
public int getTargetStoryCount() { return targetStoryCount; }
|
||||
public void setTargetStoryCount(int targetStoryCount) { this.targetStoryCount = targetStoryCount; }
|
||||
|
||||
public int getTotalResultStoryCount() { return totalResultStoryCount; }
|
||||
public void setTotalResultStoryCount(int totalResultStoryCount) { this.totalResultStoryCount = totalResultStoryCount; }
|
||||
|
||||
public List<String> getAliasesToCreate() { return aliasesToCreate; }
|
||||
public void setAliasesToCreate(List<String> aliasesToCreate) { this.aliasesToCreate = aliasesToCreate; }
|
||||
}
|
||||
|
||||
public static class TagSuggestionRequest {
|
||||
private String title;
|
||||
private String content;
|
||||
private String summary;
|
||||
private Integer limit;
|
||||
|
||||
public String getTitle() { return title; }
|
||||
public void setTitle(String title) { this.title = title; }
|
||||
|
||||
public String getContent() { return content; }
|
||||
public void setContent(String content) { this.content = content; }
|
||||
|
||||
public String getSummary() { return summary; }
|
||||
public void setSummary(String summary) { this.summary = summary; }
|
||||
|
||||
public Integer getLimit() { return limit; }
|
||||
public void setLimit(Integer limit) { this.limit = limit; }
|
||||
}
|
||||
|
||||
public static class TagSuggestion {
|
||||
private String tagName;
|
||||
private double confidence;
|
||||
private String reason;
|
||||
|
||||
public TagSuggestion() {}
|
||||
|
||||
public TagSuggestion(String tagName, double confidence, String reason) {
|
||||
this.tagName = tagName;
|
||||
this.confidence = confidence;
|
||||
this.reason = reason;
|
||||
}
|
||||
|
||||
public String getTagName() { return tagName; }
|
||||
public void setTagName(String tagName) { this.tagName = tagName; }
|
||||
|
||||
public double getConfidence() { return confidence; }
|
||||
public void setConfidence(double confidence) { this.confidence = confidence; }
|
||||
|
||||
public String getReason() { return reason; }
|
||||
public void setReason(String reason) { this.reason = reason; }
|
||||
}
|
||||
}
|
||||
backend/src/main/java/com/storycove/dto/LibraryDto.java (new file, 61 lines)
@@ -0,0 +1,61 @@
|
||||
package com.storycove.dto;
|
||||
|
||||
public class LibraryDto {
|
||||
private String id;
|
||||
private String name;
|
||||
private String description;
|
||||
private boolean isActive;
|
||||
private boolean isInitialized;
|
||||
|
||||
// Constructors
|
||||
public LibraryDto() {}
|
||||
|
||||
public LibraryDto(String id, String name, String description, boolean isActive, boolean isInitialized) {
|
||||
this.id = id;
|
||||
this.name = name;
|
||||
this.description = description;
|
||||
this.isActive = isActive;
|
||||
this.isInitialized = isInitialized;
|
||||
}
|
||||
|
||||
// Getters and Setters
|
||||
public String getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public void setId(String id) {
|
||||
this.id = id;
|
||||
}
|
||||
|
||||
public String getName() {
|
||||
return name;
|
||||
}
|
||||
|
||||
public void setName(String name) {
|
||||
this.name = name;
|
||||
}
|
||||
|
||||
public String getDescription() {
|
||||
return description;
|
||||
}
|
||||
|
||||
public void setDescription(String description) {
|
||||
this.description = description;
|
||||
}
|
||||
|
||||
public boolean isActive() {
|
||||
return isActive;
|
||||
}
|
||||
|
||||
public void setActive(boolean active) {
|
||||
isActive = active;
|
||||
}
|
||||
|
||||
public boolean isInitialized() {
|
||||
return isInitialized;
|
||||
}
|
||||
|
||||
public void setInitialized(boolean initialized) {
|
||||
isInitialized = initialized;
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,23 @@
|
||||
package com.storycove.dto;
|
||||
|
||||
import jakarta.validation.constraints.NotBlank;
|
||||
|
||||
public class ProcessContentImagesRequest {
|
||||
|
||||
@NotBlank(message = "HTML content is required")
|
||||
private String htmlContent;
|
||||
|
||||
public ProcessContentImagesRequest() {}
|
||||
|
||||
public ProcessContentImagesRequest(String htmlContent) {
|
||||
this.htmlContent = htmlContent;
|
||||
}
|
||||
|
||||
public String getHtmlContent() {
|
||||
return htmlContent;
|
||||
}
|
||||
|
||||
public void setHtmlContent(String htmlContent) {
|
||||
this.htmlContent = htmlContent;
|
||||
}
|
||||
}
|
||||
@@ -9,7 +9,6 @@ public class StorySearchDto {
|
||||
private UUID id;
|
||||
private String title;
|
||||
private String description;
|
||||
private String contentPlain;
|
||||
private String sourceUrl;
|
||||
private String coverPath;
|
||||
private Integer wordCount;
|
||||
@@ -18,6 +17,7 @@ public class StorySearchDto {
|
||||
|
||||
// Reading status
|
||||
private Boolean isRead;
|
||||
private Integer readingPosition;
|
||||
private LocalDateTime lastReadAt;
|
||||
|
||||
// Author info
|
||||
@@ -34,6 +34,9 @@ public class StorySearchDto {
|
||||
private LocalDateTime createdAt;
|
||||
private LocalDateTime updatedAt;
|
||||
|
||||
// Alias for createdAt to match frontend expectations
|
||||
private LocalDateTime dateAdded;
|
||||
|
||||
// Search-specific fields
|
||||
private double searchScore;
|
||||
private List<String> highlights;
|
||||
@@ -65,13 +68,6 @@ public class StorySearchDto {
|
||||
this.description = description;
|
||||
}
|
||||
|
||||
public String getContentPlain() {
|
||||
return contentPlain;
|
||||
}
|
||||
|
||||
public void setContentPlain(String contentPlain) {
|
||||
this.contentPlain = contentPlain;
|
||||
}
|
||||
|
||||
public String getSourceUrl() {
|
||||
return sourceUrl;
|
||||
@@ -129,6 +125,14 @@ public class StorySearchDto {
|
||||
this.lastReadAt = lastReadAt;
|
||||
}
|
||||
|
||||
public Integer getReadingPosition() {
|
||||
return readingPosition;
|
||||
}
|
||||
|
||||
public void setReadingPosition(Integer readingPosition) {
|
||||
this.readingPosition = readingPosition;
|
||||
}
|
||||
|
||||
public UUID getAuthorId() {
|
||||
return authorId;
|
||||
}
|
||||
@@ -185,6 +189,14 @@ public class StorySearchDto {
|
||||
this.updatedAt = updatedAt;
|
||||
}
|
||||
|
||||
public LocalDateTime getDateAdded() {
|
||||
return dateAdded;
|
||||
}
|
||||
|
||||
public void setDateAdded(LocalDateTime dateAdded) {
|
||||
this.dateAdded = dateAdded;
|
||||
}
|
||||
|
||||
public double getSearchScore() {
|
||||
return searchScore;
|
||||
}
|
||||
|
||||
backend/src/main/java/com/storycove/dto/TagAliasDto.java (new file, 77 lines)
@@ -0,0 +1,77 @@
|
||||
package com.storycove.dto;
|
||||
|
||||
import jakarta.validation.constraints.NotBlank;
|
||||
import jakarta.validation.constraints.Size;
|
||||
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.UUID;
|
||||
|
||||
public class TagAliasDto {
|
||||
|
||||
private UUID id;
|
||||
|
||||
@NotBlank(message = "Alias name is required")
|
||||
@Size(max = 100, message = "Alias name must not exceed 100 characters")
|
||||
private String aliasName;
|
||||
|
||||
private UUID canonicalTagId;
|
||||
private String canonicalTagName; // For convenience in frontend
|
||||
private Boolean createdFromMerge;
|
||||
private LocalDateTime createdAt;
|
||||
|
||||
public TagAliasDto() {}
|
||||
|
||||
public TagAliasDto(String aliasName, UUID canonicalTagId) {
|
||||
this.aliasName = aliasName;
|
||||
this.canonicalTagId = canonicalTagId;
|
||||
}
|
||||
|
||||
// Getters and Setters
|
||||
public UUID getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public void setId(UUID id) {
|
||||
this.id = id;
|
||||
}
|
||||
|
||||
public String getAliasName() {
|
||||
return aliasName;
|
||||
}
|
||||
|
||||
public void setAliasName(String aliasName) {
|
||||
this.aliasName = aliasName;
|
||||
}
|
||||
|
||||
public UUID getCanonicalTagId() {
|
||||
return canonicalTagId;
|
||||
}
|
||||
|
||||
public void setCanonicalTagId(UUID canonicalTagId) {
|
||||
this.canonicalTagId = canonicalTagId;
|
||||
}
|
||||
|
||||
public String getCanonicalTagName() {
|
||||
return canonicalTagName;
|
||||
}
|
||||
|
||||
public void setCanonicalTagName(String canonicalTagName) {
|
||||
this.canonicalTagName = canonicalTagName;
|
||||
}
|
||||
|
||||
public Boolean getCreatedFromMerge() {
|
||||
return createdFromMerge;
|
||||
}
|
||||
|
||||
public void setCreatedFromMerge(Boolean createdFromMerge) {
|
||||
this.createdFromMerge = createdFromMerge;
|
||||
}
|
||||
|
||||
public LocalDateTime getCreatedAt() {
|
||||
return createdAt;
|
||||
}
|
||||
|
||||
public void setCreatedAt(LocalDateTime createdAt) {
|
||||
this.createdAt = createdAt;
|
||||
}
|
||||
}
|
||||
@@ -4,6 +4,7 @@ import jakarta.validation.constraints.NotBlank;
|
||||
import jakarta.validation.constraints.Size;
|
||||
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.List;
|
||||
import java.util.UUID;
|
||||
|
||||
public class TagDto {
|
||||
@@ -14,8 +15,16 @@ public class TagDto {
|
||||
@Size(max = 100, message = "Tag name must not exceed 100 characters")
|
||||
private String name;
|
||||
|
||||
@Size(max = 7, message = "Color must be a valid hex color code")
|
||||
private String color;
|
||||
|
||||
@Size(max = 500, message = "Description must not exceed 500 characters")
|
||||
private String description;
|
||||
|
||||
private Integer storyCount;
|
||||
private Integer collectionCount;
|
||||
private Integer aliasCount;
|
||||
private List<TagAliasDto> aliases;
|
||||
private LocalDateTime createdAt;
|
||||
private LocalDateTime updatedAt;
|
||||
|
||||
@@ -42,6 +51,22 @@ public class TagDto {
|
||||
this.name = name;
|
||||
}
|
||||
|
||||
public String getColor() {
|
||||
return color;
|
||||
}
|
||||
|
||||
public void setColor(String color) {
|
||||
this.color = color;
|
||||
}
|
||||
|
||||
public String getDescription() {
|
||||
return description;
|
||||
}
|
||||
|
||||
public void setDescription(String description) {
|
||||
this.description = description;
|
||||
}
|
||||
|
||||
public Integer getStoryCount() {
|
||||
return storyCount;
|
||||
}
|
||||
@@ -58,6 +83,22 @@ public class TagDto {
|
||||
this.collectionCount = collectionCount;
|
||||
}
|
||||
|
||||
public Integer getAliasCount() {
|
||||
return aliasCount;
|
||||
}
|
||||
|
||||
public void setAliasCount(Integer aliasCount) {
|
||||
this.aliasCount = aliasCount;
|
||||
}
|
||||
|
||||
public List<TagAliasDto> getAliases() {
|
||||
return aliases;
|
||||
}
|
||||
|
||||
public void setAliases(List<TagAliasDto> aliases) {
|
||||
this.aliases = aliases;
|
||||
}
|
||||
|
||||
public LocalDateTime getCreatedAt() {
|
||||
return createdAt;
|
||||
}
|
||||
|
||||
backend/src/main/java/com/storycove/entity/Library.java (new file, 93 lines)
@@ -0,0 +1,93 @@
|
||||
package com.storycove.entity;
|
||||
|
||||
public class Library {
|
||||
private String id;
|
||||
private String name;
|
||||
private String description;
|
||||
private String passwordHash;
|
||||
private String dbName;
|
||||
private String typesenseCollection;
|
||||
private String imagePath;
|
||||
private boolean initialized;
|
||||
|
||||
// Constructors
|
||||
public Library() {}
|
||||
|
||||
public Library(String id, String name, String description, String passwordHash, String dbName) {
|
||||
this.id = id;
|
||||
this.name = name;
|
||||
this.description = description;
|
||||
this.passwordHash = passwordHash;
|
||||
this.dbName = dbName;
|
||||
this.typesenseCollection = "stories_" + id;
|
||||
this.imagePath = "/images/" + id;
|
||||
this.initialized = false;
|
||||
}
|
||||
|
||||
// Getters and Setters
|
||||
public String getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public void setId(String id) {
|
||||
this.id = id;
|
||||
this.typesenseCollection = "stories_" + id;
|
||||
this.imagePath = "/images/" + id;
|
||||
}
|
||||
|
||||
public String getName() {
|
||||
return name;
|
||||
}
|
||||
|
||||
public void setName(String name) {
|
||||
this.name = name;
|
||||
}
|
||||
|
||||
public String getDescription() {
|
||||
return description;
|
||||
}
|
||||
|
||||
public void setDescription(String description) {
|
||||
this.description = description;
|
||||
}
|
||||
|
||||
public String getPasswordHash() {
|
||||
return passwordHash;
|
||||
}
|
||||
|
||||
public void setPasswordHash(String passwordHash) {
|
||||
this.passwordHash = passwordHash;
|
||||
}
|
||||
|
||||
public String getDbName() {
|
||||
return dbName;
|
||||
}
|
||||
|
||||
public void setDbName(String dbName) {
|
||||
this.dbName = dbName;
|
||||
}
|
||||
|
||||
public String getTypesenseCollection() {
|
||||
return typesenseCollection;
|
||||
}
|
||||
|
||||
public void setTypesenseCollection(String typesenseCollection) {
|
||||
this.typesenseCollection = typesenseCollection;
|
||||
}
|
||||
|
||||
public String getImagePath() {
|
||||
return imagePath;
|
||||
}
|
||||
|
||||
public void setImagePath(String imagePath) {
|
||||
this.imagePath = imagePath;
|
||||
}
|
||||
|
||||
public boolean isInitialized() {
|
||||
return initialized;
|
||||
}
|
||||
|
||||
public void setInitialized(boolean initialized) {
|
||||
this.initialized = initialized;
|
||||
}
|
||||
}
|
||||
@@ -5,6 +5,7 @@ import jakarta.validation.constraints.NotBlank;
|
||||
import jakarta.validation.constraints.Size;
|
||||
import org.hibernate.annotations.CreationTimestamp;
|
||||
import com.fasterxml.jackson.annotation.JsonBackReference;
|
||||
import com.fasterxml.jackson.annotation.JsonManagedReference;
|
||||
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.HashSet;
|
||||
@@ -24,6 +25,14 @@ public class Tag {
|
||||
@Column(nullable = false, unique = true)
|
||||
private String name;
|
||||
|
||||
@Size(max = 7, message = "Color must be a valid hex color code")
|
||||
@Column(length = 7)
|
||||
private String color; // hex color like #3B82F6
|
||||
|
||||
@Size(max = 500, message = "Description must not exceed 500 characters")
|
||||
@Column(length = 500)
|
||||
private String description;
|
||||
|
||||
|
||||
@ManyToMany(mappedBy = "tags")
|
||||
@JsonBackReference("story-tags")
|
||||
@@ -33,6 +42,10 @@ public class Tag {
|
||||
@JsonBackReference("collection-tags")
|
||||
private Set<Collection> collections = new HashSet<>();
|
||||
|
||||
@OneToMany(mappedBy = "canonicalTag", cascade = CascadeType.ALL, orphanRemoval = true)
|
||||
@JsonManagedReference("tag-aliases")
|
||||
private Set<TagAlias> aliases = new HashSet<>();
|
||||
|
||||
@CreationTimestamp
|
||||
@Column(name = "created_at", nullable = false, updatable = false)
|
||||
private LocalDateTime createdAt;
|
||||
@@ -43,6 +56,12 @@ public class Tag {
|
||||
this.name = name;
|
||||
}
|
||||
|
||||
public Tag(String name, String color, String description) {
|
||||
this.name = name;
|
||||
this.color = color;
|
||||
this.description = description;
|
||||
}
|
||||
|
||||
|
||||
|
||||
// Getters and Setters
|
||||
@@ -62,6 +81,22 @@ public class Tag {
|
||||
this.name = name;
|
||||
}
|
||||
|
||||
public String getColor() {
|
||||
return color;
|
||||
}
|
||||
|
||||
public void setColor(String color) {
|
||||
this.color = color;
|
||||
}
|
||||
|
||||
public String getDescription() {
|
||||
return description;
|
||||
}
|
||||
|
||||
public void setDescription(String description) {
|
||||
this.description = description;
|
||||
}
|
||||
|
||||
|
||||
public Set<Story> getStories() {
|
||||
return stories;
|
||||
@@ -79,6 +114,14 @@ public class Tag {
|
||||
this.collections = collections;
|
||||
}
|
||||
|
||||
public Set<TagAlias> getAliases() {
|
||||
return aliases;
|
||||
}
|
||||
|
||||
public void setAliases(Set<TagAlias> aliases) {
|
||||
this.aliases = aliases;
|
||||
}
|
||||
|
||||
public LocalDateTime getCreatedAt() {
|
||||
return createdAt;
|
||||
}
|
||||
|
||||
backend/src/main/java/com/storycove/entity/TagAlias.java (new file, 113 lines)
@@ -0,0 +1,113 @@
|
||||
package com.storycove.entity;
|
||||
|
||||
import jakarta.persistence.*;
|
||||
import jakarta.validation.constraints.NotBlank;
|
||||
import jakarta.validation.constraints.Size;
|
||||
import org.hibernate.annotations.CreationTimestamp;
|
||||
import com.fasterxml.jackson.annotation.JsonManagedReference;
|
||||
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.UUID;
|
||||
|
||||
@Entity
|
||||
@Table(name = "tag_aliases")
|
||||
public class TagAlias {
|
||||
|
||||
@Id
|
||||
@GeneratedValue(strategy = GenerationType.UUID)
|
||||
private UUID id;
|
||||
|
||||
@NotBlank(message = "Alias name is required")
|
||||
@Size(max = 100, message = "Alias name must not exceed 100 characters")
|
||||
@Column(name = "alias_name", nullable = false, unique = true)
|
||||
private String aliasName;
|
||||
|
||||
@ManyToOne(fetch = FetchType.LAZY)
|
||||
@JoinColumn(name = "canonical_tag_id", nullable = false)
|
||||
@JsonManagedReference("tag-aliases")
|
||||
private Tag canonicalTag;
|
||||
|
||||
@Column(name = "created_from_merge", nullable = false)
|
||||
private Boolean createdFromMerge = false;
|
||||
|
||||
@CreationTimestamp
|
||||
@Column(name = "created_at", nullable = false, updatable = false)
|
||||
private LocalDateTime createdAt;
|
||||
|
||||
public TagAlias() {}
|
||||
|
||||
public TagAlias(String aliasName, Tag canonicalTag) {
|
||||
this.aliasName = aliasName;
|
||||
this.canonicalTag = canonicalTag;
|
||||
}
|
||||
|
||||
public TagAlias(String aliasName, Tag canonicalTag, Boolean createdFromMerge) {
|
||||
this.aliasName = aliasName;
|
||||
this.canonicalTag = canonicalTag;
|
||||
this.createdFromMerge = createdFromMerge;
|
||||
}
|
||||
|
||||
// Getters and Setters
|
||||
public UUID getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public void setId(UUID id) {
|
||||
this.id = id;
|
||||
}
|
||||
|
||||
public String getAliasName() {
|
||||
return aliasName;
|
||||
}
|
||||
|
||||
public void setAliasName(String aliasName) {
|
||||
this.aliasName = aliasName;
|
||||
}
|
||||
|
||||
public Tag getCanonicalTag() {
|
||||
return canonicalTag;
|
||||
}
|
||||
|
||||
public void setCanonicalTag(Tag canonicalTag) {
|
||||
this.canonicalTag = canonicalTag;
|
||||
}
|
||||
|
||||
public Boolean getCreatedFromMerge() {
|
||||
return createdFromMerge;
|
||||
}
|
||||
|
||||
public void setCreatedFromMerge(Boolean createdFromMerge) {
|
||||
this.createdFromMerge = createdFromMerge;
|
||||
}
|
||||
|
||||
public LocalDateTime getCreatedAt() {
|
||||
return createdAt;
|
||||
}
|
||||
|
||||
public void setCreatedAt(LocalDateTime createdAt) {
|
||||
this.createdAt = createdAt;
|
||||
}
|
||||
|
||||
@Override
|
||||
public boolean equals(Object o) {
|
||||
if (this == o) return true;
|
||||
if (!(o instanceof TagAlias)) return false;
|
||||
TagAlias tagAlias = (TagAlias) o;
|
||||
return id != null && id.equals(tagAlias.id);
|
||||
}
|
||||
|
||||
@Override
|
||||
public int hashCode() {
|
||||
return getClass().hashCode();
|
||||
}
|
||||
|
||||
@Override
|
||||
public String toString() {
|
||||
return "TagAlias{" +
|
||||
"id=" + id +
|
||||
", aliasName='" + aliasName + '\'' +
|
||||
", canonicalTag=" + (canonicalTag != null ? canonicalTag.getName() : null) +
|
||||
", createdFromMerge=" + createdFromMerge +
|
||||
'}';
|
||||
}
|
||||
}
|
||||
@@ -4,7 +4,6 @@ import com.storycove.entity.Author;
|
||||
import org.springframework.data.domain.Page;
|
||||
import org.springframework.data.domain.Pageable;
|
||||
import org.springframework.data.jpa.repository.JpaRepository;
|
||||
import org.springframework.data.jpa.repository.Modifying;
|
||||
import org.springframework.data.jpa.repository.Query;
|
||||
import org.springframework.data.repository.query.Param;
|
||||
import org.springframework.stereotype.Repository;
|
||||
|
||||
@@ -2,7 +2,6 @@ package com.storycove.repository;
|
||||
|
||||
import com.storycove.entity.Collection;
|
||||
import org.springframework.data.jpa.repository.JpaRepository;
|
||||
import org.springframework.data.jpa.repository.Modifying;
|
||||
import org.springframework.data.jpa.repository.Query;
|
||||
import org.springframework.data.repository.query.Param;
|
||||
import org.springframework.stereotype.Repository;
|
||||
|
||||
@@ -7,7 +7,6 @@ import com.storycove.entity.Tag;
|
||||
import org.springframework.data.domain.Page;
|
||||
import org.springframework.data.domain.Pageable;
|
||||
import org.springframework.data.jpa.repository.JpaRepository;
|
||||
import org.springframework.data.jpa.repository.Modifying;
|
||||
import org.springframework.data.jpa.repository.Query;
|
||||
import org.springframework.data.repository.query.Param;
|
||||
import org.springframework.stereotype.Repository;
|
||||
|
||||
@@ -0,0 +1,60 @@
|
||||
package com.storycove.repository;
|
||||
|
||||
import com.storycove.entity.TagAlias;
|
||||
import com.storycove.entity.Tag;
|
||||
import org.springframework.data.jpa.repository.JpaRepository;
|
||||
import org.springframework.data.jpa.repository.Query;
|
||||
import org.springframework.data.repository.query.Param;
|
||||
import org.springframework.stereotype.Repository;
|
||||
|
||||
import java.util.List;
|
||||
import java.util.Optional;
|
||||
import java.util.UUID;
|
||||
|
||||
@Repository
|
||||
public interface TagAliasRepository extends JpaRepository<TagAlias, UUID> {
|
||||
|
||||
/**
|
||||
* Find alias by exact alias name (case-insensitive)
|
||||
*/
|
||||
@Query("SELECT ta FROM TagAlias ta WHERE LOWER(ta.aliasName) = LOWER(:aliasName)")
|
||||
Optional<TagAlias> findByAliasNameIgnoreCase(@Param("aliasName") String aliasName);
|
||||
|
||||
/**
|
||||
* Find all aliases for a specific canonical tag
|
||||
*/
|
||||
List<TagAlias> findByCanonicalTag(Tag canonicalTag);
|
||||
|
||||
/**
|
||||
* Find all aliases for a specific canonical tag ID
|
||||
*/
|
||||
@Query("SELECT ta FROM TagAlias ta WHERE ta.canonicalTag.id = :tagId")
|
||||
List<TagAlias> findByCanonicalTagId(@Param("tagId") UUID tagId);
|
||||
|
||||
/**
|
||||
* Find aliases created from merge operations
|
||||
*/
|
||||
List<TagAlias> findByCreatedFromMergeTrue();
|
||||
|
||||
/**
|
||||
* Check if an alias name already exists
|
||||
*/
|
||||
boolean existsByAliasNameIgnoreCase(String aliasName);
|
||||
|
||||
/**
|
||||
* Delete all aliases for a specific tag
|
||||
*/
|
||||
void deleteByCanonicalTag(Tag canonicalTag);
|
||||
|
||||
/**
|
||||
* Count aliases for a specific tag
|
||||
*/
|
||||
@Query("SELECT COUNT(ta) FROM TagAlias ta WHERE ta.canonicalTag.id = :tagId")
|
||||
long countByCanonicalTagId(@Param("tagId") UUID tagId);
|
||||
|
||||
/**
|
||||
* Find aliases that start with the given prefix (case-insensitive)
|
||||
*/
|
||||
@Query("SELECT ta FROM TagAlias ta WHERE LOWER(ta.aliasName) LIKE LOWER(CONCAT(:prefix, '%'))")
|
||||
List<TagAlias> findByAliasNameStartingWithIgnoreCase(@Param("prefix") String prefix);
|
||||
}
|
||||
@@ -17,8 +17,12 @@ public interface TagRepository extends JpaRepository<Tag, UUID> {
|
||||
|
||||
Optional<Tag> findByName(String name);
|
||||
|
||||
Optional<Tag> findByNameIgnoreCase(String name);
|
||||
|
||||
boolean existsByName(String name);
|
||||
|
||||
boolean existsByNameIgnoreCase(String name);
|
||||
|
||||
List<Tag> findByNameContainingIgnoreCase(String name);
|
||||
|
||||
Page<Tag> findByNameContainingIgnoreCase(String name, Pageable pageable);
|
||||
|
||||
@@ -1,84 +0,0 @@
package com.storycove.scheduled;

import com.storycove.entity.Story;
import com.storycove.service.StoryService;
import com.storycove.service.TypesenseService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

import java.time.LocalDateTime;
import java.util.List;

/**
 * Scheduled task to periodically reindex all stories in Typesense
 * to ensure search index stays synchronized with database changes.
 */
@Component
@ConditionalOnProperty(name = "storycove.typesense.enabled", havingValue = "true", matchIfMissing = true)
public class TypesenseIndexScheduler {

    private static final Logger logger = LoggerFactory.getLogger(TypesenseIndexScheduler.class);

    private final StoryService storyService;
    private final TypesenseService typesenseService;

    @Autowired
    public TypesenseIndexScheduler(StoryService storyService,
                                   @Autowired(required = false) TypesenseService typesenseService) {
        this.storyService = storyService;
        this.typesenseService = typesenseService;
    }

    /**
     * Scheduled task that runs periodically to reindex all stories in Typesense.
     * This ensures the search index stays synchronized with any database changes
     * that might have occurred outside of the normal story update flow.
     *
     * Interval is configurable via storycove.typesense.reindex-interval property (default: 1 hour).
     */
    @Scheduled(fixedRateString = "${storycove.typesense.reindex-interval:3600000}")
    public void reindexAllStories() {
        if (typesenseService == null) {
            logger.debug("TypesenseService is not available, skipping scheduled reindexing");
            return;
        }

        logger.info("Starting scheduled Typesense reindexing at {}", LocalDateTime.now());

        try {
            long startTime = System.currentTimeMillis();

            // Get all stories from database with eagerly loaded associations
            List<Story> allStories = storyService.findAllWithAssociations();

            if (allStories.isEmpty()) {
                logger.info("No stories found in database, skipping reindexing");
                return;
            }

            // Perform full reindex
            typesenseService.reindexAllStories(allStories);

            long endTime = System.currentTimeMillis();
            long duration = endTime - startTime;

            logger.info("Completed scheduled Typesense reindexing of {} stories in {}ms",
                    allStories.size(), duration);

        } catch (Exception e) {
            logger.error("Failed to complete scheduled Typesense reindexing", e);
        }
    }

    /**
     * Manual trigger for reindexing - can be called from other services or endpoints if needed
     */
    public void triggerManualReindex() {
        logger.info("Manual Typesense reindexing triggered");
        reindexAllStories();
    }
}
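The Typesense-specific story scheduler above is deleted outright, and no direct replacement appears in this section. Purely as a sketch of how the same periodic job could be expressed against the new adapter, the class below is hypothetical; the `storycove.search.reindex-interval` property name is an assumption modelled on the renamed properties further down, and `performCompleteReindex()` is the adapter method this diff already calls from the restore path.

```java
// Hypothetical replacement sketch - not part of this changeset.
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
@ConditionalOnProperty(name = "storycove.search.enabled", havingValue = "true", matchIfMissing = true)
public class SearchIndexScheduler {

    private static final Logger logger = LoggerFactory.getLogger(SearchIndexScheduler.class);

    private final SearchServiceAdapter searchServiceAdapter;

    public SearchIndexScheduler(SearchServiceAdapter searchServiceAdapter) {
        this.searchServiceAdapter = searchServiceAdapter;
    }

    // Assumed property name; the interval default mirrors the old scheduler (1 hour).
    @Scheduled(fixedRateString = "${storycove.search.reindex-interval:3600000}")
    public void reindexAll() {
        try {
            searchServiceAdapter.performCompleteReindex();
        } catch (Exception e) {
            logger.error("Scheduled search reindex failed", e);
        }
    }
}
```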
@@ -3,6 +3,7 @@ package com.storycove.security;
import com.storycove.util.JwtUtil;
import jakarta.servlet.FilterChain;
import jakarta.servlet.ServletException;
import jakarta.servlet.http.Cookie;
import jakarta.servlet.http.HttpServletRequest;
import jakarta.servlet.http.HttpServletResponse;
import org.springframework.security.authentication.UsernamePasswordAuthenticationToken;
@@ -28,13 +29,27 @@ public class JwtAuthenticationFilter extends OncePerRequestFilter {
                                    HttpServletResponse response,
                                    FilterChain filterChain) throws ServletException, IOException {

        String authHeader = request.getHeader("Authorization");
        String token = null;

        // First try to get token from Authorization header
        String authHeader = request.getHeader("Authorization");
        if (authHeader != null && authHeader.startsWith("Bearer ")) {
            token = authHeader.substring(7);
        }

        // If no token in header, try to get from cookies
        if (token == null) {
            Cookie[] cookies = request.getCookies();
            if (cookies != null) {
                for (Cookie cookie : cookies) {
                    if ("token".equals(cookie.getName())) {
                        token = cookie.getValue();
                        break;
                    }
                }
            }
        }

        if (token != null && jwtUtil.validateToken(token) && !jwtUtil.isTokenExpired(token)) {
            String subject = jwtUtil.getSubjectFromToken(token);

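With this change the filter accepts the JWT either as a `Bearer` header or as a `token` cookie. A minimal client-side illustration of the two equivalent calls is sketched below; the endpoint path and port are placeholders, not values from this diff, and the cookie name `token` is the one the filter reads.

```java
// Illustration only - hypothetical endpoint, assumes a previously issued JWT.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AuthRequestSketch {

    public static void main(String[] args) throws Exception {
        String jwt = System.getenv("STORYCOVE_JWT");
        HttpClient client = HttpClient.newHttpClient();

        // Variant 1: Authorization header, as before
        HttpRequest viaHeader = HttpRequest.newBuilder(URI.create("http://localhost:8080/api/stories"))
                .header("Authorization", "Bearer " + jwt)
                .build();

        // Variant 2: same token delivered as the "token" cookie the filter now falls back to
        HttpRequest viaCookie = HttpRequest.newBuilder(URI.create("http://localhost:8080/api/stories"))
                .header("Cookie", "token=" + jwt)
                .build();

        System.out.println(client.send(viaHeader, HttpResponse.BodyHandlers.ofString()).statusCode());
        System.out.println(client.send(viaCookie, HttpResponse.BodyHandlers.ofString()).statusCode());
    }
}
```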
@@ -11,21 +11,21 @@ import org.springframework.stereotype.Component;
import java.util.List;

@Component
@ConditionalOnProperty(name = "storycove.typesense.enabled", havingValue = "true", matchIfMissing = true)
@ConditionalOnProperty(name = "storycove.search.enabled", havingValue = "true", matchIfMissing = true)
public class AuthorIndexScheduler {

    private static final Logger logger = LoggerFactory.getLogger(AuthorIndexScheduler.class);

    private final AuthorService authorService;
    private final TypesenseService typesenseService;
    private final SearchServiceAdapter searchServiceAdapter;

    @Autowired
    public AuthorIndexScheduler(AuthorService authorService, TypesenseService typesenseService) {
    public AuthorIndexScheduler(AuthorService authorService, SearchServiceAdapter searchServiceAdapter) {
        this.authorService = authorService;
        this.typesenseService = typesenseService;
        this.searchServiceAdapter = searchServiceAdapter;
    }

    @Scheduled(fixedRateString = "${storycove.typesense.author-reindex-interval:7200000}") // 2 hours default
    @Scheduled(fixedRateString = "${storycove.search.author-reindex-interval:7200000}") // 2 hours default
    public void reindexAllAuthors() {
        try {
            logger.info("Starting scheduled author reindexing...");
@@ -34,7 +34,7 @@ public class AuthorIndexScheduler {
            logger.info("Found {} authors to reindex", allAuthors.size());

            if (!allAuthors.isEmpty()) {
                typesenseService.reindexAllAuthors(allAuthors);
                searchServiceAdapter.bulkIndexAuthors(allAuthors);
                logger.info("Successfully completed scheduled author reindexing");
            } else {
                logger.info("No authors found to reindex");

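From this hunk onward, schedulers and services talk to a single `SearchServiceAdapter` instead of `TypesenseService` directly. The adapter's own source is not included in this section, so the following is only a sketch of the surface implied by the call sites in this diff (`indexAuthor`, `updateAuthor`, `deleteAuthor`, `bulkIndexAuthors`, `indexStory`, `recreateIndices`, `performCompleteReindex`); any routing between engines behind it is an assumption drawn from the migration's parallel-run approach, not from shown code.

```java
// Sketch only - the real SearchServiceAdapter is not shown in this hunk.
// Method names come from the call sites in this diff; everything else is assumed.
import com.storycove.entity.Author;
import com.storycove.entity.Story;

import java.util.List;
import java.util.UUID;

public interface SearchServiceAdapter {

    // Single-document operations used by AuthorService and the restore path
    void indexStory(Story story);
    void indexAuthor(Author author);
    void updateAuthor(Author author);
    void deleteAuthor(UUID authorId);

    // Bulk operation used by AuthorIndexScheduler
    void bulkIndexAuthors(List<Author> authors);

    // Index lifecycle used by DatabaseManagementService
    void recreateIndices();
    void performCompleteReindex();
}
```

An implementation can delegate each call to either the new OpenSearch client or the legacy Typesense service depending on which engine is configured, which is what keeps the migration reversible while both code paths exist.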
@@ -28,12 +28,12 @@ public class AuthorService {
|
||||
private static final Logger logger = LoggerFactory.getLogger(AuthorService.class);
|
||||
|
||||
private final AuthorRepository authorRepository;
|
||||
private final TypesenseService typesenseService;
|
||||
private final SearchServiceAdapter searchServiceAdapter;
|
||||
|
||||
@Autowired
|
||||
public AuthorService(AuthorRepository authorRepository, @Autowired(required = false) TypesenseService typesenseService) {
|
||||
public AuthorService(AuthorRepository authorRepository, SearchServiceAdapter searchServiceAdapter) {
|
||||
this.authorRepository = authorRepository;
|
||||
this.typesenseService = typesenseService;
|
||||
this.searchServiceAdapter = searchServiceAdapter;
|
||||
}
|
||||
|
||||
@Transactional(readOnly = true)
|
||||
@@ -132,14 +132,8 @@ public class AuthorService {
|
||||
validateAuthorForCreate(author);
|
||||
Author savedAuthor = authorRepository.save(author);
|
||||
|
||||
// Index in Typesense
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.indexAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to index author in Typesense: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
// Index in OpenSearch
|
||||
searchServiceAdapter.indexAuthor(savedAuthor);
|
||||
|
||||
return savedAuthor;
|
||||
}
|
||||
@@ -156,14 +150,8 @@ public class AuthorService {
|
||||
updateAuthorFields(existingAuthor, authorUpdates);
|
||||
Author savedAuthor = authorRepository.save(existingAuthor);
|
||||
|
||||
// Update in Typesense
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
// Update in OpenSearch
|
||||
searchServiceAdapter.updateAuthor(savedAuthor);
|
||||
|
||||
return savedAuthor;
|
||||
}
|
||||
@@ -178,14 +166,8 @@ public class AuthorService {
|
||||
|
||||
authorRepository.delete(author);
|
||||
|
||||
// Remove from Typesense
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.deleteAuthor(id.toString());
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to delete author from Typesense: " + author.getName(), e);
|
||||
}
|
||||
}
|
||||
// Remove from OpenSearch
|
||||
searchServiceAdapter.deleteAuthor(id);
|
||||
}
|
||||
|
||||
public Author addUrl(UUID id, String url) {
|
||||
@@ -193,14 +175,8 @@ public class AuthorService {
|
||||
author.addUrl(url);
|
||||
Author savedAuthor = authorRepository.save(author);
|
||||
|
||||
// Update in Typesense
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after adding URL: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
// Update in OpenSearch
|
||||
searchServiceAdapter.updateAuthor(savedAuthor);
|
||||
|
||||
return savedAuthor;
|
||||
}
|
||||
@@ -210,14 +186,8 @@ public class AuthorService {
|
||||
author.removeUrl(url);
|
||||
Author savedAuthor = authorRepository.save(author);
|
||||
|
||||
// Update in Typesense
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after removing URL: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
// Update in OpenSearch
|
||||
searchServiceAdapter.updateAuthor(savedAuthor);
|
||||
|
||||
return savedAuthor;
|
||||
}
|
||||
@@ -242,7 +212,7 @@ public class AuthorService {
|
||||
rating, author.getName(), author.getAuthorRating());
|
||||
|
||||
author.setAuthorRating(rating);
|
||||
Author savedAuthor = authorRepository.save(author);
|
||||
authorRepository.save(author);
|
||||
|
||||
// Flush and refresh to ensure the entity is up-to-date
|
||||
authorRepository.flush();
|
||||
@@ -251,14 +221,8 @@ public class AuthorService {
|
||||
logger.debug("Saved author rating: {} for author: {}",
|
||||
refreshedAuthor.getAuthorRating(), refreshedAuthor.getName());
|
||||
|
||||
// Update in Typesense
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(refreshedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after rating: " + refreshedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
// Update in OpenSearch
|
||||
searchServiceAdapter.updateAuthor(refreshedAuthor);
|
||||
|
||||
return refreshedAuthor;
|
||||
}
|
||||
@@ -301,14 +265,8 @@ public class AuthorService {
|
||||
author.setAvatarImagePath(avatarPath);
|
||||
Author savedAuthor = authorRepository.save(author);
|
||||
|
||||
// Update in Typesense
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after setting avatar: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
// Update in OpenSearch
|
||||
searchServiceAdapter.updateAuthor(savedAuthor);
|
||||
|
||||
return savedAuthor;
|
||||
}
|
||||
@@ -318,14 +276,8 @@ public class AuthorService {
|
||||
author.setAvatarImagePath(null);
|
||||
Author savedAuthor = authorRepository.save(author);
|
||||
|
||||
// Update in Typesense
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after removing avatar: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
// Update in OpenSearch
|
||||
searchServiceAdapter.updateAuthor(savedAuthor);
|
||||
|
||||
return savedAuthor;
|
||||
}
|
||||
|
||||
@@ -11,14 +11,10 @@ import com.storycove.repository.CollectionRepository;
|
||||
import com.storycove.repository.CollectionStoryRepository;
|
||||
import com.storycove.repository.StoryRepository;
|
||||
import com.storycove.repository.TagRepository;
|
||||
import com.storycove.service.exception.DuplicateResourceException;
|
||||
import com.storycove.service.exception.ResourceNotFoundException;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.data.domain.Page;
|
||||
import org.springframework.data.domain.PageRequest;
|
||||
import org.springframework.data.domain.Pageable;
|
||||
import org.springframework.stereotype.Service;
|
||||
import org.springframework.transaction.annotation.Transactional;
|
||||
|
||||
@@ -35,7 +31,7 @@ public class CollectionService {
|
||||
private final CollectionStoryRepository collectionStoryRepository;
|
||||
private final StoryRepository storyRepository;
|
||||
private final TagRepository tagRepository;
|
||||
private final TypesenseService typesenseService;
|
||||
private final SearchServiceAdapter searchServiceAdapter;
|
||||
private final ReadingTimeService readingTimeService;
|
||||
|
||||
@Autowired
|
||||
@@ -43,13 +39,13 @@ public class CollectionService {
|
||||
CollectionStoryRepository collectionStoryRepository,
|
||||
StoryRepository storyRepository,
|
||||
TagRepository tagRepository,
|
||||
@Autowired(required = false) TypesenseService typesenseService,
|
||||
SearchServiceAdapter searchServiceAdapter,
|
||||
ReadingTimeService readingTimeService) {
|
||||
this.collectionRepository = collectionRepository;
|
||||
this.collectionStoryRepository = collectionStoryRepository;
|
||||
this.storyRepository = storyRepository;
|
||||
this.tagRepository = tagRepository;
|
||||
this.typesenseService = typesenseService;
|
||||
this.searchServiceAdapter = searchServiceAdapter;
|
||||
this.readingTimeService = readingTimeService;
|
||||
}
|
||||
|
||||
@@ -58,15 +54,12 @@ public class CollectionService {
|
||||
* This method MUST be used instead of JPA queries for listing collections
|
||||
*/
|
||||
public SearchResultDto<Collection> searchCollections(String query, List<String> tags, boolean includeArchived, int page, int limit) {
|
||||
if (typesenseService == null) {
|
||||
logger.warn("Typesense service not available, returning empty results");
|
||||
// Collections are currently handled at database level, not indexed in search engine
|
||||
// Return empty result for now as collections search is not implemented in OpenSearch
|
||||
logger.warn("Collections search not yet implemented in OpenSearch, returning empty results");
|
||||
return new SearchResultDto<>(new ArrayList<>(), 0, page, limit, query != null ? query : "", 0);
|
||||
}
|
||||
|
||||
// Delegate to TypesenseService for all search operations
|
||||
return typesenseService.searchCollections(query, tags, includeArchived, page, limit);
|
||||
}
|
||||
|
||||
/**
|
||||
* Find collection by ID with full details
|
||||
*/
|
||||
@@ -111,10 +104,7 @@ public class CollectionService {
|
||||
savedCollection = findById(savedCollection.getId());
|
||||
}
|
||||
|
||||
// Index in Typesense
|
||||
if (typesenseService != null) {
|
||||
typesenseService.indexCollection(savedCollection);
|
||||
}
|
||||
// Collections are not indexed in search engine yet
|
||||
|
||||
logger.info("Created collection: {} with {} stories", name, initialStoryIds != null ? initialStoryIds.size() : 0);
|
||||
return savedCollection;
|
||||
@@ -144,10 +134,7 @@ public class CollectionService {
|
||||
|
||||
Collection savedCollection = collectionRepository.save(collection);
|
||||
|
||||
// Update in Typesense
|
||||
if (typesenseService != null) {
|
||||
typesenseService.indexCollection(savedCollection);
|
||||
}
|
||||
// Collections are not indexed in search engine yet
|
||||
|
||||
logger.info("Updated collection: {}", id);
|
||||
return savedCollection;
|
||||
@@ -159,10 +146,7 @@ public class CollectionService {
|
||||
public void deleteCollection(UUID id) {
|
||||
Collection collection = findByIdBasic(id);
|
||||
|
||||
// Remove from Typesense first
|
||||
if (typesenseService != null) {
|
||||
typesenseService.removeCollection(id);
|
||||
}
|
||||
// Collections are not indexed in search engine yet
|
||||
|
||||
collectionRepository.delete(collection);
|
||||
logger.info("Deleted collection: {}", id);
|
||||
@@ -177,10 +161,7 @@ public class CollectionService {
|
||||
|
||||
Collection savedCollection = collectionRepository.save(collection);
|
||||
|
||||
// Update in Typesense
|
||||
if (typesenseService != null) {
|
||||
typesenseService.indexCollection(savedCollection);
|
||||
}
|
||||
// Collections are not indexed in search engine yet
|
||||
|
||||
logger.info("{} collection: {}", archived ? "Archived" : "Unarchived", id);
|
||||
return savedCollection;
|
||||
@@ -225,10 +206,7 @@ public class CollectionService {
|
||||
}
|
||||
|
||||
// Update collection in Typesense
|
||||
if (typesenseService != null) {
|
||||
Collection updatedCollection = findById(collectionId);
|
||||
typesenseService.indexCollection(updatedCollection);
|
||||
}
|
||||
// Collections are not indexed in search engine yet
|
||||
|
||||
long totalStories = collectionStoryRepository.countByCollectionId(collectionId);
|
||||
|
||||
@@ -253,10 +231,7 @@ public class CollectionService {
|
||||
collectionStoryRepository.delete(collectionStory);
|
||||
|
||||
// Update collection in Typesense
|
||||
if (typesenseService != null) {
|
||||
Collection updatedCollection = findById(collectionId);
|
||||
typesenseService.indexCollection(updatedCollection);
|
||||
}
|
||||
// Collections are not indexed in search engine yet
|
||||
|
||||
logger.info("Removed story {} from collection {}", storyId, collectionId);
|
||||
}
|
||||
@@ -266,7 +241,7 @@ public class CollectionService {
|
||||
*/
|
||||
@Transactional
|
||||
public void reorderStories(UUID collectionId, List<Map<String, Object>> storyOrders) {
|
||||
Collection collection = findByIdBasic(collectionId);
|
||||
findByIdBasic(collectionId); // Validate collection exists
|
||||
|
||||
// Two-phase update to avoid unique constraint violations:
|
||||
// Phase 1: Set all positions to negative values (temporary)
|
||||
@@ -289,10 +264,7 @@ public class CollectionService {
|
||||
}
|
||||
|
||||
// Update collection in Typesense
|
||||
if (typesenseService != null) {
|
||||
Collection updatedCollection = findById(collectionId);
|
||||
typesenseService.indexCollection(updatedCollection);
|
||||
}
|
||||
// Collections are not indexed in search engine yet
|
||||
|
||||
logger.info("Reordered {} stories in collection {}", storyOrders.size(), collectionId);
|
||||
}
|
||||
@@ -427,7 +399,7 @@ public class CollectionService {
|
||||
}
|
||||
|
||||
/**
|
||||
* Get all collections for indexing (used by TypesenseService)
|
||||
* Get all collections for indexing (used by SearchServiceAdapter)
|
||||
*/
|
||||
public List<Collection> findAllForIndexing() {
|
||||
return collectionRepository.findAllActiveCollections();
|
||||
|
||||
@@ -1,10 +1,12 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.fasterxml.jackson.databind.ObjectMapper;
|
||||
import com.storycove.entity.*;
|
||||
import com.storycove.repository.*;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.beans.factory.annotation.Qualifier;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.context.ApplicationContext;
|
||||
import org.springframework.context.ApplicationContextAware;
|
||||
import org.springframework.core.io.ByteArrayResource;
|
||||
import org.springframework.core.io.Resource;
|
||||
import org.springframework.stereotype.Service;
|
||||
@@ -23,11 +25,17 @@ import java.util.zip.ZipInputStream;
|
||||
import java.util.zip.ZipOutputStream;
|
||||
|
||||
@Service
|
||||
public class DatabaseManagementService {
|
||||
public class DatabaseManagementService implements ApplicationContextAware {
|
||||
|
||||
@Autowired
|
||||
@Qualifier("dataSource") // Use the primary routing datasource
|
||||
private DataSource dataSource;
|
||||
|
||||
// Use the routing datasource which automatically handles library switching
|
||||
private DataSource getDataSource() {
|
||||
return dataSource;
|
||||
}
|
||||
|
||||
@Autowired
|
||||
private StoryRepository storyRepository;
|
||||
|
||||
@@ -44,7 +52,10 @@ public class DatabaseManagementService {
|
||||
private CollectionRepository collectionRepository;
|
||||
|
||||
@Autowired
|
||||
private TypesenseService typesenseService;
|
||||
private SearchServiceAdapter searchServiceAdapter;
|
||||
|
||||
@Autowired
|
||||
private LibraryService libraryService;
|
||||
|
||||
@Autowired
|
||||
private ReadingPositionRepository readingPositionRepository;
|
||||
@@ -52,6 +63,13 @@ public class DatabaseManagementService {
|
||||
@Value("${storycove.images.upload-dir:/app/images}")
|
||||
private String uploadDir;
|
||||
|
||||
private ApplicationContext applicationContext;
|
||||
|
||||
@Override
|
||||
public void setApplicationContext(ApplicationContext applicationContext) {
|
||||
this.applicationContext = applicationContext;
|
||||
}
|
||||
|
||||
/**
|
||||
* Create a comprehensive backup including database and files in ZIP format
|
||||
*/
|
||||
@@ -80,7 +98,12 @@ public class DatabaseManagementService {
|
||||
* Restore from complete backup (ZIP format)
|
||||
*/
|
||||
public void restoreFromCompleteBackup(InputStream backupStream) throws IOException, SQLException {
|
||||
System.err.println("Starting complete backup restore...");
|
||||
String currentLibraryId = libraryService.getCurrentLibraryId();
|
||||
System.err.println("Starting complete backup restore for library: " + currentLibraryId);
|
||||
if (currentLibraryId == null) {
|
||||
throw new IllegalStateException("No current library active - please authenticate and select a library first");
|
||||
}
|
||||
|
||||
Path tempDir = Files.createTempDirectory("storycove-restore");
|
||||
System.err.println("Created temp directory: " + tempDir);
|
||||
|
||||
@@ -122,6 +145,17 @@ public class DatabaseManagementService {
|
||||
System.err.println("No files directory found in backup - skipping file restore.");
|
||||
}
|
||||
|
||||
// 6. Trigger complete search index reindex after data restoration
|
||||
try {
|
||||
System.err.println("Starting search index reindex after restore...");
|
||||
SearchServiceAdapter searchServiceAdapter = applicationContext.getBean(SearchServiceAdapter.class);
|
||||
searchServiceAdapter.performCompleteReindex();
|
||||
System.err.println("Search index reindex completed successfully.");
|
||||
} catch (Exception e) {
|
||||
System.err.println("Warning: Failed to reindex search after restore: " + e.getMessage());
|
||||
// Don't fail the entire restore for search issues
|
||||
}
|
||||
|
||||
System.err.println("Complete backup restore finished successfully.");
|
||||
|
||||
} catch (Exception e) {
|
||||
@@ -139,7 +173,7 @@ public class DatabaseManagementService {
|
||||
public Resource createBackup() throws SQLException, IOException {
|
||||
StringBuilder sqlDump = new StringBuilder();
|
||||
|
||||
try (Connection connection = dataSource.getConnection()) {
|
||||
try (Connection connection = getDataSource().getConnection()) {
|
||||
// Add header
|
||||
sqlDump.append("-- StoryCove Database Backup\n");
|
||||
sqlDump.append("-- Generated at: ").append(new java.util.Date()).append("\n\n");
|
||||
@@ -225,10 +259,13 @@ public class DatabaseManagementService {
|
||||
}
|
||||
|
||||
// Execute the SQL statements
|
||||
try (Connection connection = dataSource.getConnection()) {
|
||||
try (Connection connection = getDataSource().getConnection()) {
|
||||
connection.setAutoCommit(false);
|
||||
|
||||
try {
|
||||
// Ensure database schema exists before restoring data
|
||||
ensureDatabaseSchemaExists(connection);
|
||||
|
||||
// Parse SQL statements properly (handle semicolons inside string literals)
|
||||
List<String> statements = parseStatements(sqlContent.toString());
|
||||
|
||||
@@ -261,14 +298,22 @@ public class DatabaseManagementService {
|
||||
|
||||
// Reindex search after successful restore
|
||||
try {
|
||||
System.err.println("Starting Typesense reindex after successful restore...");
|
||||
typesenseService.recreateStoriesCollection();
|
||||
typesenseService.recreateAuthorsCollection();
|
||||
String currentLibraryId = libraryService.getCurrentLibraryId();
|
||||
System.err.println("Starting search reindex after successful restore for library: " + currentLibraryId);
|
||||
if (currentLibraryId == null) {
|
||||
System.err.println("ERROR: No current library set during restore - cannot reindex search!");
|
||||
throw new IllegalStateException("No current library active during restore");
|
||||
}
|
||||
|
||||
// Manually trigger reindexing using the correct database connection
|
||||
System.err.println("Triggering manual reindex from library-specific database for library: " + currentLibraryId);
|
||||
reindexStoriesAndAuthorsFromCurrentDatabase();
|
||||
|
||||
// Note: Collections collection will be recreated when needed by the service
|
||||
System.err.println("Typesense reindex completed successfully.");
|
||||
System.err.println("Search reindex completed successfully for library: " + currentLibraryId);
|
||||
} catch (Exception e) {
|
||||
// Log the error but don't fail the restore
|
||||
System.err.println("Warning: Failed to reindex Typesense after restore: " + e.getMessage());
|
||||
System.err.println("Warning: Failed to reindex search after restore: " + e.getMessage());
|
||||
e.printStackTrace();
|
||||
}
|
||||
|
||||
@@ -306,7 +351,7 @@ public class DatabaseManagementService {
|
||||
totalDeleted = collectionCount + storyCount + authorCount + seriesCount + tagCount;
|
||||
|
||||
// Note: Search indexes will need to be manually recreated after clearing
|
||||
// Use the settings page to recreate Typesense collections after clearing the database
|
||||
// Use the settings page to recreate search indices after clearing the database
|
||||
|
||||
} catch (Exception e) {
|
||||
throw new RuntimeException("Failed to clear database: " + e.getMessage(), e);
|
||||
@@ -419,10 +464,14 @@ public class DatabaseManagementService {
|
||||
}
|
||||
|
||||
/**
|
||||
* Clear all uploaded files
|
||||
* Clear all uploaded files for the current library
|
||||
*/
|
||||
private void clearAllFiles() {
|
||||
Path imagesPath = Paths.get(uploadDir);
|
||||
// Use library-specific image path
|
||||
String libraryImagePath = libraryService.getCurrentImagePath();
|
||||
Path imagesPath = Paths.get(uploadDir + libraryImagePath);
|
||||
|
||||
System.err.println("Clearing files for library: " + libraryService.getCurrentLibraryId() + " at path: " + imagesPath);
|
||||
|
||||
if (Files.exists(imagesPath)) {
|
||||
try {
|
||||
@@ -431,6 +480,7 @@ public class DatabaseManagementService {
|
||||
.forEach(filePath -> {
|
||||
try {
|
||||
Files.deleteIfExists(filePath);
|
||||
System.err.println("Deleted file: " + filePath);
|
||||
} catch (IOException e) {
|
||||
System.err.println("Warning: Failed to delete file: " + filePath + " - " + e.getMessage());
|
||||
}
|
||||
@@ -438,19 +488,27 @@ public class DatabaseManagementService {
|
||||
} catch (IOException e) {
|
||||
System.err.println("Warning: Failed to clear files directory: " + e.getMessage());
|
||||
}
|
||||
} else {
|
||||
System.err.println("Library image directory does not exist: " + imagesPath);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Clear search indexes
|
||||
* Clear search indexes (recreate empty collections)
|
||||
*/
|
||||
private void clearSearchIndexes() {
|
||||
try {
|
||||
System.err.println("Clearing search indexes after complete clear...");
|
||||
typesenseService.recreateStoriesCollection();
|
||||
typesenseService.recreateAuthorsCollection();
|
||||
String currentLibraryId = libraryService.getCurrentLibraryId();
|
||||
System.err.println("Clearing search indexes after complete clear for library: " + currentLibraryId);
|
||||
if (currentLibraryId == null) {
|
||||
System.err.println("WARNING: No current library set during clear - skipping search index clear");
|
||||
return;
|
||||
}
|
||||
|
||||
// For clearing, we only want to recreate empty collections (no data to index)
|
||||
searchServiceAdapter.recreateIndices();
|
||||
// Note: Collections collection will be recreated when needed by the service
|
||||
System.err.println("Search indexes cleared successfully.");
|
||||
System.err.println("Search indexes cleared successfully for library: " + currentLibraryId);
|
||||
} catch (Exception e) {
|
||||
// Log the error but don't fail the clear operation
|
||||
System.err.println("Warning: Failed to clear search indexes: " + e.getMessage());
|
||||
@@ -458,6 +516,219 @@ public class DatabaseManagementService {
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Ensure database schema exists before restoring backup data.
|
||||
* This creates all necessary tables, indexes, and constraints if they don't exist.
|
||||
*/
|
||||
private void ensureDatabaseSchemaExists(Connection connection) throws SQLException {
|
||||
try {
|
||||
// Check if a key table exists to determine if schema is already created
|
||||
String checkTableQuery = "SELECT 1 FROM information_schema.tables WHERE table_name = 'stories' LIMIT 1";
|
||||
try (PreparedStatement stmt = connection.prepareStatement(checkTableQuery);
|
||||
var resultSet = stmt.executeQuery()) {
|
||||
if (resultSet.next()) {
|
||||
System.err.println("Database schema already exists, skipping schema creation.");
|
||||
return; // Schema exists
|
||||
}
|
||||
}
|
||||
|
||||
System.err.println("Creating database schema for restore in library: " + libraryService.getCurrentLibraryId());
|
||||
|
||||
// Create the schema using the same DDL as LibraryService
|
||||
String[] createTableStatements = {
|
||||
// Authors table
|
||||
"""
|
||||
CREATE TABLE authors (
|
||||
author_rating integer,
|
||||
created_at timestamp(6) not null,
|
||||
updated_at timestamp(6) not null,
|
||||
id uuid not null,
|
||||
avatar_image_path varchar(255),
|
||||
name varchar(255) not null,
|
||||
notes TEXT,
|
||||
primary key (id)
|
||||
)
|
||||
""",
|
||||
|
||||
// Author URLs table
|
||||
"""
|
||||
CREATE TABLE author_urls (
|
||||
author_id uuid not null,
|
||||
url varchar(255)
|
||||
)
|
||||
""",
|
||||
|
||||
// Series table
|
||||
"""
|
||||
CREATE TABLE series (
|
||||
created_at timestamp(6) not null,
|
||||
id uuid not null,
|
||||
description varchar(1000),
|
||||
name varchar(255) not null,
|
||||
primary key (id)
|
||||
)
|
||||
""",
|
||||
|
||||
// Tags table
|
||||
"""
|
||||
CREATE TABLE tags (
|
||||
color varchar(7),
|
||||
created_at timestamp(6) not null,
|
||||
id uuid not null,
|
||||
description varchar(500),
|
||||
name varchar(255) not null unique,
|
||||
primary key (id)
|
||||
)
|
||||
""",
|
||||
|
||||
// Tag aliases table
|
||||
"""
|
||||
CREATE TABLE tag_aliases (
|
||||
created_from_merge boolean not null,
|
||||
created_at timestamp(6) not null,
|
||||
canonical_tag_id uuid not null,
|
||||
id uuid not null,
|
||||
alias_name varchar(255) not null unique,
|
||||
primary key (id)
|
||||
)
|
||||
""",
|
||||
|
||||
// Collections table
|
||||
"""
|
||||
CREATE TABLE collections (
|
||||
is_archived boolean not null,
|
||||
rating integer,
|
||||
created_at timestamp(6) not null,
|
||||
updated_at timestamp(6) not null,
|
||||
id uuid not null,
|
||||
cover_image_path varchar(500),
|
||||
name varchar(500) not null,
|
||||
description TEXT,
|
||||
primary key (id)
|
||||
)
|
||||
""",
|
||||
|
||||
// Stories table
|
||||
"""
|
||||
CREATE TABLE stories (
|
||||
is_read boolean,
|
||||
rating integer,
|
||||
reading_position integer,
|
||||
volume integer,
|
||||
word_count integer,
|
||||
created_at timestamp(6) not null,
|
||||
last_read_at timestamp(6),
|
||||
updated_at timestamp(6) not null,
|
||||
author_id uuid,
|
||||
id uuid not null,
|
||||
series_id uuid,
|
||||
description varchar(1000),
|
||||
content_html TEXT,
|
||||
content_plain TEXT,
|
||||
cover_path varchar(255),
|
||||
source_url varchar(255),
|
||||
summary TEXT,
|
||||
title varchar(255) not null,
|
||||
primary key (id)
|
||||
)
|
||||
""",
|
||||
|
||||
// Reading positions table
|
||||
"""
|
||||
CREATE TABLE reading_positions (
|
||||
chapter_index integer,
|
||||
character_position integer,
|
||||
percentage_complete float(53),
|
||||
word_position integer,
|
||||
created_at timestamp(6) not null,
|
||||
updated_at timestamp(6) not null,
|
||||
id uuid not null,
|
||||
story_id uuid not null,
|
||||
context_after varchar(500),
|
||||
context_before varchar(500),
|
||||
chapter_title varchar(255),
|
||||
epub_cfi TEXT,
|
||||
primary key (id)
|
||||
)
|
||||
""",
|
||||
|
||||
// Junction tables
|
||||
"""
|
||||
CREATE TABLE story_tags (
|
||||
story_id uuid not null,
|
||||
tag_id uuid not null,
|
||||
primary key (story_id, tag_id)
|
||||
)
|
||||
""",
|
||||
|
||||
"""
|
||||
CREATE TABLE collection_stories (
|
||||
position integer not null,
|
||||
added_at timestamp(6) not null,
|
||||
collection_id uuid not null,
|
||||
story_id uuid not null,
|
||||
primary key (collection_id, story_id),
|
||||
unique (collection_id, position)
|
||||
)
|
||||
""",
|
||||
|
||||
"""
|
||||
CREATE TABLE collection_tags (
|
||||
collection_id uuid not null,
|
||||
tag_id uuid not null,
|
||||
primary key (collection_id, tag_id)
|
||||
)
|
||||
"""
|
||||
};
|
||||
|
||||
String[] createIndexStatements = {
|
||||
"CREATE INDEX idx_reading_position_story ON reading_positions (story_id)"
|
||||
};
|
||||
|
||||
String[] createConstraintStatements = {
|
||||
// Foreign key constraints
|
||||
"ALTER TABLE author_urls ADD CONSTRAINT FKdqhp51m0uveybsts098gd79uo FOREIGN KEY (author_id) REFERENCES authors",
|
||||
"ALTER TABLE stories ADD CONSTRAINT FKhwecpqeaxy40ftrctef1u7gw7 FOREIGN KEY (author_id) REFERENCES authors",
|
||||
"ALTER TABLE stories ADD CONSTRAINT FK1kulyvy7wwcolp2gkndt57cp7 FOREIGN KEY (series_id) REFERENCES series",
|
||||
"ALTER TABLE reading_positions ADD CONSTRAINT FKglfhdhflan3pgyr2u0gxi21i5 FOREIGN KEY (story_id) REFERENCES stories",
|
||||
"ALTER TABLE story_tags ADD CONSTRAINT FKmans33ijt0nf65t0sng2r848j FOREIGN KEY (tag_id) REFERENCES tags",
|
||||
"ALTER TABLE story_tags ADD CONSTRAINT FKq9guid7swnjxwdpgxj3jo1rsi FOREIGN KEY (story_id) REFERENCES stories",
|
||||
"ALTER TABLE tag_aliases ADD CONSTRAINT FKqfsawmcj3ey4yycb6958y24ch FOREIGN KEY (canonical_tag_id) REFERENCES tags",
|
||||
"ALTER TABLE collection_stories ADD CONSTRAINT FKr55ho4vhj0wp03x13iskr1jds FOREIGN KEY (collection_id) REFERENCES collections",
|
||||
"ALTER TABLE collection_stories ADD CONSTRAINT FK7n41tbbrt7r2e81hpu3612r1o FOREIGN KEY (story_id) REFERENCES stories",
|
||||
"ALTER TABLE collection_tags ADD CONSTRAINT FKceq7ggev8n8ibjui1x5yo4x67 FOREIGN KEY (tag_id) REFERENCES tags",
|
||||
"ALTER TABLE collection_tags ADD CONSTRAINT FKq9sa5s8csdpbphrvb48tts8jt FOREIGN KEY (collection_id) REFERENCES collections"
|
||||
};
|
||||
|
||||
// Create tables
|
||||
for (String sql : createTableStatements) {
|
||||
try (var statement = connection.createStatement()) {
|
||||
statement.executeUpdate(sql);
|
||||
}
|
||||
}
|
||||
|
||||
// Create indexes
|
||||
for (String sql : createIndexStatements) {
|
||||
try (var statement = connection.createStatement()) {
|
||||
statement.executeUpdate(sql);
|
||||
}
|
||||
}
|
||||
|
||||
// Create constraints
|
||||
for (String sql : createConstraintStatements) {
|
||||
try (var statement = connection.createStatement()) {
|
||||
statement.executeUpdate(sql);
|
||||
}
|
||||
}
|
||||
|
||||
System.err.println("Database schema created successfully for restore.");
|
||||
|
||||
} catch (SQLException e) {
|
||||
System.err.println("Error creating database schema: " + e.getMessage());
|
||||
throw e;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Add database dump to ZIP archive
|
||||
*/
|
||||
@@ -479,12 +750,17 @@ public class DatabaseManagementService {
|
||||
}
|
||||
|
||||
/**
|
||||
* Add all files to ZIP archive
|
||||
* Add all files to ZIP archive for the current library
|
||||
*/
|
||||
private void addFilesToZip(ZipOutputStream zipOut) throws IOException {
|
||||
Path imagesPath = Paths.get(uploadDir);
|
||||
// Use library-specific image path
|
||||
String libraryImagePath = libraryService.getCurrentImagePath();
|
||||
Path imagesPath = Paths.get(uploadDir + libraryImagePath);
|
||||
|
||||
System.err.println("Adding files to backup for library: " + libraryService.getCurrentLibraryId() + " from path: " + imagesPath);
|
||||
|
||||
if (!Files.exists(imagesPath)) {
|
||||
System.err.println("Library image directory does not exist, skipping file backup: " + imagesPath);
|
||||
return;
|
||||
}
|
||||
|
||||
@@ -499,6 +775,7 @@ public class DatabaseManagementService {
|
||||
zipOut.putNextEntry(entry);
|
||||
Files.copy(filePath, zipOut);
|
||||
zipOut.closeEntry();
|
||||
System.err.println("Added file to backup: " + zipEntryName);
|
||||
} catch (IOException e) {
|
||||
throw new RuntimeException("Failed to add file to backup: " + filePath, e);
|
||||
}
|
||||
@@ -515,9 +792,19 @@ public class DatabaseManagementService {
|
||||
metadata.put("timestamp", LocalDateTime.now().format(DateTimeFormatter.ISO_LOCAL_DATE_TIME));
|
||||
metadata.put("generator", "StoryCove Database Management Service");
|
||||
|
||||
// Add library information
|
||||
var currentLibrary = libraryService.getCurrentLibrary();
|
||||
if (currentLibrary != null) {
|
||||
Map<String, Object> libraryInfo = new HashMap<>();
|
||||
libraryInfo.put("id", currentLibrary.getId());
|
||||
libraryInfo.put("name", currentLibrary.getName());
|
||||
libraryInfo.put("description", currentLibrary.getDescription());
|
||||
metadata.put("library", libraryInfo);
|
||||
}
|
||||
|
||||
// Add statistics
|
||||
Map<String, Object> stats = new HashMap<>();
|
||||
try (Connection connection = dataSource.getConnection()) {
|
||||
try (Connection connection = getDataSource().getConnection()) {
|
||||
stats.put("stories", getTableCount(connection, "stories"));
|
||||
stats.put("authors", getTableCount(connection, "authors"));
|
||||
stats.put("collections", getTableCount(connection, "collections"));
|
||||
@@ -526,8 +813,9 @@ public class DatabaseManagementService {
|
||||
}
|
||||
metadata.put("statistics", stats);
|
||||
|
||||
// Count files
|
||||
Path imagesPath = Paths.get(uploadDir);
|
||||
// Count files for current library
|
||||
String libraryImagePath = libraryService.getCurrentImagePath();
|
||||
Path imagesPath = Paths.get(uploadDir + libraryImagePath);
|
||||
int fileCount = 0;
|
||||
if (Files.exists(imagesPath)) {
|
||||
fileCount = (int) Files.walk(imagesPath).filter(Files::isRegularFile).count();
|
||||
@@ -587,6 +875,7 @@ public class DatabaseManagementService {
|
||||
// Validate metadata
|
||||
try {
|
||||
ObjectMapper mapper = new ObjectMapper();
|
||||
@SuppressWarnings("unchecked")
|
||||
Map<String, Object> metadata = mapper.readValue(Files.newInputStream(metadataFile), Map.class);
|
||||
|
||||
String format = (String) metadata.get("format");
|
||||
@@ -605,10 +894,14 @@ public class DatabaseManagementService {
|
||||
}
|
||||
|
||||
/**
|
||||
* Restore files from backup
|
||||
* Restore files from backup to the current library's directory
|
||||
*/
|
||||
private void restoreFiles(Path filesDir) throws IOException {
|
||||
Path targetDir = Paths.get(uploadDir);
|
||||
// Use library-specific image path
|
||||
String libraryImagePath = libraryService.getCurrentImagePath();
|
||||
Path targetDir = Paths.get(uploadDir + libraryImagePath);
|
||||
|
||||
System.err.println("Restoring files for library: " + libraryService.getCurrentLibraryId() + " to path: " + targetDir);
|
||||
Files.createDirectories(targetDir);
|
||||
|
||||
Files.walk(filesDir)
|
||||
@@ -620,6 +913,7 @@ public class DatabaseManagementService {
|
||||
|
||||
Files.createDirectories(targetFile.getParent());
|
||||
Files.copy(sourceFile, targetFile, StandardCopyOption.REPLACE_EXISTING);
|
||||
System.err.println("Restored file: " + relativePath + " to " + targetFile);
|
||||
} catch (IOException e) {
|
||||
throw new RuntimeException("Failed to restore file: " + sourceFile, e);
|
||||
}
|
||||
@@ -655,4 +949,168 @@ public class DatabaseManagementService {
|
||||
return 0;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Manually reindex stories and authors from the current library's database
|
||||
* This bypasses the repository layer and uses direct database access
|
||||
*/
|
||||
private void reindexStoriesAndAuthorsFromCurrentDatabase() throws SQLException {
|
||||
try (Connection connection = getDataSource().getConnection()) {
|
||||
// First, recreate empty collections
|
||||
try {
|
||||
searchServiceAdapter.recreateIndices();
|
||||
} catch (Exception e) {
|
||||
throw new SQLException("Failed to recreate search indices", e);
|
||||
}
|
||||
|
||||
// Count and reindex stories with full author and series information
|
||||
int storyCount = 0;
|
||||
String storyQuery = "SELECT s.id, s.title, s.summary, s.description, s.content_html, s.content_plain, s.source_url, s.cover_path, " +
|
||||
"s.word_count, s.rating, s.volume, s.is_read, s.reading_position, s.last_read_at, s.author_id, s.series_id, " +
|
||||
"s.created_at, s.updated_at, " +
|
||||
"a.name as author_name, a.notes as author_notes, a.avatar_image_path as author_avatar, a.author_rating, " +
|
||||
"a.created_at as author_created_at, a.updated_at as author_updated_at, " +
|
||||
"ser.name as series_name, ser.description as series_description, " +
|
||||
"ser.created_at as series_created_at " +
|
||||
"FROM stories s " +
|
||||
"LEFT JOIN authors a ON s.author_id = a.id " +
|
||||
"LEFT JOIN series ser ON s.series_id = ser.id";
|
||||
|
||||
try (PreparedStatement stmt = connection.prepareStatement(storyQuery);
|
||||
ResultSet rs = stmt.executeQuery()) {
|
||||
|
||||
while (rs.next()) {
|
||||
// Create a complete Story object for indexing
|
||||
var story = createStoryFromResultSet(rs);
|
||||
searchServiceAdapter.indexStory(story);
|
||||
storyCount++;
|
||||
}
|
||||
}
|
||||
|
||||
// Count and reindex authors
|
||||
int authorCount = 0;
|
||||
String authorQuery = "SELECT id, name, notes, avatar_image_path, author_rating, created_at, updated_at FROM authors";
|
||||
|
||||
try (PreparedStatement stmt = connection.prepareStatement(authorQuery);
|
||||
ResultSet rs = stmt.executeQuery()) {
|
||||
|
||||
while (rs.next()) {
|
||||
// Create a minimal Author object for indexing
|
||||
var author = createAuthorFromResultSet(rs);
|
||||
searchServiceAdapter.indexAuthor(author);
|
||||
authorCount++;
|
||||
}
|
||||
}
|
||||
|
||||
System.err.println("Reindexed " + storyCount + " stories and " + authorCount + " authors from library database");
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Create a Story entity from ResultSet for indexing purposes (includes joined author/series data)
|
||||
*/
|
||||
private com.storycove.entity.Story createStoryFromResultSet(ResultSet rs) throws SQLException {
|
||||
var story = new com.storycove.entity.Story();
|
||||
story.setId(UUID.fromString(rs.getString("id")));
|
||||
story.setTitle(rs.getString("title"));
|
||||
story.setSummary(rs.getString("summary"));
|
||||
story.setDescription(rs.getString("description"));
|
||||
story.setContentHtml(rs.getString("content_html"));
|
||||
// Note: contentPlain will be auto-generated from contentHtml by the entity
|
||||
story.setSourceUrl(rs.getString("source_url"));
|
||||
story.setCoverPath(rs.getString("cover_path"));
|
||||
story.setWordCount(rs.getInt("word_count"));
|
||||
story.setRating(rs.getInt("rating"));
|
||||
story.setVolume(rs.getInt("volume"));
|
||||
story.setIsRead(rs.getBoolean("is_read"));
|
||||
story.setReadingPosition(rs.getInt("reading_position"));
|
||||
|
||||
var lastReadAtTimestamp = rs.getTimestamp("last_read_at");
|
||||
if (lastReadAtTimestamp != null) {
|
||||
story.setLastReadAt(lastReadAtTimestamp.toLocalDateTime());
|
||||
}
|
||||
|
||||
var createdAtTimestamp = rs.getTimestamp("created_at");
|
||||
if (createdAtTimestamp != null) {
|
||||
story.setCreatedAt(createdAtTimestamp.toLocalDateTime());
|
||||
}
|
||||
|
||||
var updatedAtTimestamp = rs.getTimestamp("updated_at");
|
||||
if (updatedAtTimestamp != null) {
|
||||
story.setUpdatedAt(updatedAtTimestamp.toLocalDateTime());
|
||||
}
|
||||
|
||||
// Set complete author information
|
||||
String authorIdStr = rs.getString("author_id");
|
||||
if (authorIdStr != null) {
|
||||
var author = new com.storycove.entity.Author();
|
||||
author.setId(UUID.fromString(authorIdStr));
|
||||
author.setName(rs.getString("author_name"));
|
||||
author.setNotes(rs.getString("author_notes"));
|
||||
author.setAvatarImagePath(rs.getString("author_avatar"));
|
||||
|
||||
Integer authorRating = rs.getInt("author_rating");
|
||||
if (!rs.wasNull()) {
|
||||
author.setAuthorRating(authorRating);
|
||||
}
|
||||
|
||||
var authorCreatedAt = rs.getTimestamp("author_created_at");
|
||||
if (authorCreatedAt != null) {
|
||||
author.setCreatedAt(authorCreatedAt.toLocalDateTime());
|
||||
}
|
||||
|
||||
var authorUpdatedAt = rs.getTimestamp("author_updated_at");
|
||||
if (authorUpdatedAt != null) {
|
||||
author.setUpdatedAt(authorUpdatedAt.toLocalDateTime());
|
||||
}
|
||||
|
||||
story.setAuthor(author);
|
||||
}
|
||||
|
||||
// Set complete series information
|
||||
String seriesIdStr = rs.getString("series_id");
|
||||
if (seriesIdStr != null) {
|
||||
var series = new com.storycove.entity.Series();
|
||||
series.setId(UUID.fromString(seriesIdStr));
|
||||
series.setName(rs.getString("series_name"));
|
||||
series.setDescription(rs.getString("series_description"));
|
||||
|
||||
var seriesCreatedAt = rs.getTimestamp("series_created_at");
|
||||
if (seriesCreatedAt != null) {
|
||||
series.setCreatedAt(seriesCreatedAt.toLocalDateTime());
|
||||
}
|
||||
|
||||
story.setSeries(series);
|
||||
}
|
||||
|
||||
return story;
|
||||
}
|
||||
|
||||
/**
|
||||
* Create an Author entity from ResultSet for indexing purposes
|
||||
*/
|
||||
private com.storycove.entity.Author createAuthorFromResultSet(ResultSet rs) throws SQLException {
|
||||
var author = new com.storycove.entity.Author();
|
||||
author.setId(UUID.fromString(rs.getString("id")));
|
||||
author.setName(rs.getString("name"));
|
||||
author.setNotes(rs.getString("notes"));
|
||||
author.setAvatarImagePath(rs.getString("avatar_image_path"));
|
||||
|
||||
Integer rating = rs.getInt("author_rating");
|
||||
if (!rs.wasNull()) {
|
||||
author.setAuthorRating(rating);
|
||||
}
|
||||
|
||||
var createdAtTimestamp = rs.getTimestamp("created_at");
|
||||
if (createdAtTimestamp != null) {
|
||||
author.setCreatedAt(createdAtTimestamp.toLocalDateTime());
|
||||
}
|
||||
|
||||
var updatedAtTimestamp = rs.getTimestamp("updated_at");
|
||||
if (updatedAtTimestamp != null) {
|
||||
author.setUpdatedAt(updatedAtTimestamp.toLocalDateTime());
|
||||
}
|
||||
|
||||
return author;
|
||||
}
|
||||
}
|
||||
@@ -21,7 +21,6 @@ import org.springframework.stereotype.Service;
|
||||
import org.springframework.transaction.annotation.Transactional;
|
||||
|
||||
import java.io.ByteArrayOutputStream;
|
||||
import java.io.FileInputStream;
|
||||
import java.io.IOException;
|
||||
import java.nio.file.Files;
|
||||
import java.nio.file.Path;
|
||||
|
||||
@@ -26,8 +26,6 @@ import java.io.InputStream;
|
||||
import java.util.ArrayList;
|
||||
import java.util.List;
|
||||
import java.util.Optional;
|
||||
import java.util.UUID;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
@Service
|
||||
@Transactional
|
||||
@@ -76,6 +74,34 @@ public class EPUBImportService {
|
||||
|
||||
Story savedStory = storyService.create(story);
|
||||
|
||||
// Process embedded images if content contains any
|
||||
String originalContent = story.getContentHtml();
|
||||
if (originalContent != null && originalContent.contains("<img")) {
|
||||
try {
|
||||
ImageService.ContentImageProcessingResult imageResult =
|
||||
imageService.processContentImages(originalContent, savedStory.getId());
|
||||
|
||||
// Update story content with processed images if changed
|
||||
if (!imageResult.getProcessedContent().equals(originalContent)) {
|
||||
savedStory.setContentHtml(imageResult.getProcessedContent());
|
||||
savedStory = storyService.update(savedStory.getId(), savedStory);
|
||||
|
||||
// Log the image processing results
|
||||
System.out.println("EPUB Import - Image processing completed for story " + savedStory.getId() +
|
||||
". Downloaded " + imageResult.getDownloadedImages().size() + " images.");
|
||||
|
||||
if (imageResult.hasWarnings()) {
|
||||
System.out.println("EPUB Import - Image processing warnings: " +
|
||||
String.join(", ", imageResult.getWarnings()));
|
||||
}
|
||||
}
|
||||
} catch (Exception e) {
|
||||
// Log error but don't fail the import
|
||||
System.err.println("EPUB Import - Failed to process embedded images for story " +
|
||||
savedStory.getId() + ": " + e.getMessage());
|
||||
}
|
||||
}
|
||||
|
||||
EPUBImportResponse response = EPUBImportResponse.success(savedStory.getId(), savedStory.getTitle());
|
||||
response.setWordCount(savedStory.getWordCount());
|
||||
response.setTotalChapters(book.getSpine().size());
|
||||
|
||||
@@ -54,7 +54,7 @@ public class HtmlSanitizationService {
|
||||
"p", "br", "div", "span", "h1", "h2", "h3", "h4", "h5", "h6",
|
||||
"b", "strong", "i", "em", "u", "s", "strike", "del", "ins",
|
||||
"sup", "sub", "small", "big", "mark", "pre", "code",
|
||||
"ul", "ol", "li", "dl", "dt", "dd", "a",
|
||||
"ul", "ol", "li", "dl", "dt", "dd", "a", "img",
|
||||
"table", "thead", "tbody", "tfoot", "tr", "th", "td", "caption",
|
||||
"blockquote", "cite", "q", "hr"
|
||||
));
|
||||
@@ -65,7 +65,7 @@ public class HtmlSanitizationService {
|
||||
}
|
||||
|
||||
private void createSafelist() {
|
||||
this.allowlist = new Safelist();
|
||||
this.allowlist = Safelist.relaxed();
|
||||
|
||||
// Add allowed tags
|
||||
if (config.getAllowedTags() != null) {
|
||||
@@ -83,7 +83,12 @@ public class HtmlSanitizationService {
|
||||
}
|
||||
}
|
||||
|
||||
// Configure allowed protocols for specific attributes (e.g., href)
|
||||
// Special handling for img tags - allow all src attributes and validate later
|
||||
allowlist.removeProtocols("img", "src", "http", "https");
|
||||
// This is the key: preserve relative URLs by not restricting them
|
||||
allowlist.preserveRelativeLinks(true);
|
||||
|
||||
// Configure allowed protocols for other attributes
|
||||
if (config.getAllowedProtocols() != null) {
|
||||
for (Map.Entry<String, Map<String, List<String>>> tagEntry : config.getAllowedProtocols().entrySet()) {
|
||||
String tag = tagEntry.getKey();
|
||||
@@ -94,7 +99,8 @@ public class HtmlSanitizationService {
|
||||
String attribute = attrEntry.getKey();
|
||||
List<String> protocols = attrEntry.getValue();
|
||||
|
||||
if (protocols != null) {
|
||||
if (protocols != null && !("img".equals(tag) && "src".equals(attribute))) {
|
||||
// Skip img src since we handled it above
|
||||
allowlist.addProtocols(tag, attribute, protocols.toArray(new String[0]));
|
||||
}
|
||||
}
|
||||
@@ -102,6 +108,8 @@ public class HtmlSanitizationService {
|
||||
}
|
||||
}
|
||||
|
||||
logger.info("Configured Jsoup Safelist with preserveRelativeLinks=true for local image URLs");
|
||||
|
||||
// Remove specific attributes if needed (deprecated in favor of protocol control)
|
||||
if (config.getRemovedAttributes() != null) {
|
||||
for (Map.Entry<String, List<String>> entry : config.getRemovedAttributes().entrySet()) {
|
||||
@@ -133,8 +141,10 @@ public class HtmlSanitizationService {
|
||||
if (html == null || html.trim().isEmpty()) {
|
||||
return "";
|
||||
}
|
||||
|
||||
        return Jsoup.clean(html, allowlist);
        logger.info("Content before sanitization: " + html);
        String sanitizedHtml = Jsoup.clean(html, allowlist.preserveRelativeLinks(true));
        logger.info("Content after sanitization: " + sanitizedHtml);
        return sanitizedHtml;
    }
|
||||
|
||||
public String extractPlainText(String html) {
|
||||
|
||||
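The sanitizer changes above move to `Safelist.relaxed()`, allow `img`, drop the http/https protocol restriction on `img src`, and preserve relative links, so that locally hosted images (such as the content images downloaded by `ImageService` below) survive cleaning. A small standalone sketch of that jsoup behaviour, independent of the StoryCove config plumbing, is shown here; the image path in the sample HTML is a made-up example.

```java
import org.jsoup.Jsoup;
import org.jsoup.safety.Safelist;

public class RelativeImageSanitizationSketch {

    public static void main(String[] args) {
        // Mirrors the configuration built in createSafelist(): relaxed base list,
        // no protocol restriction on img src, relative URLs preserved.
        Safelist safelist = Safelist.relaxed()
                .removeProtocols("img", "src", "http", "https")
                .preserveRelativeLinks(true);

        String dirty = "<p>Chapter art: <img src=\"/images/content/abc.png\"></p>"
                + "<script>alert('x')</script>";

        // The relative img src is kept, the script element is stripped.
        String clean = Jsoup.clean(dirty, safelist);
        System.out.println(clean);
    }
}
```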
@@ -1,5 +1,8 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.stereotype.Service;
|
||||
import org.springframework.web.multipart.MultipartFile;
|
||||
@@ -7,18 +10,22 @@ import org.springframework.web.multipart.MultipartFile;
|
||||
import javax.imageio.ImageIO;
|
||||
import java.awt.*;
|
||||
import java.awt.image.BufferedImage;
|
||||
import java.io.ByteArrayInputStream;
|
||||
import java.io.ByteArrayOutputStream;
|
||||
import java.io.IOException;
|
||||
import java.io.*;
|
||||
import java.net.HttpURLConnection;
|
||||
import java.net.URL;
|
||||
import java.nio.file.Files;
|
||||
import java.nio.file.Path;
|
||||
import java.nio.file.Paths;
|
||||
import java.util.Set;
|
||||
import java.util.UUID;
|
||||
import java.util.*;
|
||||
import java.util.List;
|
||||
import java.util.regex.Matcher;
|
||||
import java.util.regex.Pattern;
|
||||
|
||||
@Service
|
||||
public class ImageService {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(ImageService.class);
|
||||
|
||||
private static final Set<String> ALLOWED_CONTENT_TYPES = Set.of(
|
||||
"image/jpeg", "image/jpg", "image/png"
|
||||
);
|
||||
@@ -28,7 +35,18 @@ public class ImageService {
|
||||
);
|
||||
|
||||
@Value("${storycove.images.upload-dir:/app/images}")
|
||||
private String uploadDir;
|
||||
private String baseUploadDir;
|
||||
|
||||
@Autowired
|
||||
private LibraryService libraryService;
|
||||
|
||||
@Autowired
|
||||
private StoryService storyService;
|
||||
|
||||
private String getUploadDir() {
|
||||
String libraryPath = libraryService.getCurrentImagePath();
|
||||
return baseUploadDir + libraryPath;
|
||||
}
|
||||
|
||||
@Value("${storycove.images.cover.max-width:800}")
|
||||
private int coverMaxWidth;
|
||||
@@ -44,7 +62,8 @@ public class ImageService {
|
||||
|
||||
public enum ImageType {
|
||||
COVER("covers"),
|
||||
AVATAR("avatars");
|
||||
AVATAR("avatars"),
|
||||
CONTENT("content");
|
||||
|
||||
private final String directory;
|
||||
|
||||
@@ -61,7 +80,7 @@ public class ImageService {
|
||||
validateFile(file);
|
||||
|
||||
// Create directories if they don't exist
|
||||
Path typeDir = Paths.get(uploadDir, imageType.getDirectory());
|
||||
Path typeDir = Paths.get(getUploadDir(), imageType.getDirectory());
|
||||
Files.createDirectories(typeDir);
|
||||
|
||||
// Generate unique filename
|
||||
@@ -88,7 +107,7 @@ public class ImageService {
|
||||
}
|
||||
|
||||
try {
|
||||
Path fullPath = Paths.get(uploadDir, imagePath);
|
||||
Path fullPath = Paths.get(getUploadDir(), imagePath);
|
||||
return Files.deleteIfExists(fullPath);
|
||||
} catch (IOException e) {
|
||||
return false;
|
||||
@@ -96,7 +115,7 @@ public class ImageService {
|
||||
}
|
||||
|
||||
public Path getImagePath(String imagePath) {
|
||||
return Paths.get(uploadDir, imagePath);
|
||||
return Paths.get(getUploadDir(), imagePath);
|
||||
}
|
||||
|
||||
public boolean imageExists(String imagePath) {
|
||||
@@ -107,6 +126,19 @@ public class ImageService {
|
||||
return Files.exists(getImagePath(imagePath));
|
||||
}
|
||||
|
||||
public boolean imageExistsInLibrary(String imagePath, String libraryId) {
|
||||
if (imagePath == null || imagePath.trim().isEmpty() || libraryId == null) {
|
||||
return false;
|
||||
}
|
||||
|
||||
return Files.exists(getImagePathInLibrary(imagePath, libraryId));
|
||||
}
|
||||
|
||||
public Path getImagePathInLibrary(String imagePath, String libraryId) {
|
||||
String libraryPath = libraryService.getImagePathForLibrary(libraryId);
|
||||
return Paths.get(baseUploadDir + libraryPath, imagePath);
|
||||
}
|
||||
|
||||
private void validateFile(MultipartFile file) throws IOException {
|
||||
if (file == null || file.isEmpty()) {
|
||||
throw new IllegalArgumentException("File is empty");
|
||||
@@ -160,6 +192,9 @@ public class ImageService {
|
||||
maxWidth = avatarMaxSize;
|
||||
maxHeight = avatarMaxSize;
|
||||
break;
|
||||
case CONTENT:
|
||||
// Content images are not resized
|
||||
return new Dimension(originalWidth, originalHeight);
|
||||
default:
|
||||
return new Dimension(originalWidth, originalHeight);
|
||||
}
|
||||
@@ -206,4 +241,504 @@ public class ImageService {
|
||||
String extension = getFileExtension(filename);
|
||||
return ALLOWED_EXTENSIONS.contains(extension);
|
||||
}
|
||||
|
||||
// Content image processing methods
|
||||
|
||||
/**
|
||||
* Process HTML content and download all referenced images, replacing URLs with local paths
|
||||
*/
|
||||
public ContentImageProcessingResult processContentImages(String htmlContent, UUID storyId) {
|
||||
logger.info("Processing content images for story: {}, content length: {}", storyId,
|
||||
htmlContent != null ? htmlContent.length() : 0);
|
||||
|
||||
List<String> warnings = new ArrayList<>();
|
||||
List<String> downloadedImages = new ArrayList<>();
|
||||
|
||||
if (htmlContent == null || htmlContent.trim().isEmpty()) {
|
||||
logger.info("No content to process for story: {}", storyId);
|
||||
return new ContentImageProcessingResult(htmlContent, warnings, downloadedImages);
|
||||
}
|
||||
|
||||
// Find all img tags with src attributes
|
||||
Pattern imgPattern = Pattern.compile("<img[^>]+src=[\"']([^\"']+)[\"'][^>]*>", Pattern.CASE_INSENSITIVE);
|
||||
Matcher matcher = imgPattern.matcher(htmlContent);
|
||||
|
||||
int imageCount = 0;
|
||||
int externalImageCount = 0;
|
||||
|
||||
StringBuffer processedContent = new StringBuffer();
|
||||
|
||||
while (matcher.find()) {
|
||||
String fullImgTag = matcher.group(0);
|
||||
String imageUrl = matcher.group(1);
|
||||
imageCount++;
|
||||
|
||||
logger.info("Found image #{}: {} in tag: {}", imageCount, imageUrl, fullImgTag);
|
||||
|
||||
try {
|
||||
// Skip if it's already a local path or data URL
|
||||
if (imageUrl.startsWith("/") || imageUrl.startsWith("data:")) {
|
||||
logger.info("Skipping local/data URL: {}", imageUrl);
|
||||
matcher.appendReplacement(processedContent, Matcher.quoteReplacement(fullImgTag));
|
||||
continue;
|
||||
}
|
||||
|
||||
externalImageCount++;
|
||||
logger.info("Processing external image #{}: {}", externalImageCount, imageUrl);
|
||||
|
||||
// Download and store the image
|
||||
String localPath = downloadImageFromUrl(imageUrl, storyId);
|
||||
downloadedImages.add(localPath);
|
||||
|
||||
// Generate local URL
|
||||
String localUrl = getLocalImageUrl(storyId, localPath);
|
||||
logger.info("Downloaded image: {} -> {}", imageUrl, localUrl);
|
||||
|
||||
// Replace the src attribute with the local path - handle both single and double quotes
|
||||
String newImgTag = fullImgTag
|
||||
.replaceFirst("src=\"" + Pattern.quote(imageUrl) + "\"", "src=\"" + localUrl + "\"")
|
||||
.replaceFirst("src='" + Pattern.quote(imageUrl) + "'", "src=\"" + localUrl + "\"");
|
||||
|
||||
// If replacement didn't work, try a more generic approach
|
||||
if (newImgTag.equals(fullImgTag)) {
|
||||
logger.warn("Standard replacement failed for image URL: {}, trying generic replacement", imageUrl);
|
||||
newImgTag = fullImgTag.replaceAll("src\\s*=\\s*[\"']?" + Pattern.quote(imageUrl) + "[\"']?", "src=\"" + localUrl + "\"");
|
||||
}
|
||||
|
||||
logger.info("Replaced img tag: {} -> {}", fullImgTag, newImgTag);
|
||||
matcher.appendReplacement(processedContent, Matcher.quoteReplacement(newImgTag));
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to download image: {} - {}", imageUrl, e.getMessage(), e);
|
||||
warnings.add("Failed to download image: " + imageUrl + " - " + e.getMessage());
|
||||
// Keep original URL in case of failure
|
||||
matcher.appendReplacement(processedContent, Matcher.quoteReplacement(fullImgTag));
|
||||
}
|
||||
}
|
||||
|
||||
matcher.appendTail(processedContent);
|
||||
|
||||
logger.info("Finished processing images for story: {}. Found {} total images, {} external. Downloaded {} images, {} warnings",
|
||||
storyId, imageCount, externalImageCount, downloadedImages.size(), warnings.size());
|
||||
|
||||
return new ContentImageProcessingResult(processedContent.toString(), warnings, downloadedImages);
|
||||
}
|
||||
|
||||
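// Example of the rewrite performed above, assuming the active library id is "main"
// (the story id and generated file name are placeholders):
//
//   before: <img src="https://example.com/art/cover.png">
//   after:  <img src="/api/files/images/main/content/3f2a9c1e-.../b7c1d2e3-....png">
//
// Tags whose download fails keep their original src and only produce a warning.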
/**
|
||||
* Download an image from a URL and store it locally
|
||||
*/
|
||||
private String downloadImageFromUrl(String imageUrl, UUID storyId) throws IOException {
|
||||
URL url = new URL(imageUrl);
|
||||
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
|
||||
|
||||
// Set a reasonable user agent to avoid blocks
|
||||
connection.setRequestProperty("User-Agent", "Mozilla/5.0 (StoryCove Image Processor)");
|
||||
connection.setConnectTimeout(30000); // 30 seconds
|
||||
connection.setReadTimeout(30000);
|
||||
|
||||
try (InputStream inputStream = connection.getInputStream()) {
|
||||
// Get content type to determine file extension
|
||||
String contentType = connection.getContentType();
|
||||
String extension = getExtensionFromContentType(contentType);
|
||||
|
||||
if (extension == null) {
|
||||
// Try to extract from URL
|
||||
extension = getExtensionFromUrl(imageUrl);
|
||||
}
|
||||
|
||||
if (extension == null || !ALLOWED_EXTENSIONS.contains(extension.toLowerCase())) {
|
||||
throw new IllegalArgumentException("Unsupported image format: " + contentType);
|
||||
}
|
||||
|
||||
// Create directories for content images
|
||||
Path contentDir = Paths.get(getUploadDir(), ImageType.CONTENT.getDirectory(), storyId.toString());
|
||||
Files.createDirectories(contentDir);
|
||||
|
||||
// Generate unique filename
|
||||
String filename = UUID.randomUUID().toString() + "." + extension.toLowerCase();
|
||||
Path filePath = contentDir.resolve(filename);
|
||||
|
||||
// Read and validate the image
|
||||
byte[] imageData = inputStream.readAllBytes();
|
||||
ByteArrayInputStream bais = new ByteArrayInputStream(imageData);
|
||||
BufferedImage image = ImageIO.read(bais);
|
||||
|
||||
if (image == null) {
|
||||
throw new IOException("Invalid image format");
|
||||
}
|
||||
|
||||
// Save the image
|
||||
Files.write(filePath, imageData);
|
||||
|
||||
// Return relative path
|
||||
return ImageType.CONTENT.getDirectory() + "/" + storyId.toString() + "/" + filename;
|
||||
|
||||
} finally {
|
||||
connection.disconnect();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Generate local image URL for serving
|
||||
*/
|
||||
private String getLocalImageUrl(UUID storyId, String imagePath) {
|
||||
String currentLibraryId = libraryService.getCurrentLibraryId();
|
||||
if (currentLibraryId == null || currentLibraryId.trim().isEmpty()) {
|
||||
logger.warn("Current library ID is null or empty when generating local image URL for story: {}", storyId);
|
||||
return "/api/files/images/default/" + imagePath;
|
||||
}
|
||||
String localUrl = "/api/files/images/" + currentLibraryId + "/" + imagePath;
|
||||
logger.info("Generated local image URL: {} for story: {}", localUrl, storyId);
|
||||
return localUrl;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get file extension from content type
|
||||
*/
|
||||
private String getExtensionFromContentType(String contentType) {
|
||||
if (contentType == null) return null;
|
||||
|
||||
switch (contentType.toLowerCase()) {
|
||||
case "image/jpeg":
|
||||
case "image/jpg":
|
||||
return "jpg";
|
||||
case "image/png":
|
||||
return "png";
|
||||
default:
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Extract file extension from URL
|
||||
*/
|
||||
private String getExtensionFromUrl(String url) {
|
||||
try {
|
||||
String path = new URL(url).getPath();
|
||||
int lastDot = path.lastIndexOf('.');
|
||||
if (lastDot > 0 && lastDot < path.length() - 1) {
|
||||
return path.substring(lastDot + 1).toLowerCase();
|
||||
}
|
||||
} catch (Exception ignored) {
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Cleanup orphaned content images that are no longer referenced in any story
|
||||
*/
|
||||
public ContentImageCleanupResult cleanupOrphanedContentImages(boolean dryRun) {
|
||||
logger.info("Starting orphaned content image cleanup (dryRun: {})", dryRun);
|
||||
|
||||
final Set<String> referencedImages;
|
||||
List<String> orphanedImages = new ArrayList<>();
|
||||
List<String> errors = new ArrayList<>();
|
||||
long totalSizeBytes = 0;
|
||||
int foldersToDelete = 0;
|
||||
|
||||
// Step 1: Collect all image references from all story content
|
||||
logger.info("Scanning all story content for image references...");
|
||||
referencedImages = collectAllImageReferences();
|
||||
logger.info("Found {} unique image references in story content", referencedImages.size());
|
||||
|
||||
try {
|
||||
// Step 2: Scan the content images directory
|
||||
Path contentImagesDir = Paths.get(getUploadDir(), ImageType.CONTENT.getDirectory());
|
||||
|
||||
if (!Files.exists(contentImagesDir)) {
|
||||
logger.info("Content images directory does not exist: {}", contentImagesDir);
|
||||
return new ContentImageCleanupResult(orphanedImages, 0, 0, referencedImages.size(), errors, dryRun);
|
||||
}
|
||||
|
||||
logger.info("Scanning content images directory: {}", contentImagesDir);
|
||||
|
||||
// Walk through all story directories
|
||||
Files.walk(contentImagesDir, 2)
|
||||
.filter(Files::isDirectory)
|
||||
.filter(path -> !path.equals(contentImagesDir)) // Skip the root content directory
|
||||
.forEach(storyDir -> {
|
||||
try {
|
||||
String storyId = storyDir.getFileName().toString();
|
||||
logger.debug("Checking story directory: {}", storyId);
|
||||
|
||||
// Check if this story still exists
|
||||
boolean storyExists = storyService.findByIdOptional(UUID.fromString(storyId)).isPresent();
|
||||
|
||||
if (!storyExists) {
|
||||
logger.info("Found orphaned story directory (story deleted): {}", storyId);
|
||||
// Mark entire directory for deletion
|
||||
try {
|
||||
Files.walk(storyDir)
|
||||
.filter(Files::isRegularFile)
|
||||
.forEach(file -> {
|
||||
try {
|
||||
Files.size(file); // fails fast if the file cannot be read
orphanedImages.add(file.toString());
// Total size is tallied later in the main scope from the collected paths
|
||||
} catch (IOException e) {
|
||||
errors.add("Failed to get size for " + file + ": " + e.getMessage());
|
||||
}
|
||||
});
|
||||
} catch (IOException e) {
|
||||
errors.add("Failed to scan orphaned story directory " + storyDir + ": " + e.getMessage());
|
||||
}
|
||||
return;
|
||||
}
|
||||
|
||||
// Check individual files in the story directory
|
||||
try {
|
||||
Files.walk(storyDir)
|
||||
.filter(Files::isRegularFile)
|
||||
.forEach(imageFile -> {
|
||||
try {
|
||||
String imagePath = getRelativeImagePath(imageFile);
|
||||
|
||||
if (!referencedImages.contains(imagePath)) {
|
||||
logger.debug("Found orphaned image: {}", imagePath);
|
||||
orphanedImages.add(imageFile.toString());
|
||||
}
|
||||
} catch (Exception e) {
|
||||
errors.add("Error checking image file " + imageFile + ": " + e.getMessage());
|
||||
}
|
||||
});
|
||||
} catch (IOException e) {
|
||||
errors.add("Failed to scan story directory " + storyDir + ": " + e.getMessage());
|
||||
}
|
||||
|
||||
} catch (Exception e) {
|
||||
errors.add("Error processing story directory " + storyDir + ": " + e.getMessage());
|
||||
}
|
||||
});
|
||||
|
||||
// Calculate total size and count empty directories
|
||||
for (String orphanedImage : orphanedImages) {
|
||||
try {
|
||||
Path imagePath = Paths.get(orphanedImage);
|
||||
if (Files.exists(imagePath)) {
|
||||
totalSizeBytes += Files.size(imagePath);
|
||||
}
|
||||
} catch (IOException e) {
|
||||
errors.add("Failed to get size for " + orphanedImage + ": " + e.getMessage());
|
||||
}
|
||||
}
|
||||
|
||||
// Count empty directories that would be removed
|
||||
try {
|
||||
foldersToDelete = (int) Files.walk(contentImagesDir)
|
||||
.filter(Files::isDirectory)
|
||||
.filter(path -> !path.equals(contentImagesDir))
|
||||
.filter(this::isDirectoryEmptyOrWillBeEmpty)
|
||||
.count();
|
||||
} catch (IOException e) {
|
||||
errors.add("Failed to count empty directories: " + e.getMessage());
|
||||
}
|
||||
|
||||
// Step 3: Delete orphaned files if not dry run
|
||||
if (!dryRun && !orphanedImages.isEmpty()) {
|
||||
logger.info("Deleting {} orphaned images...", orphanedImages.size());
|
||||
|
||||
Set<Path> directoriesToCheck = new HashSet<>();
|
||||
|
||||
for (String orphanedImage : orphanedImages) {
|
||||
try {
|
||||
Path imagePath = Paths.get(orphanedImage);
|
||||
if (Files.exists(imagePath)) {
|
||||
directoriesToCheck.add(imagePath.getParent());
|
||||
Files.delete(imagePath);
|
||||
logger.debug("Deleted orphaned image: {}", imagePath);
|
||||
}
|
||||
} catch (IOException e) {
|
||||
errors.add("Failed to delete " + orphanedImage + ": " + e.getMessage());
|
||||
}
|
||||
}
|
||||
|
||||
// Clean up empty directories
|
||||
for (Path dir : directoriesToCheck) {
|
||||
try {
|
||||
if (Files.exists(dir) && isDirEmpty(dir)) {
|
||||
Files.delete(dir);
|
||||
logger.info("Deleted empty story directory: {}", dir);
|
||||
}
|
||||
} catch (IOException e) {
|
||||
errors.add("Failed to delete empty directory " + dir + ": " + e.getMessage());
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
logger.info("Orphaned content image cleanup completed. Found {} orphaned files ({} bytes)",
|
||||
orphanedImages.size(), totalSizeBytes);
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Error during orphaned content image cleanup", e);
|
||||
errors.add("General cleanup error: " + e.getMessage());
|
||||
}
|
||||
|
||||
return new ContentImageCleanupResult(orphanedImages, totalSizeBytes, foldersToDelete, referencedImages.size(), errors, dryRun);
|
||||
}
|
||||
|
||||
/**
|
||||
* Collect all image references from all story content
|
||||
*/
|
||||
private Set<String> collectAllImageReferences() {
|
||||
Set<String> referencedImages = new HashSet<>();
|
||||
|
||||
try {
|
||||
// Get all stories
|
||||
List<com.storycove.entity.Story> allStories = storyService.findAllWithAssociations();
|
||||
|
||||
// Pattern to match local image URLs in content
|
||||
Pattern imagePattern = Pattern.compile("src=[\"']([^\"']*(?:content/[^\"']*\\.(jpg|jpeg|png)))[\"']", Pattern.CASE_INSENSITIVE);
|
||||
|
||||
for (com.storycove.entity.Story story : allStories) {
|
||||
if (story.getContentHtml() != null) {
|
||||
Matcher matcher = imagePattern.matcher(story.getContentHtml());
|
||||
|
||||
while (matcher.find()) {
|
||||
String imageSrc = matcher.group(1);
|
||||
|
||||
// Convert to relative path format that matches our file system
|
||||
String relativePath = convertSrcToRelativePath(imageSrc);
|
||||
if (relativePath != null) {
|
||||
referencedImages.add(relativePath);
|
||||
logger.debug("Found image reference in story {}: {}", story.getId(), relativePath);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Error collecting image references from stories", e);
|
||||
}
|
||||
|
||||
return referencedImages;
|
||||
}
|
||||
|
||||
/**
|
||||
* Convert an image src attribute to relative file path
|
||||
*/
|
||||
private String convertSrcToRelativePath(String src) {
|
||||
try {
|
||||
// Handle both /api/files/images/libraryId/content/... and relative content/... paths
|
||||
if (src.contains("/content/")) {
|
||||
int contentIndex = src.indexOf("/content/");
|
||||
return src.substring(contentIndex + 1); // Remove leading slash, keep "content/..."
|
||||
}
|
||||
} catch (Exception e) {
|
||||
logger.debug("Failed to convert src to relative path: {}", src);
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get relative image path from absolute file path
|
||||
*/
|
||||
private String getRelativeImagePath(Path imageFile) {
|
||||
try {
|
||||
Path uploadDir = Paths.get(getUploadDir());
|
||||
Path relativePath = uploadDir.relativize(imageFile);
|
||||
return relativePath.toString().replace('\\', '/'); // Normalize path separators
|
||||
} catch (Exception e) {
|
||||
logger.debug("Failed to get relative path for: {}", imageFile);
|
||||
return imageFile.toString();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if directory is empty or will be empty after cleanup
|
||||
*/
|
||||
private boolean isDirectoryEmptyOrWillBeEmpty(Path dir) {
|
||||
try {
|
||||
return Files.walk(dir)
|
||||
.filter(Files::isRegularFile)
|
||||
.count() == 0;
|
||||
} catch (IOException e) {
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if directory is empty
|
||||
*/
|
||||
private boolean isDirEmpty(Path dir) {
|
||||
try {
|
||||
return Files.list(dir).count() == 0;
|
||||
} catch (IOException e) {
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Clean up content images for a story
|
||||
*/
|
||||
public void deleteContentImages(UUID storyId) {
|
||||
try {
|
||||
Path contentDir = Paths.get(getUploadDir(), ImageType.CONTENT.getDirectory(), storyId.toString());
|
||||
if (Files.exists(contentDir)) {
|
||||
Files.walk(contentDir)
|
||||
.sorted(Comparator.reverseOrder())
|
||||
.map(Path::toFile)
|
||||
.forEach(java.io.File::delete);
|
||||
}
|
||||
} catch (IOException e) {
|
||||
// Log but don't throw - this is cleanup
|
||||
System.err.println("Failed to clean up content images for story " + storyId + ": " + e.getMessage());
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Result class for content image processing
|
||||
*/
|
||||
public static class ContentImageProcessingResult {
|
||||
private final String processedContent;
|
||||
private final List<String> warnings;
|
||||
private final List<String> downloadedImages;
|
||||
|
||||
public ContentImageProcessingResult(String processedContent, List<String> warnings, List<String> downloadedImages) {
|
||||
this.processedContent = processedContent;
|
||||
this.warnings = warnings;
|
||||
this.downloadedImages = downloadedImages;
|
||||
}
|
||||
|
||||
public String getProcessedContent() { return processedContent; }
|
||||
public List<String> getWarnings() { return warnings; }
|
||||
public List<String> getDownloadedImages() { return downloadedImages; }
|
||||
public boolean hasWarnings() { return !warnings.isEmpty(); }
|
||||
}
|
||||
|
||||
/**
|
||||
* Result class for orphaned image cleanup
|
||||
*/
|
||||
public static class ContentImageCleanupResult {
|
||||
private final List<String> orphanedImages;
|
||||
private final long totalSizeBytes;
|
||||
private final int foldersToDelete;
|
||||
private final int totalReferencedImages;
|
||||
private final List<String> errors;
|
||||
private final boolean dryRun;
|
||||
|
||||
public ContentImageCleanupResult(List<String> orphanedImages, long totalSizeBytes, int foldersToDelete,
|
||||
int totalReferencedImages, List<String> errors, boolean dryRun) {
|
||||
this.orphanedImages = orphanedImages;
|
||||
this.totalSizeBytes = totalSizeBytes;
|
||||
this.foldersToDelete = foldersToDelete;
|
||||
this.totalReferencedImages = totalReferencedImages;
|
||||
this.errors = errors;
|
||||
this.dryRun = dryRun;
|
||||
}
|
||||
|
||||
public List<String> getOrphanedImages() { return orphanedImages; }
|
||||
public long getTotalSizeBytes() { return totalSizeBytes; }
|
||||
public int getFoldersToDelete() { return foldersToDelete; }
|
||||
public int getTotalReferencedImages() { return totalReferencedImages; }
|
||||
public List<String> getErrors() { return errors; }
|
||||
public boolean isDryRun() { return dryRun; }
|
||||
public boolean hasErrors() { return !errors.isEmpty(); }
|
||||
|
||||
public String getFormattedSize() {
|
||||
if (totalSizeBytes < 1024) return totalSizeBytes + " B";
|
||||
if (totalSizeBytes < 1024 * 1024) return String.format("%.1f KB", totalSizeBytes / 1024.0);
|
||||
if (totalSizeBytes < 1024 * 1024 * 1024) return String.format("%.1f MB", totalSizeBytes / (1024.0 * 1024.0));
|
||||
return String.format("%.1f GB", totalSizeBytes / (1024.0 * 1024.0 * 1024.0));
|
||||
}
|
||||
}
|
||||
}
|
||||
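The diff above adds the content-image pipeline but does not show its call sites. The sketch below is a minimal, hypothetical example of wiring `processContentImages` into a story save and `deleteContentImages` into a story delete; `storyRepository`, `logger` and the `Story` accessors are assumptions, only the `ImageService` calls come from the code above.

```java
// Hypothetical integration sketch - only the ImageService calls are taken from the diff above.
public Story saveStoryContent(Story story, String rawHtml) {
    ImageService.ContentImageProcessingResult result =
            imageService.processContentImages(rawHtml, story.getId());

    story.setContentHtml(result.getProcessedContent());
    if (result.hasWarnings()) {
        // Surface partial downloads instead of failing the whole save
        logger.warn("Story {}: {} content images could not be downloaded",
                story.getId(), result.getWarnings().size());
    }
    return storyRepository.save(story);
}

public void deleteStory(UUID storyId) {
    storyRepository.deleteById(storyId);
    imageService.deleteContentImages(storyId); // best-effort cleanup, logs but never throws
}
```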
@@ -0,0 +1,73 @@
package com.storycove.service;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Component;

import javax.sql.DataSource;
import java.sql.Connection;
import java.sql.SQLException;

/**
 * Base service class that provides library-aware database access.
 *
 * This approach is safer than routing at the datasource level because:
 * 1. It doesn't interfere with Spring's initialization process
 * 2. It allows fine-grained control over which operations are library-aware
 * 3. It provides clear separation between authentication (uses default DB) and library operations
 */
@Component
public class LibraryAwareService {

    @Autowired
    private LibraryService libraryService;

    @Autowired
    @Qualifier("dataSource")
    private DataSource defaultDataSource;

    /**
     * Get a database connection for the current active library.
     * Falls back to default datasource if no library is active.
     */
    public Connection getCurrentLibraryConnection() throws SQLException {
        try {
            // Try to get library-specific connection
            DataSource libraryDataSource = libraryService.getCurrentDataSource();
            return libraryDataSource.getConnection();
        } catch (IllegalStateException e) {
            // No active library - use default datasource
            return defaultDataSource.getConnection();
        }
    }

    /**
     * Get a database connection for the default/fallback database.
     * Use this for authentication and system-level operations.
     */
    public Connection getDefaultConnection() throws SQLException {
        return defaultDataSource.getConnection();
    }

    /**
     * Check if a library is currently active
     */
    public boolean hasActiveLibrary() {
        try {
            return libraryService.getCurrentLibraryId() != null;
        } catch (Exception e) {
            return false;
        }
    }

    /**
     * Get the current active library ID, or null if none
     */
    public String getCurrentLibraryId() {
        try {
            return libraryService.getCurrentLibraryId();
        } catch (Exception e) {
            return null;
        }
    }
}
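A minimal usage sketch for the new `LibraryAwareService`; the query and table name are illustrative only, and the `java.sql.*` imports (`Connection`, `PreparedStatement`, `ResultSet`, `SQLException`) are assumed.

```java
// Runs a read-only query against whichever library is currently active and
// transparently falls back to the default database when none is (behaviour of the service above).
public int countStories(LibraryAwareService libraryAware) throws SQLException {
    try (Connection conn = libraryAware.getCurrentLibraryConnection();
         PreparedStatement ps = conn.prepareStatement("SELECT COUNT(*) FROM stories");
         ResultSet rs = ps.executeQuery()) {
        return rs.next() ? rs.getInt(1) : 0;
    }
}
```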
830 backend/src/main/java/com/storycove/service/LibraryService.java (Normal file)
@@ -0,0 +1,830 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.entity.Library;
|
||||
import com.storycove.dto.LibraryDto;
|
||||
import com.fasterxml.jackson.core.type.TypeReference;
|
||||
import com.fasterxml.jackson.databind.ObjectMapper;
|
||||
import com.zaxxer.hikari.HikariConfig;
|
||||
import com.zaxxer.hikari.HikariDataSource;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.context.ApplicationContext;
|
||||
import org.springframework.context.ApplicationContextAware;
|
||||
import org.springframework.security.crypto.bcrypt.BCryptPasswordEncoder;
|
||||
import org.springframework.stereotype.Service;
|
||||
|
||||
import jakarta.annotation.PostConstruct;
|
||||
import jakarta.annotation.PreDestroy;
|
||||
import javax.sql.DataSource;
|
||||
import java.io.File;
|
||||
import java.io.IOException;
|
||||
import java.nio.charset.StandardCharsets;
|
||||
import java.nio.file.Files;
|
||||
import java.nio.file.Path;
|
||||
import java.nio.file.Paths;
|
||||
import java.sql.SQLException;
|
||||
import java.util.*;
|
||||
import java.util.concurrent.ConcurrentHashMap;
|
||||
|
||||
@Service
|
||||
public class LibraryService implements ApplicationContextAware {
|
||||
private static final Logger logger = LoggerFactory.getLogger(LibraryService.class);
|
||||
|
||||
@Value("${spring.datasource.url}")
|
||||
private String baseDbUrl;
|
||||
|
||||
@Value("${spring.datasource.username}")
|
||||
private String dbUsername;
|
||||
|
||||
@Value("${spring.datasource.password}")
|
||||
private String dbPassword;
|
||||
|
||||
|
||||
private final ObjectMapper objectMapper = new ObjectMapper();
|
||||
private final BCryptPasswordEncoder passwordEncoder = new BCryptPasswordEncoder();
|
||||
private final Map<String, Library> libraries = new ConcurrentHashMap<>();
|
||||
|
||||
// Spring ApplicationContext for accessing other services without circular dependencies
|
||||
private ApplicationContext applicationContext;
|
||||
|
||||
// Current active resources
|
||||
private volatile String currentLibraryId;
|
||||
|
||||
// Security: Track if user has explicitly authenticated in this session
|
||||
private volatile boolean explicitlyAuthenticated = false;
|
||||
|
||||
private static final String LIBRARIES_CONFIG_PATH = "/app/config/libraries.json";
|
||||
private static final Path libraryConfigDir = Paths.get("/app/config");
|
||||
|
||||
@Override
|
||||
public void setApplicationContext(ApplicationContext applicationContext) {
|
||||
this.applicationContext = applicationContext;
|
||||
}
|
||||
|
||||
@PostConstruct
|
||||
public void initialize() {
|
||||
loadLibrariesFromFile();
|
||||
|
||||
// If no libraries exist, create a default one
|
||||
if (libraries.isEmpty()) {
|
||||
createDefaultLibrary();
|
||||
}
|
||||
|
||||
// Security: Do NOT automatically switch to any library on startup
|
||||
// Users must authenticate before accessing any library
|
||||
explicitlyAuthenticated = false;
|
||||
currentLibraryId = null;
|
||||
|
||||
if (!libraries.isEmpty()) {
|
||||
logger.info("Loaded {} libraries. Authentication required to access any library.", libraries.size());
|
||||
} else {
|
||||
logger.info("No libraries found. A default library will be created on first authentication.");
|
||||
}
|
||||
|
||||
logger.info("Security: Application startup completed. All users must re-authenticate.");
|
||||
}
|
||||
|
||||
@PreDestroy
|
||||
public void cleanup() {
|
||||
currentLibraryId = null;
|
||||
explicitlyAuthenticated = false;
|
||||
}
|
||||
|
||||
/**
|
||||
* Clear authentication state (for logout)
|
||||
*/
|
||||
public void clearAuthentication() {
|
||||
explicitlyAuthenticated = false;
|
||||
currentLibraryId = null;
|
||||
logger.info("Authentication cleared - user must re-authenticate to access libraries");
|
||||
}
|
||||
|
||||
|
||||
public String authenticateAndGetLibrary(String password) {
|
||||
for (Library library : libraries.values()) {
|
||||
if (passwordEncoder.matches(password, library.getPasswordHash())) {
|
||||
// Mark as explicitly authenticated for this session
|
||||
explicitlyAuthenticated = true;
|
||||
logger.info("User explicitly authenticated for library: {}", library.getId());
|
||||
return library.getId();
|
||||
}
|
||||
}
|
||||
return null; // Authentication failed
|
||||
}
|
||||
|
||||
/**
|
||||
* Switch to library after authentication with forced reindexing
|
||||
* This ensures OpenSearch is always up-to-date after login
|
||||
*/
|
||||
public synchronized void switchToLibraryAfterAuthentication(String libraryId) throws Exception {
|
||||
logger.info("Switching to library after authentication: {} (forcing reindex)", libraryId);
|
||||
switchToLibrary(libraryId, true);
|
||||
}
|
||||
|
||||
public synchronized void switchToLibrary(String libraryId) throws Exception {
|
||||
switchToLibrary(libraryId, false);
|
||||
}
|
||||
|
||||
public synchronized void switchToLibrary(String libraryId, boolean forceReindex) throws Exception {
|
||||
// Security: Only allow library switching after explicit authentication
|
||||
if (!explicitlyAuthenticated) {
|
||||
throw new IllegalStateException("Library switching requires explicit authentication. Please log in first.");
|
||||
}
|
||||
|
||||
if (libraryId.equals(currentLibraryId) && !forceReindex) {
|
||||
return; // Already active and no forced reindex requested
|
||||
}
|
||||
|
||||
Library library = libraries.get(libraryId);
|
||||
if (library == null) {
|
||||
throw new IllegalArgumentException("Library not found: " + libraryId);
|
||||
}
|
||||
|
||||
String previousLibraryId = currentLibraryId;
|
||||
|
||||
if (libraryId.equals(currentLibraryId) && forceReindex) {
|
||||
logger.info("Forcing reindex for current library: {} ({})", library.getName(), libraryId);
|
||||
} else {
|
||||
logger.info("Switching to library: {} ({})", library.getName(), libraryId);
|
||||
}
|
||||
|
||||
// Close current resources
|
||||
closeCurrentResources();
|
||||
|
||||
// Set new active library (datasource routing handled by SmartRoutingDataSource)
|
||||
currentLibraryId = libraryId;
|
||||
// OpenSearch indexes are global - no per-library initialization needed
|
||||
logger.info("Library switched to OpenSearch mode for library: {}", libraryId);
|
||||
|
||||
logger.info("Successfully switched to library: {}", library.getName());
|
||||
|
||||
// Perform complete reindex AFTER library switch is fully complete
|
||||
// This ensures database routing is properly established
|
||||
if (forceReindex || !libraryId.equals(previousLibraryId)) {
|
||||
logger.info("Starting post-switch OpenSearch reindex for library: {}", libraryId);
|
||||
|
||||
// Run reindex asynchronously to avoid blocking authentication response
|
||||
// and allow time for database routing to fully stabilize
|
||||
String finalLibraryId = libraryId;
|
||||
new Thread(() -> {
|
||||
try {
|
||||
// Give routing time to stabilize
|
||||
Thread.sleep(500);
|
||||
logger.info("Starting async OpenSearch reindex for library: {}", finalLibraryId);
|
||||
|
||||
SearchServiceAdapter searchService = applicationContext.getBean(SearchServiceAdapter.class);
|
||||
// Get all stories and authors for reindexing
|
||||
StoryService storyService = applicationContext.getBean(StoryService.class);
|
||||
AuthorService authorService = applicationContext.getBean(AuthorService.class);
|
||||
|
||||
var allStories = storyService.findAllWithAssociations();
|
||||
var allAuthors = authorService.findAllWithStories();
|
||||
|
||||
searchService.bulkIndexStories(allStories);
|
||||
searchService.bulkIndexAuthors(allAuthors);
|
||||
|
||||
logger.info("Completed async OpenSearch reindexing for library: {} ({} stories, {} authors)",
|
||||
finalLibraryId, allStories.size(), allAuthors.size());
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to async reindex OpenSearch for library {}: {}", finalLibraryId, e.getMessage());
|
||||
}
|
||||
}, "OpenSearchReindex-" + libraryId).start();
|
||||
}
|
||||
}
|
||||
|
||||
public DataSource getCurrentDataSource() {
|
||||
if (currentLibraryId == null) {
|
||||
throw new IllegalStateException("No active library - please authenticate first");
|
||||
}
|
||||
// Return the Spring-managed primary datasource which handles routing automatically
|
||||
try {
|
||||
return applicationContext.getBean("dataSource", DataSource.class);
|
||||
} catch (Exception e) {
|
||||
throw new IllegalStateException("Failed to get routing datasource", e);
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
public String getCurrentLibraryId() {
|
||||
return currentLibraryId;
|
||||
}
|
||||
|
||||
public Library getCurrentLibrary() {
|
||||
if (currentLibraryId == null) {
|
||||
return null;
|
||||
}
|
||||
return libraries.get(currentLibraryId);
|
||||
}
|
||||
|
||||
public List<LibraryDto> getAllLibraries() {
|
||||
List<LibraryDto> result = new ArrayList<>();
|
||||
for (Library library : libraries.values()) {
|
||||
boolean isActive = library.getId().equals(currentLibraryId);
|
||||
result.add(new LibraryDto(
|
||||
library.getId(),
|
||||
library.getName(),
|
||||
library.getDescription(),
|
||||
isActive,
|
||||
library.isInitialized()
|
||||
));
|
||||
}
|
||||
return result;
|
||||
}
|
||||
|
||||
public LibraryDto getLibraryById(String libraryId) {
|
||||
Library library = libraries.get(libraryId);
|
||||
if (library != null) {
|
||||
boolean isActive = library.getId().equals(currentLibraryId);
|
||||
return new LibraryDto(
|
||||
library.getId(),
|
||||
library.getName(),
|
||||
library.getDescription(),
|
||||
isActive,
|
||||
library.isInitialized()
|
||||
);
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
public String getCurrentImagePath() {
|
||||
Library current = getCurrentLibrary();
|
||||
return current != null ? current.getImagePath() : "/images/default";
|
||||
}
|
||||
|
||||
public String getImagePathForLibrary(String libraryId) {
|
||||
if (libraryId == null) {
|
||||
return "/images/default";
|
||||
}
|
||||
|
||||
Library library = libraries.get(libraryId);
|
||||
return library != null ? library.getImagePath() : "/images/default";
|
||||
}
|
||||
|
||||
public boolean changeLibraryPassword(String libraryId, String currentPassword, String newPassword) {
|
||||
Library library = libraries.get(libraryId);
|
||||
if (library == null) {
|
||||
return false;
|
||||
}
|
||||
|
||||
// Verify current password
|
||||
if (!passwordEncoder.matches(currentPassword, library.getPasswordHash())) {
|
||||
return false;
|
||||
}
|
||||
|
||||
// Update password
|
||||
library.setPasswordHash(passwordEncoder.encode(newPassword));
|
||||
saveLibrariesToFile();
|
||||
|
||||
logger.info("Password changed for library: {}", library.getName());
|
||||
return true;
|
||||
}
|
||||
|
||||
public Library createNewLibrary(String name, String description, String password) {
|
||||
// Generate unique ID
|
||||
String id = name.toLowerCase().replaceAll("[^a-z0-9]", "");
|
||||
int counter = 1;
|
||||
String originalId = id;
|
||||
while (libraries.containsKey(id)) {
|
||||
id = originalId + counter++;
|
||||
}
|
||||
|
||||
Library newLibrary = new Library(
|
||||
id,
|
||||
name,
|
||||
description,
|
||||
passwordEncoder.encode(password),
|
||||
"storycove_" + id
|
||||
);
|
||||
|
||||
try {
|
||||
// Test database creation by creating a connection
|
||||
DataSource testDs = createDataSource(newLibrary.getDbName());
|
||||
testDs.getConnection().close(); // This will create the database and schema if it doesn't exist
|
||||
|
||||
// Initialize library resources (image directories)
|
||||
initializeNewLibraryResources(id);
|
||||
|
||||
newLibrary.setInitialized(true);
|
||||
logger.info("Database and resources created for library: {}", newLibrary.getDbName());
|
||||
} catch (Exception e) {
|
||||
logger.warn("Database/resource creation failed for library {}: {}", id, e.getMessage());
|
||||
// Continue anyway - resources will be created when needed
|
||||
}
|
||||
|
||||
libraries.put(id, newLibrary);
|
||||
saveLibrariesToFile();
|
||||
|
||||
logger.info("Created new library: {} ({})", name, id);
|
||||
return newLibrary;
|
||||
}
|
||||
|
||||
private void loadLibrariesFromFile() {
|
||||
try {
|
||||
File configFile = new File(LIBRARIES_CONFIG_PATH);
|
||||
if (configFile.exists()) {
|
||||
String content = Files.readString(Paths.get(LIBRARIES_CONFIG_PATH));
|
||||
Map<String, Object> config = objectMapper.readValue(content, new TypeReference<Map<String, Object>>() {});
|
||||
|
||||
@SuppressWarnings("unchecked")
|
||||
Map<String, Map<String, Object>> librariesData = (Map<String, Map<String, Object>>) config.get("libraries");
|
||||
|
||||
for (Map.Entry<String, Map<String, Object>> entry : librariesData.entrySet()) {
|
||||
String id = entry.getKey();
|
||||
Map<String, Object> data = entry.getValue();
|
||||
|
||||
Library library = new Library();
|
||||
library.setId(id);
|
||||
library.setName((String) data.get("name"));
|
||||
library.setDescription((String) data.get("description"));
|
||||
library.setPasswordHash((String) data.get("passwordHash"));
|
||||
library.setDbName((String) data.get("dbName"));
|
||||
library.setInitialized((Boolean) data.getOrDefault("initialized", false));
|
||||
|
||||
libraries.put(id, library);
|
||||
logger.info("Loaded library: {} ({})", library.getName(), id);
|
||||
}
|
||||
} else {
|
||||
logger.info("No libraries configuration file found, will create default");
|
||||
}
|
||||
} catch (IOException e) {
|
||||
logger.error("Failed to load libraries configuration", e);
|
||||
}
|
||||
}
|
||||
|
||||
private void createDefaultLibrary() {
|
||||
// Check if we're migrating from the old single-library system
|
||||
String existingDbName = extractDatabaseName(baseDbUrl);
|
||||
|
||||
Library defaultLibrary = new Library(
|
||||
"main",
|
||||
"Main Library",
|
||||
"Your existing story collection (migrated)",
|
||||
passwordEncoder.encode("temp-password-change-me"), // Temporary password
|
||||
existingDbName // Use existing database name
|
||||
);
|
||||
defaultLibrary.setInitialized(true); // Mark as initialized since it has existing data
|
||||
|
||||
libraries.put("main", defaultLibrary);
|
||||
saveLibrariesToFile();
|
||||
|
||||
logger.warn("=".repeat(80));
|
||||
logger.warn("MIGRATION: Created 'Main Library' for your existing data");
|
||||
logger.warn("Temporary password: 'temp-password-change-me'");
|
||||
logger.warn("IMPORTANT: Please set a proper password in Settings > Library Settings");
|
||||
logger.warn("=".repeat(80));
|
||||
}
|
||||
|
||||
private String extractDatabaseName(String jdbcUrl) {
|
||||
// Extract database name from JDBC URL like "jdbc:postgresql://db:5432/storycove"
|
||||
int lastSlash = jdbcUrl.lastIndexOf('/');
|
||||
if (lastSlash != -1 && lastSlash < jdbcUrl.length() - 1) {
|
||||
String dbPart = jdbcUrl.substring(lastSlash + 1);
|
||||
// Remove any query parameters
|
||||
int queryStart = dbPart.indexOf('?');
|
||||
return queryStart != -1 ? dbPart.substring(0, queryStart) : dbPart;
|
||||
}
|
||||
return "storycove"; // fallback
|
||||
}
|
||||
|
||||
private void saveLibrariesToFile() {
|
||||
try {
|
||||
Map<String, Object> config = new HashMap<>();
|
||||
Map<String, Map<String, Object>> librariesData = new HashMap<>();
|
||||
|
||||
for (Library library : libraries.values()) {
|
||||
Map<String, Object> data = new HashMap<>();
|
||||
data.put("name", library.getName());
|
||||
data.put("description", library.getDescription());
|
||||
data.put("passwordHash", library.getPasswordHash());
|
||||
data.put("dbName", library.getDbName());
|
||||
data.put("initialized", library.isInitialized());
|
||||
|
||||
librariesData.put(library.getId(), data);
|
||||
}
|
||||
|
||||
config.put("libraries", librariesData);
|
||||
|
||||
// Ensure config directory exists
|
||||
new File("/app/config").mkdirs();
|
||||
|
||||
String json = objectMapper.writerWithDefaultPrettyPrinter().writeValueAsString(config);
|
||||
Files.writeString(Paths.get(LIBRARIES_CONFIG_PATH), json);
|
||||
|
||||
logger.info("Saved libraries configuration");
|
||||
} catch (IOException e) {
|
||||
logger.error("Failed to save libraries configuration", e);
|
||||
}
|
||||
}
|
||||
|
||||
private DataSource createDataSource(String dbName) {
|
||||
String url = baseDbUrl.replaceAll("/[^/]*$", "/" + dbName);
|
||||
logger.info("Creating DataSource for: {}", url);
|
||||
|
||||
// First, ensure the database exists
|
||||
ensureDatabaseExists(dbName);
|
||||
|
||||
HikariConfig config = new HikariConfig();
|
||||
config.setJdbcUrl(url);
|
||||
config.setUsername(dbUsername);
|
||||
config.setPassword(dbPassword);
|
||||
config.setDriverClassName("org.postgresql.Driver");
|
||||
config.setMaximumPoolSize(10);
|
||||
config.setConnectionTimeout(30000);
|
||||
|
||||
return new HikariDataSource(config);
|
||||
}
|
||||
|
||||
private void ensureDatabaseExists(String dbName) {
|
||||
// Connect to the 'postgres' database to create the new database
|
||||
String adminUrl = baseDbUrl.replaceAll("/[^/]*$", "/postgres");
|
||||
|
||||
HikariConfig adminConfig = new HikariConfig();
|
||||
adminConfig.setJdbcUrl(adminUrl);
|
||||
adminConfig.setUsername(dbUsername);
|
||||
adminConfig.setPassword(dbPassword);
|
||||
adminConfig.setDriverClassName("org.postgresql.Driver");
|
||||
adminConfig.setMaximumPoolSize(1);
|
||||
adminConfig.setConnectionTimeout(30000);
|
||||
|
||||
boolean databaseCreated = false;
|
||||
|
||||
try (HikariDataSource adminDataSource = new HikariDataSource(adminConfig);
|
||||
var connection = adminDataSource.getConnection();
|
||||
var statement = connection.createStatement()) {
|
||||
|
||||
// Check if database exists
|
||||
String checkQuery = "SELECT 1 FROM pg_database WHERE datname = ?";
|
||||
try (var preparedStatement = connection.prepareStatement(checkQuery)) {
|
||||
preparedStatement.setString(1, dbName);
|
||||
try (var resultSet = preparedStatement.executeQuery()) {
|
||||
if (resultSet.next()) {
|
||||
logger.info("Database {} already exists", dbName);
|
||||
return; // Database exists, nothing to do
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Create database if it doesn't exist
|
||||
// Note: Database names cannot be parameterized, but we validate the name is safe
|
||||
if (!dbName.matches("^[a-zA-Z][a-zA-Z0-9_]*$")) {
|
||||
throw new IllegalArgumentException("Invalid database name: " + dbName);
|
||||
}
|
||||
|
||||
String createQuery = "CREATE DATABASE " + dbName;
|
||||
statement.executeUpdate(createQuery);
|
||||
logger.info("Created database: {}", dbName);
|
||||
databaseCreated = true;
|
||||
|
||||
} catch (SQLException e) {
|
||||
logger.error("Failed to ensure database {} exists: {}", dbName, e.getMessage());
|
||||
throw new RuntimeException("Database creation failed", e);
|
||||
}
|
||||
|
||||
// If we just created the database, initialize its schema
|
||||
if (databaseCreated) {
|
||||
initializeNewDatabaseSchema(dbName);
|
||||
}
|
||||
}
|
||||
|
||||
private void initializeNewDatabaseSchema(String dbName) {
|
||||
logger.info("Initializing schema for new database: {}", dbName);
|
||||
|
||||
// Create a temporary DataSource for the new database to initialize schema
|
||||
String newDbUrl = baseDbUrl.replaceAll("/[^/]*$", "/" + dbName);
|
||||
|
||||
HikariConfig config = new HikariConfig();
|
||||
config.setJdbcUrl(newDbUrl);
|
||||
config.setUsername(dbUsername);
|
||||
config.setPassword(dbPassword);
|
||||
config.setDriverClassName("org.postgresql.Driver");
|
||||
config.setMaximumPoolSize(1);
|
||||
config.setConnectionTimeout(30000);
|
||||
|
||||
try (HikariDataSource tempDataSource = new HikariDataSource(config)) {
|
||||
// Use Hibernate to create the schema
|
||||
// This mimics what Spring Boot does during startup
|
||||
createSchemaUsingHibernate(tempDataSource);
|
||||
logger.info("Schema initialized for database: {}", dbName);
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to initialize schema for database {}: {}", dbName, e.getMessage());
|
||||
throw new RuntimeException("Schema initialization failed", e);
|
||||
}
|
||||
}
|
||||
|
||||
public void initializeNewLibraryResources(String libraryId) {
|
||||
Library library = libraries.get(libraryId);
|
||||
if (library == null) {
|
||||
throw new IllegalArgumentException("Library not found: " + libraryId);
|
||||
}
|
||||
|
||||
try {
|
||||
logger.info("Initializing resources for new library: {}", library.getName());
|
||||
|
||||
// 1. Create image directory structure
|
||||
initializeImageDirectories(library);
|
||||
|
||||
// 2. OpenSearch indexes are global and managed automatically
|
||||
// No per-library initialization needed for OpenSearch
|
||||
|
||||
logger.info("Successfully initialized resources for library: {}", library.getName());
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to initialize resources for library {}: {}", libraryId, e.getMessage());
|
||||
throw new RuntimeException("Library resource initialization failed", e);
|
||||
}
|
||||
}
|
||||
|
||||
private void initializeImageDirectories(Library library) {
|
||||
try {
|
||||
// Create the library-specific image directory
|
||||
String imagePath = "/app/images/" + library.getId();
|
||||
java.nio.file.Path libraryImagePath = java.nio.file.Paths.get(imagePath);
|
||||
|
||||
if (!java.nio.file.Files.exists(libraryImagePath)) {
|
||||
java.nio.file.Files.createDirectories(libraryImagePath);
|
||||
logger.info("Created image directory: {}", imagePath);
|
||||
|
||||
// Create subdirectories for different image types
|
||||
java.nio.file.Files.createDirectories(libraryImagePath.resolve("stories"));
|
||||
java.nio.file.Files.createDirectories(libraryImagePath.resolve("authors"));
|
||||
java.nio.file.Files.createDirectories(libraryImagePath.resolve("collections"));
|
||||
|
||||
logger.info("Created image subdirectories for library: {}", library.getId());
|
||||
} else {
|
||||
logger.info("Image directory already exists: {}", imagePath);
|
||||
}
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to create image directories for library {}: {}", library.getId(), e.getMessage());
|
||||
throw new RuntimeException("Image directory creation failed", e);
|
||||
}
|
||||
}
|
||||
|
||||
private void createSchemaUsingHibernate(DataSource dataSource) {
|
||||
// Create the essential tables manually using the same DDL that Hibernate would generate
|
||||
// This is simpler than setting up a full Hibernate configuration for schema creation
|
||||
|
||||
String[] createTableStatements = {
|
||||
// Authors table
|
||||
"""
|
||||
CREATE TABLE authors (
|
||||
author_rating integer,
|
||||
created_at timestamp(6) not null,
|
||||
updated_at timestamp(6) not null,
|
||||
id uuid not null,
|
||||
avatar_image_path varchar(255),
|
||||
name varchar(255) not null,
|
||||
notes TEXT,
|
||||
primary key (id)
|
||||
)
|
||||
""",
|
||||
|
||||
// Author URLs table
|
||||
"""
|
||||
CREATE TABLE author_urls (
|
||||
author_id uuid not null,
|
||||
url varchar(255)
|
||||
)
|
||||
""",
|
||||
|
||||
// Series table
|
||||
"""
|
||||
CREATE TABLE series (
|
||||
created_at timestamp(6) not null,
|
||||
id uuid not null,
|
||||
description varchar(1000),
|
||||
name varchar(255) not null,
|
||||
primary key (id)
|
||||
)
|
||||
""",
|
||||
|
||||
// Tags table
|
||||
"""
|
||||
CREATE TABLE tags (
|
||||
color varchar(7),
|
||||
created_at timestamp(6) not null,
|
||||
id uuid not null,
|
||||
description varchar(500),
|
||||
name varchar(255) not null unique,
|
||||
primary key (id)
|
||||
)
|
||||
""",
|
||||
|
||||
// Tag aliases table
|
||||
"""
|
||||
CREATE TABLE tag_aliases (
|
||||
created_from_merge boolean not null,
|
||||
created_at timestamp(6) not null,
|
||||
canonical_tag_id uuid not null,
|
||||
id uuid not null,
|
||||
alias_name varchar(255) not null unique,
|
||||
primary key (id)
|
||||
)
|
||||
""",
|
||||
|
||||
// Collections table
|
||||
"""
|
||||
CREATE TABLE collections (
|
||||
is_archived boolean not null,
|
||||
rating integer,
|
||||
created_at timestamp(6) not null,
|
||||
updated_at timestamp(6) not null,
|
||||
id uuid not null,
|
||||
cover_image_path varchar(500),
|
||||
name varchar(500) not null,
|
||||
description TEXT,
|
||||
primary key (id)
|
||||
)
|
||||
""",
|
||||
|
||||
// Stories table
|
||||
"""
|
||||
CREATE TABLE stories (
|
||||
is_read boolean,
|
||||
rating integer,
|
||||
reading_position integer,
|
||||
volume integer,
|
||||
word_count integer,
|
||||
created_at timestamp(6) not null,
|
||||
last_read_at timestamp(6),
|
||||
updated_at timestamp(6) not null,
|
||||
author_id uuid,
|
||||
id uuid not null,
|
||||
series_id uuid,
|
||||
description varchar(1000),
|
||||
content_html TEXT,
|
||||
content_plain TEXT,
|
||||
cover_path varchar(255),
|
||||
source_url varchar(255),
|
||||
summary TEXT,
|
||||
title varchar(255) not null,
|
||||
primary key (id)
|
||||
)
|
||||
""",
|
||||
|
||||
// Reading positions table
|
||||
"""
|
||||
CREATE TABLE reading_positions (
|
||||
chapter_index integer,
|
||||
character_position integer,
|
||||
percentage_complete float(53),
|
||||
word_position integer,
|
||||
created_at timestamp(6) not null,
|
||||
updated_at timestamp(6) not null,
|
||||
id uuid not null,
|
||||
story_id uuid not null,
|
||||
context_after varchar(500),
|
||||
context_before varchar(500),
|
||||
chapter_title varchar(255),
|
||||
epub_cfi TEXT,
|
||||
primary key (id)
|
||||
)
|
||||
""",
|
||||
|
||||
// Junction tables
|
||||
"""
|
||||
CREATE TABLE story_tags (
|
||||
story_id uuid not null,
|
||||
tag_id uuid not null,
|
||||
primary key (story_id, tag_id)
|
||||
)
|
||||
""",
|
||||
|
||||
"""
|
||||
CREATE TABLE collection_stories (
|
||||
position integer not null,
|
||||
added_at timestamp(6) not null,
|
||||
collection_id uuid not null,
|
||||
story_id uuid not null,
|
||||
primary key (collection_id, story_id),
|
||||
unique (collection_id, position)
|
||||
)
|
||||
""",
|
||||
|
||||
"""
|
||||
CREATE TABLE collection_tags (
|
||||
collection_id uuid not null,
|
||||
tag_id uuid not null,
|
||||
primary key (collection_id, tag_id)
|
||||
)
|
||||
"""
|
||||
};
|
||||
|
||||
String[] createIndexStatements = {
|
||||
"CREATE INDEX idx_reading_position_story ON reading_positions (story_id)"
|
||||
};
|
||||
|
||||
String[] createConstraintStatements = {
|
||||
// Foreign key constraints
|
||||
"ALTER TABLE author_urls ADD CONSTRAINT FKdqhp51m0uveybsts098gd79uo FOREIGN KEY (author_id) REFERENCES authors",
|
||||
"ALTER TABLE stories ADD CONSTRAINT FKhwecpqeaxy40ftrctef1u7gw7 FOREIGN KEY (author_id) REFERENCES authors",
|
||||
"ALTER TABLE stories ADD CONSTRAINT FK1kulyvy7wwcolp2gkndt57cp7 FOREIGN KEY (series_id) REFERENCES series",
|
||||
"ALTER TABLE reading_positions ADD CONSTRAINT FKglfhdhflan3pgyr2u0gxi21i5 FOREIGN KEY (story_id) REFERENCES stories",
|
||||
"ALTER TABLE story_tags ADD CONSTRAINT FKmans33ijt0nf65t0sng2r848j FOREIGN KEY (tag_id) REFERENCES tags",
|
||||
"ALTER TABLE story_tags ADD CONSTRAINT FKq9guid7swnjxwdpgxj3jo1rsi FOREIGN KEY (story_id) REFERENCES stories",
|
||||
"ALTER TABLE tag_aliases ADD CONSTRAINT FKqfsawmcj3ey4yycb6958y24ch FOREIGN KEY (canonical_tag_id) REFERENCES tags",
|
||||
"ALTER TABLE collection_stories ADD CONSTRAINT FKr55ho4vhj0wp03x13iskr1jds FOREIGN KEY (collection_id) REFERENCES collections",
|
||||
"ALTER TABLE collection_stories ADD CONSTRAINT FK7n41tbbrt7r2e81hpu3612r1o FOREIGN KEY (story_id) REFERENCES stories",
|
||||
"ALTER TABLE collection_tags ADD CONSTRAINT FKceq7ggev8n8ibjui1x5yo4x67 FOREIGN KEY (tag_id) REFERENCES tags",
|
||||
"ALTER TABLE collection_tags ADD CONSTRAINT FKq9sa5s8csdpbphrvb48tts8jt FOREIGN KEY (collection_id) REFERENCES collections"
|
||||
};
|
||||
|
||||
try (var connection = dataSource.getConnection();
|
||||
var statement = connection.createStatement()) {
|
||||
|
||||
// Create tables
|
||||
for (String sql : createTableStatements) {
|
||||
statement.executeUpdate(sql);
|
||||
}
|
||||
|
||||
// Create indexes
|
||||
for (String sql : createIndexStatements) {
|
||||
statement.executeUpdate(sql);
|
||||
}
|
||||
|
||||
// Create constraints
|
||||
for (String sql : createConstraintStatements) {
|
||||
statement.executeUpdate(sql);
|
||||
}
|
||||
|
||||
logger.info("Successfully created all database tables and constraints");
|
||||
|
||||
} catch (SQLException e) {
|
||||
logger.error("Failed to create database schema", e);
|
||||
throw new RuntimeException("Schema creation failed", e);
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
private void closeCurrentResources() {
|
||||
// No need to close datasource - SmartRoutingDataSource handles this
|
||||
// OpenSearch service is managed by Spring - no explicit cleanup needed
|
||||
// Don't clear currentLibraryId here - only when explicitly switching
|
||||
}
|
||||
|
||||
/**
|
||||
* Update library metadata (name and description)
|
||||
*/
|
||||
public synchronized void updateLibraryMetadata(String libraryId, String newName, String newDescription) throws Exception {
|
||||
if (libraryId == null || libraryId.trim().isEmpty()) {
|
||||
throw new IllegalArgumentException("Library ID cannot be null or empty");
|
||||
}
|
||||
|
||||
Library library = libraries.get(libraryId);
|
||||
if (library == null) {
|
||||
throw new IllegalArgumentException("Library not found: " + libraryId);
|
||||
}
|
||||
|
||||
// Validate new name
|
||||
if (newName == null || newName.trim().isEmpty()) {
|
||||
throw new IllegalArgumentException("Library name cannot be null or empty");
|
||||
}
|
||||
|
||||
String oldName = library.getName();
|
||||
String oldDescription = library.getDescription();
|
||||
|
||||
// Update the library object
|
||||
library.setName(newName.trim());
|
||||
library.setDescription(newDescription != null ? newDescription.trim() : "");
|
||||
|
||||
try {
|
||||
// Save to configuration file
|
||||
saveLibraryConfiguration(library);
|
||||
|
||||
logger.info("Updated library metadata - ID: {}, Name: '{}' -> '{}', Description: '{}' -> '{}'",
|
||||
libraryId, oldName, newName, oldDescription, library.getDescription());
|
||||
|
||||
} catch (Exception e) {
|
||||
// Rollback changes on failure
|
||||
library.setName(oldName);
|
||||
library.setDescription(oldDescription);
|
||||
throw new RuntimeException("Failed to update library metadata: " + e.getMessage(), e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Save library configuration to file
|
||||
*/
|
||||
private void saveLibraryConfiguration(Library library) throws Exception {
|
||||
Path libraryConfigPath = libraryConfigDir.resolve(library.getId() + ".json");
|
||||
|
||||
// Create library configuration object
|
||||
Map<String, Object> config = new HashMap<>();
|
||||
config.put("id", library.getId());
|
||||
config.put("name", library.getName());
|
||||
config.put("description", library.getDescription());
|
||||
config.put("passwordHash", library.getPasswordHash());
|
||||
config.put("dbName", library.getDbName());
|
||||
config.put("imagePath", library.getImagePath());
|
||||
config.put("initialized", library.isInitialized());
|
||||
|
||||
// Write to file
|
||||
ObjectMapper mapper = new ObjectMapper();
|
||||
String configJson = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(config);
|
||||
Files.writeString(libraryConfigPath, configJson, StandardCharsets.UTF_8);
|
||||
|
||||
logger.debug("Saved library configuration to: {}", libraryConfigPath);
|
||||
}
|
||||
}
|
||||
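The login flow that ties `authenticateAndGetLibrary` and `switchToLibraryAfterAuthentication` together lives in the authentication layer and is not part of this file. The sketch below shows the intended sequence; the exception type and the `jwtUtil.generateToken` call are assumptions.

```java
// Sketch of the expected login sequence (controller/service wrapper assumed):
public String login(String password) throws Exception {
    String libraryId = libraryService.authenticateAndGetLibrary(password);
    if (libraryId == null) {
        throw new BadCredentialsException("Invalid library password"); // assumed error handling
    }
    // Marks the library active and triggers the asynchronous OpenSearch reindex
    libraryService.switchToLibraryAfterAuthentication(libraryId);
    return jwtUtil.generateToken(libraryId); // assumed token issuance
}
```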
@@ -0,0 +1,133 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.config.OpenSearchProperties;
|
||||
import org.opensearch.client.opensearch.OpenSearchClient;
|
||||
import org.opensearch.client.opensearch.cluster.HealthRequest;
|
||||
import org.opensearch.client.opensearch.cluster.HealthResponse;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.boot.actuate.health.Health;
|
||||
import org.springframework.boot.actuate.health.HealthIndicator;
|
||||
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
|
||||
import org.springframework.scheduling.annotation.Scheduled;
|
||||
import org.springframework.stereotype.Service;
|
||||
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.concurrent.atomic.AtomicReference;
|
||||
|
||||
@Service
|
||||
@ConditionalOnProperty(name = "storycove.search.engine", havingValue = "opensearch")
|
||||
public class OpenSearchHealthService implements HealthIndicator {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(OpenSearchHealthService.class);
|
||||
|
||||
private final OpenSearchClient openSearchClient;
|
||||
private final OpenSearchProperties properties;
|
||||
|
||||
private final AtomicReference<Health> lastKnownHealth = new AtomicReference<>(Health.unknown().build());
|
||||
private LocalDateTime lastCheckTime = LocalDateTime.now();
|
||||
|
||||
@Autowired
|
||||
public OpenSearchHealthService(OpenSearchClient openSearchClient, OpenSearchProperties properties) {
|
||||
this.openSearchClient = openSearchClient;
|
||||
this.properties = properties;
|
||||
}
|
||||
|
||||
@Override
|
||||
public Health health() {
|
||||
return lastKnownHealth.get();
|
||||
}
|
||||
|
||||
@Scheduled(fixedDelayString = "#{@openSearchProperties.health.checkInterval}")
|
||||
public void performHealthCheck() {
|
||||
try {
|
||||
HealthResponse clusterHealth = openSearchClient.cluster().health(
|
||||
HealthRequest.of(h -> h.timeout(t -> t.time("10s")))
|
||||
);
|
||||
|
||||
Health.Builder healthBuilder = Health.up()
|
||||
.withDetail("cluster_name", clusterHealth.clusterName())
|
||||
.withDetail("status", clusterHealth.status().jsonValue())
|
||||
.withDetail("number_of_nodes", clusterHealth.numberOfNodes())
|
||||
.withDetail("number_of_data_nodes", clusterHealth.numberOfDataNodes())
|
||||
.withDetail("active_primary_shards", clusterHealth.activePrimaryShards())
|
||||
.withDetail("active_shards", clusterHealth.activeShards())
|
||||
.withDetail("relocating_shards", clusterHealth.relocatingShards())
|
||||
.withDetail("initializing_shards", clusterHealth.initializingShards())
|
||||
.withDetail("unassigned_shards", clusterHealth.unassignedShards())
|
||||
.withDetail("last_check", LocalDateTime.now());
|
||||
|
||||
// Check if cluster status is concerning
|
||||
switch (clusterHealth.status()) {
|
||||
case Red:
|
||||
healthBuilder = Health.down()
|
||||
.withDetail("reason", "Cluster status is RED - some primary shards are unassigned");
|
||||
break;
|
||||
case Yellow:
|
||||
if (isProduction()) {
|
||||
healthBuilder = Health.down()
|
||||
.withDetail("reason", "Cluster status is YELLOW - some replica shards are unassigned (critical in production)");
|
||||
} else {
|
||||
// Yellow is acceptable in development (single node clusters)
|
||||
healthBuilder.withDetail("warning", "Cluster status is YELLOW - acceptable for development");
|
||||
}
|
||||
break;
|
||||
case Green:
|
||||
// All good
|
||||
break;
|
||||
}
|
||||
|
||||
lastKnownHealth.set(healthBuilder.build());
|
||||
lastCheckTime = LocalDateTime.now();
|
||||
|
||||
if (properties.getHealth().isEnableMetrics()) {
|
||||
logMetrics(clusterHealth);
|
||||
}
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("OpenSearch health check failed", e);
|
||||
Health unhealthyStatus = Health.down()
|
||||
.withDetail("error", e.getMessage())
|
||||
.withDetail("last_successful_check", lastCheckTime)
|
||||
.withDetail("current_time", LocalDateTime.now())
|
||||
.build();
|
||||
lastKnownHealth.set(unhealthyStatus);
|
||||
}
|
||||
}
|
||||
|
||||
private void logMetrics(HealthResponse clusterHealth) {
|
||||
logger.info("OpenSearch Cluster Metrics - Status: {}, Nodes: {}, Active Shards: {}, Unassigned: {}",
|
||||
clusterHealth.status().jsonValue(),
|
||||
clusterHealth.numberOfNodes(),
|
||||
clusterHealth.activeShards(),
|
||||
clusterHealth.unassignedShards());
|
||||
}
|
||||
|
||||
private boolean isProduction() {
|
||||
return "production".equalsIgnoreCase(properties.getProfile());
|
||||
}
|
||||
|
||||
/**
|
||||
* Manual health check for immediate status
|
||||
*/
|
||||
public boolean isClusterHealthy() {
|
||||
Health currentHealth = lastKnownHealth.get();
|
||||
return currentHealth.getStatus() == org.springframework.boot.actuate.health.Status.UP;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get detailed cluster information
|
||||
*/
|
||||
public String getClusterInfo() {
|
||||
try {
|
||||
var info = openSearchClient.info();
|
||||
return String.format("OpenSearch %s (Cluster: %s, Lucene: %s)",
|
||||
info.version().number(),
|
||||
info.clusterName(),
|
||||
info.version().luceneVersion());
|
||||
} catch (Exception e) {
|
||||
return "Unable to retrieve cluster information: " + e.getMessage();
|
||||
}
|
||||
}
|
||||
}
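The health service above caches the last cluster status and refreshes it on a schedule, so callers get an instant answer without hitting the cluster. A minimal sketch of a consumer that skips heavy work while the cluster is unhealthy; the `ReindexGuard` class and its wiring are illustrative assumptions, not part of the actual codebase:

```java
package com.storycove.service;

import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.stereotype.Service;

// Hypothetical consumer of OpenSearchHealthService (illustration only).
@Service
@ConditionalOnProperty(name = "storycove.search.engine", havingValue = "opensearch")
public class ReindexGuard {

    private final OpenSearchHealthService healthService;

    public ReindexGuard(OpenSearchHealthService healthService) {
        this.healthService = healthService;
    }

    /** Runs the given reindex task only when the last known cluster status is UP. */
    public void runIfHealthy(Runnable reindexTask) {
        if (healthService.isClusterHealthy()) {
            reindexTask.run();
        } else {
            // Skip for now; the scheduled check flips the status back once the
            // cluster recovers. getClusterInfo() gives a readable summary for logs.
            System.out.println("Skipping reindex: " + healthService.getClusterInfo());
        }
    }
}
```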
|
||||
backend/src/main/java/com/storycove/service/OpenSearchService.java (new file, 1077 lines; diff suppressed because it is too large)
@@ -1,36 +1,83 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import com.storycove.util.JwtUtil;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.security.crypto.password.PasswordEncoder;
|
||||
import org.springframework.stereotype.Service;
|
||||
|
||||
@Service
|
||||
public class PasswordAuthenticationService {
|
||||
|
||||
@Value("${storycove.auth.password}")
|
||||
private String applicationPassword;
|
||||
private static final Logger logger = LoggerFactory.getLogger(PasswordAuthenticationService.class);
|
||||
|
||||
private final PasswordEncoder passwordEncoder;
|
||||
private final LibraryService libraryService;
|
||||
private final JwtUtil jwtUtil;
|
||||
|
||||
public PasswordAuthenticationService(PasswordEncoder passwordEncoder) {
|
||||
@Autowired
|
||||
public PasswordAuthenticationService(
|
||||
PasswordEncoder passwordEncoder,
|
||||
LibraryService libraryService,
|
||||
JwtUtil jwtUtil) {
|
||||
this.passwordEncoder = passwordEncoder;
|
||||
this.libraryService = libraryService;
|
||||
this.jwtUtil = jwtUtil;
|
||||
}
|
||||
|
||||
public boolean authenticate(String providedPassword) {
|
||||
/**
|
||||
* Authenticate user and switch to the appropriate library
|
||||
* Returns JWT token if authentication successful, null otherwise
|
||||
*/
|
||||
public String authenticateAndSwitchLibrary(String providedPassword) {
|
||||
if (providedPassword == null || providedPassword.trim().isEmpty()) {
|
||||
return false;
|
||||
return null;
|
||||
}
|
||||
|
||||
// If application password starts with {bcrypt}, it's already encoded
|
||||
if (applicationPassword.startsWith("{bcrypt}") || applicationPassword.startsWith("$2")) {
|
||||
return passwordEncoder.matches(providedPassword, applicationPassword);
|
||||
// Find which library this password belongs to
|
||||
String libraryId = libraryService.authenticateAndGetLibrary(providedPassword);
|
||||
if (libraryId == null) {
|
||||
logger.warn("Authentication failed - invalid password");
|
||||
return null;
|
||||
}
|
||||
|
||||
// Otherwise, compare directly (for development/testing)
|
||||
return applicationPassword.equals(providedPassword);
|
||||
try {
|
||||
// Switch to the authenticated library with forced reindexing (may take 2-3 seconds)
|
||||
libraryService.switchToLibraryAfterAuthentication(libraryId);
|
||||
|
||||
// Generate JWT token with library context
|
||||
String token = jwtUtil.generateToken("user", libraryId);
|
||||
|
||||
logger.info("Successfully authenticated and switched to library: {}", libraryId);
|
||||
return token;
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to switch to library: {}", libraryId, e);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Legacy method - kept for backward compatibility
|
||||
*/
|
||||
@Deprecated
|
||||
public boolean authenticate(String providedPassword) {
|
||||
return authenticateAndSwitchLibrary(providedPassword) != null;
|
||||
}
|
||||
|
||||
public String encodePassword(String rawPassword) {
|
||||
return passwordEncoder.encode(rawPassword);
|
||||
}
|
||||
|
||||
/**
|
||||
* Get current library info for authenticated user
|
||||
*/
|
||||
public String getCurrentLibraryInfo() {
|
||||
var library = libraryService.getCurrentLibrary();
|
||||
if (library != null) {
|
||||
return String.format("Library: %s (%s)", library.getName(), library.getId());
|
||||
}
|
||||
return "No library active";
|
||||
}
|
||||
}
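To sketch the new login flow end to end, a controller could exchange the password for a library-scoped JWT roughly like this; the controller name, endpoint path, and request/response shapes are assumptions for illustration, not the project's actual controller:

```java
package com.storycove.controller;

import com.storycove.service.PasswordAuthenticationService;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.Map;

// Illustrative controller; the real endpoint may differ.
@RestController
@RequestMapping("/api/auth")
public class AuthControllerSketch {

    private final PasswordAuthenticationService authService;

    public AuthControllerSketch(PasswordAuthenticationService authService) {
        this.authService = authService;
    }

    @PostMapping("/login")
    public ResponseEntity<Map<String, String>> login(@RequestBody Map<String, String> body) {
        // Returns a JWT carrying the library context on success, null on failure.
        String token = authService.authenticateAndSwitchLibrary(body.get("password"));
        if (token == null) {
            return ResponseEntity.status(401).build();
        }
        return ResponseEntity.ok(Map.of(
                "token", token,
                "library", authService.getCurrentLibraryInfo()));
    }
}
```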
|
||||
@@ -0,0 +1,278 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.dto.AuthorSearchDto;
|
||||
import com.storycove.dto.SearchResultDto;
|
||||
import com.storycove.dto.StorySearchDto;
|
||||
import com.storycove.entity.Author;
|
||||
import com.storycove.entity.Story;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.stereotype.Service;
|
||||
|
||||
import java.util.List;
|
||||
import java.util.UUID;
|
||||
|
||||
/**
|
||||
* Service adapter that provides a unified interface for search operations.
|
||||
*
|
||||
* This adapter directly delegates to OpenSearchService.
|
||||
*/
|
||||
@Service
|
||||
public class SearchServiceAdapter {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(SearchServiceAdapter.class);
|
||||
|
||||
@Autowired
|
||||
private OpenSearchService openSearchService;
|
||||
|
||||
// ===============================
|
||||
// SEARCH OPERATIONS
|
||||
// ===============================
|
||||
|
||||
/**
|
||||
* Search stories with unified interface
|
||||
*/
|
||||
public SearchResultDto<StorySearchDto> searchStories(String query, List<String> tags, String author,
|
||||
String series, Integer minWordCount, Integer maxWordCount,
|
||||
Float minRating, Boolean isRead, Boolean isFavorite,
|
||||
String sortBy, String sortOrder, int page, int size,
|
||||
List<String> facetBy,
|
||||
// Advanced filters
|
||||
String createdAfter, String createdBefore,
|
||||
String lastReadAfter, String lastReadBefore,
|
||||
Boolean unratedOnly, String readingStatus,
|
||||
Boolean hasReadingProgress, Boolean hasCoverImage,
|
||||
String sourceDomain, String seriesFilter,
|
||||
Integer minTagCount, Boolean popularOnly,
|
||||
Boolean hiddenGemsOnly) {
|
||||
return openSearchService.searchStories(query, tags, author, series, minWordCount, maxWordCount,
|
||||
minRating, isRead, isFavorite, sortBy, sortOrder, page, size, facetBy,
|
||||
createdAfter, createdBefore, lastReadAfter, lastReadBefore, unratedOnly, readingStatus,
|
||||
hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter, minTagCount, popularOnly,
|
||||
hiddenGemsOnly);
|
||||
}
|
||||
|
||||
/**
|
||||
* Get random stories with unified interface
|
||||
*/
|
||||
public List<StorySearchDto> getRandomStories(int count, List<String> tags, String author,
|
||||
String series, Integer minWordCount, Integer maxWordCount,
|
||||
Float minRating, Boolean isRead, Boolean isFavorite,
|
||||
Long seed) {
|
||||
return openSearchService.getRandomStories(count, tags, author, series, minWordCount, maxWordCount,
|
||||
minRating, isRead, isFavorite, seed);
|
||||
}
|
||||
|
||||
/**
|
||||
* Recreate search indices
|
||||
*/
|
||||
public void recreateIndices() {
|
||||
try {
|
||||
openSearchService.recreateIndices();
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to recreate search indices", e);
|
||||
throw new RuntimeException("Failed to recreate search indices", e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Perform complete reindex of all data
|
||||
*/
|
||||
public void performCompleteReindex() {
|
||||
try {
|
||||
recreateIndices();
|
||||
logger.info("Search indices recreated successfully");
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to perform complete reindex", e);
|
||||
throw new RuntimeException("Failed to perform complete reindex", e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Get random story ID with unified interface
|
||||
*/
|
||||
public String getRandomStoryId(Long seed) {
|
||||
return openSearchService.getRandomStoryId(seed);
|
||||
}
|
||||
|
||||
/**
|
||||
* Search authors with unified interface
|
||||
*/
|
||||
public List<AuthorSearchDto> searchAuthors(String query, int limit) {
|
||||
return openSearchService.searchAuthors(query, limit);
|
||||
}
|
||||
|
||||
/**
|
||||
* Get tag suggestions with unified interface
|
||||
*/
|
||||
public List<String> getTagSuggestions(String query, int limit) {
|
||||
return openSearchService.getTagSuggestions(query, limit);
|
||||
}
|
||||
|
||||
// ===============================
|
||||
// INDEX OPERATIONS
|
||||
// ===============================
|
||||
|
||||
/**
|
||||
* Index a story in OpenSearch
|
||||
*/
|
||||
public void indexStory(Story story) {
|
||||
try {
|
||||
openSearchService.indexStory(story);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to index story {}", story.getId(), e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Update a story in OpenSearch
|
||||
*/
|
||||
public void updateStory(Story story) {
|
||||
try {
|
||||
openSearchService.updateStory(story);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to update story {}", story.getId(), e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Delete a story from OpenSearch
|
||||
*/
|
||||
public void deleteStory(UUID storyId) {
|
||||
try {
|
||||
openSearchService.deleteStory(storyId);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to delete story {}", storyId, e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Index an author in OpenSearch
|
||||
*/
|
||||
public void indexAuthor(Author author) {
|
||||
try {
|
||||
openSearchService.indexAuthor(author);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to index author {}", author.getId(), e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Update an author in OpenSearch
|
||||
*/
|
||||
public void updateAuthor(Author author) {
|
||||
try {
|
||||
openSearchService.updateAuthor(author);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to update author {}", author.getId(), e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Delete an author from OpenSearch
|
||||
*/
|
||||
public void deleteAuthor(UUID authorId) {
|
||||
try {
|
||||
openSearchService.deleteAuthor(authorId);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to delete author {}", authorId, e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Bulk index stories in OpenSearch
|
||||
*/
|
||||
public void bulkIndexStories(List<Story> stories) {
|
||||
try {
|
||||
openSearchService.bulkIndexStories(stories);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to bulk index {} stories", stories.size(), e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Bulk index authors in OpenSearch
|
||||
*/
|
||||
public void bulkIndexAuthors(List<Author> authors) {
|
||||
try {
|
||||
openSearchService.bulkIndexAuthors(authors);
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to bulk index {} authors", authors.size(), e);
|
||||
}
|
||||
}
|
||||
|
||||
// ===============================
|
||||
// UTILITY METHODS
|
||||
// ===============================
|
||||
|
||||
/**
|
||||
* Check if search service is available and healthy
|
||||
*/
|
||||
public boolean isSearchServiceAvailable() {
|
||||
return openSearchService.testConnection();
|
||||
}
|
||||
|
||||
/**
|
||||
* Get current search engine name
|
||||
*/
|
||||
public String getCurrentSearchEngine() {
|
||||
return "opensearch";
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if dual-write is enabled
|
||||
*/
|
||||
public boolean isDualWriteEnabled() {
|
||||
return false; // No longer supported
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if we can switch to OpenSearch
|
||||
*/
|
||||
public boolean canSwitchToOpenSearch() {
|
||||
return true; // Already using OpenSearch
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if we can switch to Typesense
|
||||
*/
|
||||
public boolean canSwitchToTypesense() {
|
||||
return false; // Typesense no longer available
|
||||
}
|
||||
|
||||
/**
|
||||
* Get current search status for admin interface
|
||||
*/
|
||||
public SearchStatus getSearchStatus() {
|
||||
return new SearchStatus(
|
||||
"opensearch",
|
||||
false, // no dual-write
|
||||
false, // no typesense
|
||||
openSearchService.testConnection()
|
||||
);
|
||||
}
|
||||
|
||||
/**
|
||||
* DTO for search status
|
||||
*/
|
||||
public static class SearchStatus {
|
||||
private final String primaryEngine;
|
||||
private final boolean dualWrite;
|
||||
private final boolean typesenseAvailable;
|
||||
private final boolean openSearchAvailable;
|
||||
|
||||
public SearchStatus(String primaryEngine, boolean dualWrite,
|
||||
boolean typesenseAvailable, boolean openSearchAvailable) {
|
||||
this.primaryEngine = primaryEngine;
|
||||
this.dualWrite = dualWrite;
|
||||
this.typesenseAvailable = typesenseAvailable;
|
||||
this.openSearchAvailable = openSearchAvailable;
|
||||
}
|
||||
|
||||
public String getPrimaryEngine() { return primaryEngine; }
|
||||
public boolean isDualWrite() { return dualWrite; }
|
||||
public boolean isTypesenseAvailable() { return typesenseAvailable; }
|
||||
public boolean isOpenSearchAvailable() { return openSearchAvailable; }
|
||||
}
|
||||
}
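For a sense of how callers use the adapter, here is a minimal usage sketch of a filtered search and a seeded random pick; the filter values are invented for illustration, and most of the optional parameters can simply be passed as `null`:

```java
package com.storycove.example;

import com.storycove.dto.SearchResultDto;
import com.storycove.dto.StorySearchDto;
import com.storycove.service.SearchServiceAdapter;

import java.util.List;

// Illustrative use of SearchServiceAdapter; all values are examples.
public class SearchAdapterUsageSketch {

    public void demo(SearchServiceAdapter search) {
        // Full-text search for "dragon" tagged "fantasy", unread only, at least
        // 10k words, rated 4.0+, sorted by rating, first page of 20, faceted on tags.
        SearchResultDto<StorySearchDto> page = search.searchStories(
                "dragon", List.of("fantasy"), null, null,   // query, tags, author, series
                10000, null,                                 // min/max word count
                4.0f, false, null,                           // minRating, isRead, isFavorite
                "rating", "desc", 0, 20,                     // sort, page, size
                List.of("tags"),                             // facets
                null, null, null, null,                      // created/lastRead date filters
                null, null, null, null,                      // unratedOnly, readingStatus, progress, cover
                null, null, null, null, null);               // domain, series, tag count, popular, hidden gems

        // Seeded random selection so a "shuffle" stays stable within a session.
        List<StorySearchDto> random = search.getRandomStories(
                5, List.of("fantasy"), null, null, null, null, null, null, null, 42L);
    }
}
```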
|
||||
@@ -5,6 +5,8 @@ import com.storycove.repository.SeriesRepository;
|
||||
import com.storycove.service.exception.DuplicateResourceException;
|
||||
import com.storycove.service.exception.ResourceNotFoundException;
|
||||
import jakarta.validation.Valid;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.data.domain.Page;
|
||||
import org.springframework.data.domain.Pageable;
|
||||
@@ -21,6 +23,8 @@ import java.util.UUID;
|
||||
@Transactional
|
||||
public class SeriesService {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(SeriesService.class);
|
||||
|
||||
private final SeriesRepository seriesRepository;
|
||||
|
||||
@Autowired
|
||||
|
||||
@@ -10,8 +10,9 @@ import com.storycove.repository.TagRepository;
|
||||
import com.storycove.service.exception.DuplicateResourceException;
|
||||
import com.storycove.service.exception.ResourceNotFoundException;
|
||||
import jakarta.validation.Valid;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.boot.autoconfigure.condition.ConditionalOnBean;
|
||||
import org.springframework.data.domain.Page;
|
||||
import org.springframework.data.domain.Pageable;
|
||||
import org.springframework.stereotype.Service;
|
||||
@@ -25,12 +26,15 @@ import java.util.List;
|
||||
import java.util.Optional;
|
||||
import java.util.Set;
|
||||
import java.util.UUID;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
@Service
|
||||
@Validated
|
||||
@Transactional
|
||||
public class StoryService {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(StoryService.class);
|
||||
|
||||
private final StoryRepository storyRepository;
|
||||
private final TagRepository tagRepository;
|
||||
private final ReadingPositionRepository readingPositionRepository;
|
||||
@@ -38,7 +42,7 @@ public class StoryService {
|
||||
private final TagService tagService;
|
||||
private final SeriesService seriesService;
|
||||
private final HtmlSanitizationService sanitizationService;
|
||||
private final TypesenseService typesenseService;
|
||||
private final SearchServiceAdapter searchServiceAdapter;
|
||||
|
||||
@Autowired
|
||||
public StoryService(StoryRepository storyRepository,
|
||||
@@ -48,7 +52,7 @@ public class StoryService {
|
||||
TagService tagService,
|
||||
SeriesService seriesService,
|
||||
HtmlSanitizationService sanitizationService,
|
||||
@Autowired(required = false) TypesenseService typesenseService) {
|
||||
SearchServiceAdapter searchServiceAdapter) {
|
||||
this.storyRepository = storyRepository;
|
||||
this.tagRepository = tagRepository;
|
||||
this.readingPositionRepository = readingPositionRepository;
|
||||
@@ -56,7 +60,7 @@ public class StoryService {
|
||||
this.tagService = tagService;
|
||||
this.seriesService = seriesService;
|
||||
this.sanitizationService = sanitizationService;
|
||||
this.typesenseService = typesenseService;
|
||||
this.searchServiceAdapter = searchServiceAdapter;
|
||||
}
|
||||
|
||||
@Transactional(readOnly = true)
|
||||
@@ -80,11 +84,13 @@ public class StoryService {
|
||||
.orElseThrow(() -> new ResourceNotFoundException("Story", id.toString()));
|
||||
}
|
||||
|
||||
|
||||
@Transactional(readOnly = true)
|
||||
public Optional<Story> findByIdOptional(UUID id) {
|
||||
return storyRepository.findById(id);
|
||||
}
|
||||
|
||||
|
||||
@Transactional(readOnly = true)
|
||||
public Optional<Story> findByTitle(String title) {
|
||||
return storyRepository.findByTitle(title);
|
||||
@@ -119,7 +125,7 @@ public class StoryService {
|
||||
|
||||
@Transactional(readOnly = true)
|
||||
public List<Story> findBySeries(UUID seriesId) {
|
||||
Series series = seriesService.findById(seriesId);
|
||||
seriesService.findById(seriesId); // Validate series exists
|
||||
return storyRepository.findBySeriesOrderByVolume(seriesId);
|
||||
}
|
||||
|
||||
@@ -233,10 +239,8 @@ public class StoryService {
|
||||
story.addTag(tag);
|
||||
Story savedStory = storyRepository.save(story);
|
||||
|
||||
// Update Typesense index with new tag information
|
||||
if (typesenseService != null) {
|
||||
typesenseService.updateStory(savedStory);
|
||||
}
|
||||
// Update search index with new tag information
|
||||
searchServiceAdapter.updateStory(savedStory);
|
||||
|
||||
return savedStory;
|
||||
}
|
||||
@@ -250,10 +254,8 @@ public class StoryService {
|
||||
story.removeTag(tag);
|
||||
Story savedStory = storyRepository.save(story);
|
||||
|
||||
// Update Typesense index with updated tag information
|
||||
if (typesenseService != null) {
|
||||
typesenseService.updateStory(savedStory);
|
||||
}
|
||||
// Update search index with updated tag information
|
||||
searchServiceAdapter.updateStory(savedStory);
|
||||
|
||||
return savedStory;
|
||||
}
|
||||
@@ -268,10 +270,8 @@ public class StoryService {
|
||||
story.setRating(rating);
|
||||
Story savedStory = storyRepository.save(story);
|
||||
|
||||
// Update Typesense index with new rating
|
||||
if (typesenseService != null) {
|
||||
typesenseService.updateStory(savedStory);
|
||||
}
|
||||
// Update search index with new rating
|
||||
searchServiceAdapter.updateStory(savedStory);
|
||||
|
||||
return savedStory;
|
||||
}
|
||||
@@ -286,10 +286,8 @@ public class StoryService {
|
||||
story.updateReadingProgress(position);
|
||||
Story savedStory = storyRepository.save(story);
|
||||
|
||||
// Update Typesense index with new reading progress
|
||||
if (typesenseService != null) {
|
||||
typesenseService.updateStory(savedStory);
|
||||
}
|
||||
// Update search index with new reading progress
|
||||
searchServiceAdapter.updateStory(savedStory);
|
||||
|
||||
return savedStory;
|
||||
}
|
||||
@@ -307,10 +305,8 @@ public class StoryService {
|
||||
|
||||
Story savedStory = storyRepository.save(story);
|
||||
|
||||
// Update Typesense index with new reading status
|
||||
if (typesenseService != null) {
|
||||
typesenseService.updateStory(savedStory);
|
||||
}
|
||||
// Update search index with new reading status
|
||||
searchServiceAdapter.updateStory(savedStory);
|
||||
|
||||
return savedStory;
|
||||
}
|
||||
@@ -352,10 +348,8 @@ public class StoryService {
|
||||
updateStoryTags(savedStory, story.getTags());
|
||||
}
|
||||
|
||||
// Index in Typesense (if available)
|
||||
if (typesenseService != null) {
|
||||
typesenseService.indexStory(savedStory);
|
||||
}
|
||||
// Index in search engine
|
||||
searchServiceAdapter.indexStory(savedStory);
|
||||
|
||||
return savedStory;
|
||||
}
|
||||
@@ -382,10 +376,8 @@ public class StoryService {
|
||||
updateStoryTagsByNames(savedStory, tagNames);
|
||||
}
|
||||
|
||||
// Index in Typesense (if available)
|
||||
if (typesenseService != null) {
|
||||
typesenseService.indexStory(savedStory);
|
||||
}
|
||||
// Index in search engine
|
||||
searchServiceAdapter.indexStory(savedStory);
|
||||
|
||||
return savedStory;
|
||||
}
|
||||
@@ -403,10 +395,8 @@ public class StoryService {
|
||||
updateStoryFields(existingStory, storyUpdates);
|
||||
Story updatedStory = storyRepository.save(existingStory);
|
||||
|
||||
// Update in Typesense (if available)
|
||||
if (typesenseService != null) {
|
||||
typesenseService.updateStory(updatedStory);
|
||||
}
|
||||
// Update in search engine
|
||||
searchServiceAdapter.updateStory(updatedStory);
|
||||
|
||||
return updatedStory;
|
||||
}
|
||||
@@ -426,10 +416,8 @@ public class StoryService {
|
||||
|
||||
Story updatedStory = storyRepository.save(existingStory);
|
||||
|
||||
// Update in Typesense (if available)
|
||||
if (typesenseService != null) {
|
||||
typesenseService.updateStory(updatedStory);
|
||||
}
|
||||
// Update in search engine
|
||||
searchServiceAdapter.updateStory(updatedStory);
|
||||
|
||||
return updatedStory;
|
||||
}
|
||||
@@ -449,10 +437,8 @@ public class StoryService {
|
||||
// Create a copy to avoid ConcurrentModificationException
|
||||
new ArrayList<>(story.getTags()).forEach(tag -> story.removeTag(tag));
|
||||
|
||||
// Delete from Typesense first (if available)
|
||||
if (typesenseService != null) {
|
||||
typesenseService.deleteStory(story.getId().toString());
|
||||
}
|
||||
// Delete from search engine first
|
||||
searchServiceAdapter.deleteStory(story.getId());
|
||||
|
||||
storyRepository.delete(story);
|
||||
}
|
||||
@@ -615,9 +601,24 @@ public class StoryService {
|
||||
Author author = authorService.findById(updateReq.getAuthorId());
|
||||
story.setAuthor(author);
|
||||
}
|
||||
// Handle series - either by ID or by name
|
||||
if (updateReq.getSeriesId() != null) {
|
||||
Series series = seriesService.findById(updateReq.getSeriesId());
|
||||
story.setSeries(series);
|
||||
} else if (updateReq.getSeriesName() != null) {
|
||||
if (updateReq.getSeriesName().trim().isEmpty()) {
|
||||
// Empty series name means remove from series
|
||||
story.setSeries(null);
|
||||
} else {
|
||||
// Find or create series by name
|
||||
Series series = seriesService.findByNameOptional(updateReq.getSeriesName().trim())
|
||||
.orElseGet(() -> {
|
||||
Series newSeries = new Series();
|
||||
newSeries.setName(updateReq.getSeriesName().trim());
|
||||
return seriesService.create(newSeries);
|
||||
});
|
||||
story.setSeries(series);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -653,12 +654,64 @@ public class StoryService {
|
||||
|
||||
/**
|
||||
* Find a random story based on optional filters.
|
||||
* Uses count + random offset approach for performance with large datasets.
|
||||
* Supports text search and multiple tags.
|
||||
* Uses search service for consistency with Library search functionality.
|
||||
* Supports text search and multiple tags using the same logic as the Library view.
|
||||
* @param searchQuery Optional search query
|
||||
* @param tags Optional list of tags to filter by
|
||||
* @return Optional containing the random story if found
|
||||
*/
|
||||
@Transactional(readOnly = true)
|
||||
public Optional<Story> findRandomStory(String searchQuery, List<String> tags) {
|
||||
return findRandomStory(searchQuery, tags, null, null, null, null, null, null, null,
|
||||
null, null, null, null, null, null, null, null, null, null, null);
|
||||
}
|
||||
|
||||
public Optional<Story> findRandomStory(String searchQuery, List<String> tags, Long seed) {
|
||||
return findRandomStory(searchQuery, tags, seed, null, null, null, null, null, null,
|
||||
null, null, null, null, null, null, null, null, null, null, null);
|
||||
}
|
||||
|
||||
/**
|
||||
* Find a random story based on optional filters with seed support.
|
||||
* Uses search service for consistency with Library search functionality.
|
||||
* Supports text search and multiple tags using the same logic as the Library view.
|
||||
* @param searchQuery Optional search query
|
||||
* @param tags Optional list of tags to filter by
|
||||
* @param seed Optional seed for consistent randomization (null for truly random)
|
||||
* @return Optional containing the random story if found
|
||||
*/
|
||||
@Transactional(readOnly = true)
|
||||
public Optional<Story> findRandomStory(String searchQuery, List<String> tags, Long seed,
|
||||
Integer minWordCount, Integer maxWordCount,
|
||||
String createdAfter, String createdBefore,
|
||||
String lastReadAfter, String lastReadBefore,
|
||||
Integer minRating, Integer maxRating, Boolean unratedOnly,
|
||||
String readingStatus, Boolean hasReadingProgress,
|
||||
Boolean hasCoverImage, String sourceDomain,
|
||||
String seriesFilter, Integer minTagCount,
|
||||
Boolean popularOnly, Boolean hiddenGemsOnly) {
|
||||
|
||||
// Use search service for consistency with Library search
|
||||
try {
|
||||
String randomStoryId = searchServiceAdapter.getRandomStoryId(seed);
|
||||
if (randomStoryId != null) {
|
||||
return storyRepository.findById(UUID.fromString(randomStoryId));
|
||||
}
|
||||
return Optional.empty();
|
||||
} catch (Exception e) {
|
||||
// Fallback to database queries if search service fails
|
||||
logger.warn("Search service random story lookup failed, falling back to database queries", e);
|
||||
}
|
||||
|
||||
// Fallback to repository-based implementation (global routing handles library selection)
|
||||
return findRandomStoryFromRepository(searchQuery, tags);
|
||||
}
|
||||
|
||||
|
||||
/**
|
||||
* Find random story using repository methods (for default database or when library-aware fails)
|
||||
*/
|
||||
private Optional<Story> findRandomStoryFromRepository(String searchQuery, List<String> tags) {
|
||||
// Clean up inputs
|
||||
String cleanSearchQuery = (searchQuery != null && !searchQuery.trim().isEmpty()) ? searchQuery.trim() : null;
|
||||
List<String> cleanTags = (tags != null) ? tags.stream()
|
||||
@@ -724,4 +777,6 @@ public class StoryService {
|
||||
|
||||
return randomStory;
|
||||
}
|
||||
|
||||
|
||||
}
|
||||
@@ -1,10 +1,15 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.entity.Story;
|
||||
import com.storycove.entity.Tag;
|
||||
import com.storycove.entity.TagAlias;
|
||||
import com.storycove.repository.TagRepository;
|
||||
import com.storycove.repository.TagAliasRepository;
|
||||
import com.storycove.service.exception.DuplicateResourceException;
|
||||
import com.storycove.service.exception.ResourceNotFoundException;
|
||||
import jakarta.validation.Valid;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.data.domain.Page;
|
||||
import org.springframework.data.domain.Pageable;
|
||||
@@ -12,8 +17,11 @@ import org.springframework.stereotype.Service;
|
||||
import org.springframework.transaction.annotation.Transactional;
|
||||
import org.springframework.validation.annotation.Validated;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.HashSet;
|
||||
import java.util.List;
|
||||
import java.util.Optional;
|
||||
import java.util.Set;
|
||||
import java.util.UUID;
|
||||
|
||||
@Service
|
||||
@@ -21,11 +29,15 @@ import java.util.UUID;
|
||||
@Transactional
|
||||
public class TagService {
|
||||
|
||||
private static final Logger logger = LoggerFactory.getLogger(TagService.class);
|
||||
|
||||
private final TagRepository tagRepository;
|
||||
private final TagAliasRepository tagAliasRepository;
|
||||
|
||||
@Autowired
|
||||
public TagService(TagRepository tagRepository) {
|
||||
public TagService(TagRepository tagRepository, TagAliasRepository tagAliasRepository) {
|
||||
this.tagRepository = tagRepository;
|
||||
this.tagAliasRepository = tagAliasRepository;
|
||||
}
|
||||
|
||||
@Transactional(readOnly = true)
|
||||
@@ -207,5 +219,273 @@ public class TagService {
|
||||
if (updates.getName() != null) {
|
||||
existing.setName(updates.getName());
|
||||
}
|
||||
if (updates.getColor() != null) {
|
||||
existing.setColor(updates.getColor());
|
||||
}
|
||||
if (updates.getDescription() != null) {
|
||||
existing.setDescription(updates.getDescription());
|
||||
}
|
||||
}
|
||||
|
||||
// Tag alias management methods
|
||||
|
||||
public TagAlias addAlias(UUID tagId, String aliasName) {
|
||||
Tag canonicalTag = findById(tagId);
|
||||
|
||||
// Check if alias already exists (case-insensitive)
|
||||
if (tagAliasRepository.existsByAliasNameIgnoreCase(aliasName)) {
|
||||
throw new DuplicateResourceException("Tag alias", aliasName);
|
||||
}
|
||||
|
||||
// Check if alias name conflicts with existing tag names
|
||||
if (tagRepository.existsByNameIgnoreCase(aliasName)) {
|
||||
throw new DuplicateResourceException("Tag alias conflicts with existing tag name", aliasName);
|
||||
}
|
||||
|
||||
TagAlias alias = new TagAlias();
|
||||
alias.setAliasName(aliasName);
|
||||
alias.setCanonicalTag(canonicalTag);
|
||||
alias.setCreatedFromMerge(false);
|
||||
|
||||
return tagAliasRepository.save(alias);
|
||||
}
|
||||
|
||||
public void removeAlias(UUID tagId, UUID aliasId) {
|
||||
findById(tagId); // Validate tag exists
|
||||
TagAlias alias = tagAliasRepository.findById(aliasId)
|
||||
.orElseThrow(() -> new ResourceNotFoundException("Tag alias", aliasId.toString()));
|
||||
|
||||
// Verify the alias belongs to the specified tag
|
||||
if (!alias.getCanonicalTag().getId().equals(tagId)) {
|
||||
throw new IllegalArgumentException("Alias does not belong to the specified tag");
|
||||
}
|
||||
|
||||
tagAliasRepository.delete(alias);
|
||||
}
|
||||
|
||||
@Transactional(readOnly = true)
|
||||
public Tag resolveTagByName(String name) {
|
||||
// First try to find exact tag match
|
||||
Optional<Tag> directMatch = tagRepository.findByNameIgnoreCase(name);
|
||||
if (directMatch.isPresent()) {
|
||||
return directMatch.get();
|
||||
}
|
||||
|
||||
// Then try to find by alias
|
||||
Optional<TagAlias> aliasMatch = tagAliasRepository.findByAliasNameIgnoreCase(name);
|
||||
if (aliasMatch.isPresent()) {
|
||||
return aliasMatch.get().getCanonicalTag();
|
||||
}
|
||||
|
||||
return null;
|
||||
}
|
||||
|
||||
@Transactional
|
||||
public Tag mergeTags(List<UUID> sourceTagIds, UUID targetTagId) {
|
||||
// Validate target tag exists
|
||||
Tag targetTag = findById(targetTagId);
|
||||
|
||||
// Validate source tags exist and are different from target
|
||||
List<Tag> sourceTags = sourceTagIds.stream()
|
||||
.filter(id -> !id.equals(targetTagId)) // Don't merge tag with itself
|
||||
.map(this::findById)
|
||||
.toList();
|
||||
|
||||
if (sourceTags.isEmpty()) {
|
||||
throw new IllegalArgumentException("No valid source tags to merge");
|
||||
}
|
||||
|
||||
// Perform the merge atomically
|
||||
for (Tag sourceTag : sourceTags) {
|
||||
// Move all stories from source tag to target tag
|
||||
// Create a copy to avoid ConcurrentModificationException
|
||||
List<Story> storiesToMove = new ArrayList<>(sourceTag.getStories());
|
||||
storiesToMove.forEach(story -> {
|
||||
story.removeTag(sourceTag);
|
||||
story.addTag(targetTag);
|
||||
});
|
||||
|
||||
// Create alias for the source tag name
|
||||
TagAlias alias = new TagAlias();
|
||||
alias.setAliasName(sourceTag.getName());
|
||||
alias.setCanonicalTag(targetTag);
|
||||
alias.setCreatedFromMerge(true);
|
||||
tagAliasRepository.save(alias);
|
||||
|
||||
// Delete the source tag
|
||||
tagRepository.delete(sourceTag);
|
||||
}
|
||||
|
||||
return tagRepository.save(targetTag);
|
||||
}
|
||||
|
||||
@Transactional(readOnly = true)
|
||||
public List<Tag> findByNameOrAliasStartingWith(String query, int limit) {
|
||||
// Find tags that start with the query
|
||||
List<Tag> directMatches = tagRepository.findByNameStartingWithIgnoreCase(query.toLowerCase());
|
||||
|
||||
// Find tags via aliases that start with the query
|
||||
List<TagAlias> aliasMatches = tagAliasRepository.findByAliasNameStartingWithIgnoreCase(query.toLowerCase());
|
||||
List<Tag> aliasTagMatches = aliasMatches.stream()
|
||||
.map(TagAlias::getCanonicalTag)
|
||||
.distinct()
|
||||
.toList();
|
||||
|
||||
// Combine and deduplicate
|
||||
Set<Tag> allMatches = new HashSet<>(directMatches);
|
||||
allMatches.addAll(aliasTagMatches);
|
||||
|
||||
// Convert to list and limit results
|
||||
return allMatches.stream()
|
||||
.sorted((a, b) -> a.getName().compareToIgnoreCase(b.getName()))
|
||||
.limit(limit)
|
||||
.toList();
|
||||
}
|
||||
|
||||
@Transactional(readOnly = true)
|
||||
public com.storycove.controller.TagController.MergePreviewResponse previewMerge(List<UUID> sourceTagIds, UUID targetTagId) {
|
||||
// Validate target tag exists
|
||||
Tag targetTag = findById(targetTagId);
|
||||
|
||||
// Validate source tags exist and are different from target
|
||||
List<Tag> sourceTags = sourceTagIds.stream()
|
||||
.filter(id -> !id.equals(targetTagId))
|
||||
.map(this::findById)
|
||||
.toList();
|
||||
|
||||
if (sourceTags.isEmpty()) {
|
||||
throw new IllegalArgumentException("No valid source tags to merge");
|
||||
}
|
||||
|
||||
// Calculate preview data
|
||||
int targetStoryCount = targetTag.getStories().size();
|
||||
|
||||
// Collect all unique stories from all tags (including target) to handle overlaps correctly
|
||||
Set<Story> allUniqueStories = new HashSet<>(targetTag.getStories());
|
||||
for (Tag sourceTag : sourceTags) {
|
||||
allUniqueStories.addAll(sourceTag.getStories());
|
||||
}
|
||||
int totalStories = allUniqueStories.size();
|
||||
|
||||
List<String> aliasesToCreate = sourceTags.stream()
|
||||
.map(Tag::getName)
|
||||
.toList();
|
||||
|
||||
// Create response object using the controller's inner class
|
||||
var preview = new com.storycove.controller.TagController.MergePreviewResponse();
|
||||
preview.setTargetTagName(targetTag.getName());
|
||||
preview.setTargetStoryCount(targetStoryCount);
|
||||
preview.setTotalResultStoryCount(totalStories);
|
||||
preview.setAliasesToCreate(aliasesToCreate);
|
||||
|
||||
return preview;
|
||||
}
|
||||
|
||||
@Transactional(readOnly = true)
|
||||
public List<com.storycove.controller.TagController.TagSuggestion> suggestTags(String title, String content, String summary, int limit) {
|
||||
List<com.storycove.controller.TagController.TagSuggestion> suggestions = new ArrayList<>();
|
||||
|
||||
// Get all existing tags for matching
|
||||
List<Tag> existingTags = findAll();
|
||||
|
||||
// Combine all text for analysis
|
||||
String combinedText = (title != null ? title : "") + " " +
|
||||
(summary != null ? summary : "") + " " +
|
||||
(content != null ? stripHtml(content) : "");
|
||||
|
||||
if (combinedText.trim().isEmpty()) {
|
||||
return suggestions;
|
||||
}
|
||||
|
||||
String lowerText = combinedText.toLowerCase();
|
||||
|
||||
// Score each existing tag based on how well it matches the content
|
||||
for (Tag tag : existingTags) {
|
||||
double score = calculateTagRelevanceScore(tag, lowerText, title, summary);
|
||||
|
||||
if (score > 0.1) { // Only suggest tags with reasonable confidence
|
||||
String reason = generateReason(tag, lowerText, title, summary);
|
||||
suggestions.add(new com.storycove.controller.TagController.TagSuggestion(
|
||||
tag.getName(), score, reason
|
||||
));
|
||||
}
|
||||
}
|
||||
|
||||
// Sort by confidence score (descending) and limit results
|
||||
return suggestions.stream()
|
||||
.sorted((a, b) -> Double.compare(b.getConfidence(), a.getConfidence()))
|
||||
.limit(limit)
|
||||
.collect(java.util.stream.Collectors.toList());
|
||||
}
|
||||
|
||||
private double calculateTagRelevanceScore(Tag tag, String lowerText, String title, String summary) {
|
||||
String tagName = tag.getName().toLowerCase();
|
||||
double score = 0.0;
|
||||
|
||||
// Exact matches get highest score
|
||||
if (lowerText.contains(" " + tagName + " ") || lowerText.startsWith(tagName + " ") || lowerText.endsWith(" " + tagName)) {
|
||||
score += 0.8;
|
||||
}
|
||||
|
||||
// Partial matches in title get high score
|
||||
if (title != null && title.toLowerCase().contains(tagName)) {
|
||||
score += 0.6;
|
||||
}
|
||||
|
||||
// Partial matches in summary get medium score
|
||||
if (summary != null && summary.toLowerCase().contains(tagName)) {
|
||||
score += 0.4;
|
||||
}
|
||||
|
||||
// Word-based matching (split tag name and look for individual words)
|
||||
String[] tagWords = tagName.split("[\\s-_]+");
|
||||
int matchedWords = 0;
|
||||
for (String word : tagWords) {
|
||||
if (word.length() > 2 && lowerText.contains(word)) {
|
||||
matchedWords++;
|
||||
}
|
||||
}
|
||||
if (tagWords.length > 0) {
|
||||
score += 0.3 * ((double) matchedWords / tagWords.length);
|
||||
}
|
||||
|
||||
// Boost score based on tag popularity (more used tags are more likely to be relevant)
|
||||
int storyCount = tag.getStories() != null ? tag.getStories().size() : 0;
|
||||
if (storyCount > 0) {
|
||||
score += Math.min(0.2, storyCount * 0.01); // Small boost, capped at 0.2
|
||||
}
|
||||
|
||||
return Math.min(1.0, score); // Cap at 1.0
|
||||
}
|
||||
|
||||
private String generateReason(Tag tag, String lowerText, String title, String summary) {
|
||||
String tagName = tag.getName().toLowerCase();
|
||||
|
||||
if (title != null && title.toLowerCase().contains(tagName)) {
|
||||
return "Found in title";
|
||||
}
|
||||
|
||||
if (summary != null && summary.toLowerCase().contains(tagName)) {
|
||||
return "Found in summary";
|
||||
}
|
||||
|
||||
if (lowerText.contains(" " + tagName + " ") || lowerText.startsWith(tagName + " ") || lowerText.endsWith(" " + tagName)) {
|
||||
return "Exact match in content";
|
||||
}
|
||||
|
||||
String[] tagWords = tagName.split("[\\s-_]+");
|
||||
for (String word : tagWords) {
|
||||
if (word.length() > 2 && lowerText.contains(word)) {
|
||||
return "Related keywords found";
|
||||
}
|
||||
}
|
||||
|
||||
return "Similar content";
|
||||
}
|
||||
|
||||
private String stripHtml(String html) {
|
||||
if (html == null) return "";
|
||||
// Simple HTML tag removal - replace with a proper HTML parser if needed
|
||||
return html.replaceAll("<[^>]+>", " ").replaceAll("\\s+", " ").trim();
|
||||
}
|
||||
}
|
||||
File diff suppressed because it is too large
@@ -3,35 +3,64 @@ package com.storycove.util;
|
||||
import io.jsonwebtoken.Claims;
|
||||
import io.jsonwebtoken.Jwts;
|
||||
import io.jsonwebtoken.security.Keys;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.stereotype.Component;
|
||||
|
||||
import jakarta.annotation.PostConstruct;
|
||||
import javax.crypto.SecretKey;
|
||||
import java.security.SecureRandom;
|
||||
import java.util.Base64;
|
||||
import java.util.Date;
|
||||
|
||||
@Component
|
||||
public class JwtUtil {
|
||||
|
||||
@Value("${storycove.jwt.secret}")
|
||||
private static final Logger logger = LoggerFactory.getLogger(JwtUtil.class);
|
||||
|
||||
// Security: Generate new secret on each startup to invalidate all existing tokens
|
||||
private String secret;
|
||||
|
||||
@Value("${storycove.jwt.expiration:86400000}") // 24 hours default
|
||||
private Long expiration;
|
||||
|
||||
@PostConstruct
|
||||
public void initialize() {
|
||||
// Generate a new random secret on startup to invalidate all existing JWT tokens
|
||||
// This ensures users must re-authenticate after application restart
|
||||
SecureRandom random = new SecureRandom();
|
||||
byte[] secretBytes = new byte[64]; // 512 bits
|
||||
random.nextBytes(secretBytes);
|
||||
this.secret = Base64.getEncoder().encodeToString(secretBytes);
|
||||
|
||||
logger.info("JWT secret rotated on startup - all existing tokens invalidated");
|
||||
logger.info("Users will need to re-authenticate after application restart for security");
|
||||
}
|
||||
|
||||
private SecretKey getSigningKey() {
|
||||
return Keys.hmacShaKeyFor(secret.getBytes());
|
||||
}
|
||||
|
||||
public String generateToken() {
|
||||
return generateToken("user", null);
|
||||
}
|
||||
|
||||
public String generateToken(String subject, String libraryId) {
|
||||
Date now = new Date();
|
||||
Date expiryDate = new Date(now.getTime() + expiration);
|
||||
|
||||
return Jwts.builder()
|
||||
.subject("user")
|
||||
var builder = Jwts.builder()
|
||||
.subject(subject)
|
||||
.issuedAt(now)
|
||||
.expiration(expiryDate)
|
||||
.signWith(getSigningKey())
|
||||
.compact();
|
||||
.expiration(expiryDate);
|
||||
|
||||
// Add library context if provided
|
||||
if (libraryId != null) {
|
||||
builder.claim("libraryId", libraryId);
|
||||
}
|
||||
|
||||
return builder.signWith(getSigningKey()).compact();
|
||||
}
|
||||
|
||||
public boolean validateToken(String token) {
|
||||
@@ -62,4 +91,13 @@ public class JwtUtil {
|
||||
public String getSubjectFromToken(String token) {
|
||||
return getClaimsFromToken(token).getSubject();
|
||||
}
|
||||
|
||||
public String getLibraryIdFromToken(String token) {
|
||||
try {
|
||||
Claims claims = getClaimsFromToken(token);
|
||||
return claims.get("libraryId", String.class);
|
||||
} catch (Exception e) {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
}
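A small sketch of the round trip: issue a token that carries the library claim, then read the claim back when routing a later request (the library id is an example value):

```java
package com.storycove.example;

import com.storycove.util.JwtUtil;

// Illustrative round trip through JwtUtil; "library-main" is an example id.
public class JwtLibraryClaimSketch {

    public void demo(JwtUtil jwtUtil) {
        // Signed with the secret generated at startup, so the token stops
        // validating after the application restarts.
        String token = jwtUtil.generateToken("user", "library-main");

        if (jwtUtil.validateToken(token)) {
            String subject = jwtUtil.getSubjectFromToken(token);       // "user"
            String libraryId = jwtUtil.getLibraryIdFromToken(token);   // "library-main"
            System.out.printf("subject=%s, library=%s%n", subject, libraryId);
        }
    }
}
```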
|
||||
@@ -16,8 +16,14 @@ spring:
|
||||
|
||||
servlet:
|
||||
multipart:
|
||||
max-file-size: 10MB # Reduced for security (was 250MB)
|
||||
max-request-size: 15MB # Slightly higher to account for form data
|
||||
max-file-size: 256MB # Increased for backup restore
|
||||
max-request-size: 260MB # Slightly higher to account for form data
|
||||
|
||||
jackson:
|
||||
serialization:
|
||||
write-dates-as-timestamps: false
|
||||
deserialization:
|
||||
adjust-dates-to-context-time-zone: false
|
||||
|
||||
server:
|
||||
port: 8080
|
||||
@@ -32,15 +38,71 @@ storycove:
|
||||
expiration: 86400000 # 24 hours
|
||||
auth:
|
||||
password: ${APP_PASSWORD} # REQUIRED: No default password for security
|
||||
typesense:
|
||||
api-key: ${TYPESENSE_API_KEY:xyz}
|
||||
host: ${TYPESENSE_HOST:localhost}
|
||||
port: ${TYPESENSE_PORT:8108}
|
||||
enabled: ${TYPESENSE_ENABLED:true}
|
||||
reindex-interval: ${TYPESENSE_REINDEX_INTERVAL:3600000} # 1 hour in milliseconds
|
||||
search:
|
||||
engine: opensearch # OpenSearch is the only search engine
|
||||
opensearch:
|
||||
# Connection settings
|
||||
host: ${OPENSEARCH_HOST:localhost}
|
||||
port: ${OPENSEARCH_PORT:9200}
|
||||
scheme: ${OPENSEARCH_SCHEME:http}
|
||||
username: ${OPENSEARCH_USERNAME:}
|
||||
password: ${OPENSEARCH_PASSWORD:} # Empty when security is disabled
|
||||
|
||||
# Environment-specific configuration
|
||||
profile: ${SPRING_PROFILES_ACTIVE:development} # development, staging, production
|
||||
|
||||
# Security settings
|
||||
security:
|
||||
ssl-verification: ${OPENSEARCH_SSL_VERIFICATION:false}
|
||||
trust-all-certificates: ${OPENSEARCH_TRUST_ALL_CERTS:true}
|
||||
keystore-path: ${OPENSEARCH_KEYSTORE_PATH:}
|
||||
keystore-password: ${OPENSEARCH_KEYSTORE_PASSWORD:}
|
||||
truststore-path: ${OPENSEARCH_TRUSTSTORE_PATH:}
|
||||
truststore-password: ${OPENSEARCH_TRUSTSTORE_PASSWORD:}
|
||||
|
||||
# Connection pool settings
|
||||
connection:
|
||||
timeout: ${OPENSEARCH_CONNECTION_TIMEOUT:30000} # 30 seconds
|
||||
socket-timeout: ${OPENSEARCH_SOCKET_TIMEOUT:60000} # 60 seconds
|
||||
max-connections-per-route: ${OPENSEARCH_MAX_CONN_PER_ROUTE:10}
|
||||
max-connections-total: ${OPENSEARCH_MAX_CONN_TOTAL:30}
|
||||
retry-on-failure: ${OPENSEARCH_RETRY_ON_FAILURE:true}
|
||||
max-retries: ${OPENSEARCH_MAX_RETRIES:3}
|
||||
|
||||
# Index settings
|
||||
indices:
|
||||
default-shards: ${OPENSEARCH_DEFAULT_SHARDS:1}
|
||||
default-replicas: ${OPENSEARCH_DEFAULT_REPLICAS:0}
|
||||
refresh-interval: ${OPENSEARCH_REFRESH_INTERVAL:1s}
|
||||
|
||||
# Bulk operations
|
||||
bulk:
|
||||
actions: ${OPENSEARCH_BULK_ACTIONS:1000}
|
||||
size: ${OPENSEARCH_BULK_SIZE:5242880} # 5MB
|
||||
timeout: ${OPENSEARCH_BULK_TIMEOUT:10000} # 10 seconds
|
||||
concurrent-requests: ${OPENSEARCH_BULK_CONCURRENT:1}
|
||||
|
||||
# Health and monitoring
|
||||
health:
|
||||
check-interval: ${OPENSEARCH_HEALTH_CHECK_INTERVAL:30000} # 30 seconds
|
||||
slow-query-threshold: ${OPENSEARCH_SLOW_QUERY_THRESHOLD:5000} # 5 seconds
|
||||
enable-metrics: ${OPENSEARCH_ENABLE_METRICS:true}
|
||||
images:
|
||||
storage-path: ${IMAGE_STORAGE_PATH:/app/images}
|
||||
|
||||
management:
|
||||
endpoints:
|
||||
web:
|
||||
exposure:
|
||||
include: health,info,prometheus
|
||||
endpoint:
|
||||
health:
|
||||
show-details: when-authorized
|
||||
show-components: always
|
||||
health:
|
||||
opensearch:
|
||||
enabled: ${OPENSEARCH_HEALTH_ENABLED:true}
|
||||
|
||||
logging:
|
||||
level:
|
||||
com.storycove: ${LOG_LEVEL:INFO} # Use INFO for production, DEBUG for development
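Every `${...}` placeholder above resolves from the environment, so deployments can tune the cluster without editing the YAML. A hedged example of production-style overrides; the values simply mirror the production settings described elsewhere in this migration:

```bash
# Example environment overrides; names match the ${...} placeholders above.
export SPRING_PROFILES_ACTIVE=production
export OPENSEARCH_SCHEME=https
export OPENSEARCH_SSL_VERIFICATION=true
export OPENSEARCH_TRUST_ALL_CERTS=false
export OPENSEARCH_DEFAULT_SHARDS=3
export OPENSEARCH_DEFAULT_REPLICAS=1
export OPENSEARCH_BULK_CONCURRENT=4
```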
|
||||
|
||||
@@ -4,7 +4,7 @@
|
||||
"b", "strong", "i", "em", "u", "s", "strike", "del", "ins",
|
||||
"sup", "sub", "small", "big", "mark", "pre", "code", "kbd", "samp", "var",
|
||||
"ul", "ol", "li", "dl", "dt", "dd",
|
||||
"a", "table", "thead", "tbody", "tfoot", "tr", "th", "td", "caption", "colgroup", "col",
|
||||
"a", "img", "table", "thead", "tbody", "tfoot", "tr", "th", "td", "caption", "colgroup", "col",
|
||||
"blockquote", "cite", "q", "hr", "details", "summary"
|
||||
],
|
||||
"allowedAttributes": {
|
||||
@@ -18,6 +18,7 @@
|
||||
"h5": ["class", "style"],
|
||||
"h6": ["class", "style"],
|
||||
"a": ["class", "href", "title"],
|
||||
"img": ["src", "alt", "width", "height", "class", "style"],
|
||||
"table": ["class", "style"],
|
||||
"th": ["class", "style", "colspan", "rowspan"],
|
||||
"td": ["class", "style", "colspan", "rowspan"],
|
||||
@@ -41,6 +42,9 @@
|
||||
"allowedProtocols": {
|
||||
"a": {
|
||||
"href": ["http", "https", "#", "/"]
|
||||
},
|
||||
"img": {
|
||||
"src": ["http", "https", "data", "/", "cid"]
|
||||
}
|
||||
},
|
||||
"description": "HTML sanitization configuration for StoryCove story content. This configuration is shared between frontend (DOMPurify) and backend (Jsoup) to ensure consistency."
|
||||
|
||||
backend/src/main/resources/opensearch/README.md (new file, 178 lines)
@@ -0,0 +1,178 @@
|
||||
# OpenSearch Configuration - Best Practices Implementation
|
||||
|
||||
## Overview
|
||||
|
||||
This directory contains a production-ready OpenSearch configuration following industry best practices for security, scalability, and maintainability.
|
||||
|
||||
## Architecture
|
||||
|
||||
### 📁 Directory Structure
|
||||
```
opensearch/
├── config/
│   ├── opensearch-development.yml   # Development-specific settings
│   └── opensearch-production.yml    # Production-specific settings
├── mappings/
│   ├── stories-mapping.json         # Story index mapping
│   ├── authors-mapping.json         # Author index mapping
│   └── collections-mapping.json     # Collection index mapping
├── templates/
│   ├── stories-template.json        # Index template for stories_*
│   └── index-lifecycle-policy.json  # ILM policy for index management
└── README.md                        # This file
```
|
||||
|
||||
## ✅ Best Practices Implemented
|
||||
|
||||
### 🔒 **Security**
|
||||
- **Environment-Aware SSL Configuration**
|
||||
- Production: Full certificate validation with custom truststore support
|
||||
- Development: Optional certificate validation for local development
|
||||
- **Proper Authentication**: Basic auth with secure credential management
|
||||
- **Connection Security**: TLS 1.3 support with hostname verification
|
||||
|
||||
### 🏗️ **Configuration Management**
|
||||
- **Externalized Configuration**: JSON/YAML files instead of hardcoded values
|
||||
- **Environment-Specific Settings**: Different configs for dev/staging/prod
|
||||
- **Type-Safe Properties**: Strongly-typed configuration classes
|
||||
- **Validation**: Configuration validation at startup
|
||||
|
||||
### 📈 **Scalability & Performance**
|
||||
- **Connection Pooling**: Configurable connection pool with timeout management
|
||||
- **Environment-Aware Sharding**:
|
||||
- Development: 1 shard, 0 replicas (single node)
|
||||
- Production: 3 shards, 1 replica (high availability)
|
||||
- **Bulk Operations**: Optimized bulk indexing with configurable batch sizes
|
||||
- **Index Templates**: Automatic application of settings to new indexes
|
||||
|
||||
### 🔄 **Index Lifecycle Management**
|
||||
- **Automated Index Rollover**: Based on size, document count, and age
|
||||
- **Hot-Warm-Cold Architecture**: Optimized storage costs
|
||||
- **Retention Policies**: Automatic cleanup of old data
|
||||
- **Force Merge**: Optimization in warm phase
|
||||
|
||||
### 📊 **Monitoring & Observability**
|
||||
- **Health Checks**: Automatic cluster health monitoring
|
||||
- **Spring Boot Actuator**: Health endpoints for monitoring systems
|
||||
- **Metrics Collection**: Configurable performance metrics
|
||||
- **Slow Query Detection**: Configurable thresholds for query performance
|
||||
|
||||
### 🛡️ **Error Handling & Resilience**
- **Connection Retry Logic**: Automatic retry with backoff
- **Circuit Breaker Pattern**: Fail-fast for unhealthy clusters
- **Graceful Degradation**: Search features degrade cleanly when OpenSearch is unavailable
- **Detailed Error Logging**: Comprehensive error tracking
|
||||
|
||||
## 🚀 Usage
|
||||
|
||||
### Development Environment
|
||||
```yaml
|
||||
# application-development.yml
|
||||
storycove:
|
||||
opensearch:
|
||||
profile: development
|
||||
security:
|
||||
ssl-verification: false
|
||||
trust-all-certificates: true
|
||||
indices:
|
||||
default-shards: 1
|
||||
default-replicas: 0
|
||||
```
|
||||
|
||||
### Production Environment
|
||||
```yaml
|
||||
# application-production.yml
|
||||
storycove:
|
||||
opensearch:
|
||||
profile: production
|
||||
security:
|
||||
ssl-verification: true
|
||||
trust-all-certificates: false
|
||||
truststore-path: /etc/ssl/opensearch-truststore.jks
|
||||
indices:
|
||||
default-shards: 3
|
||||
default-replicas: 1
|
||||
```
|
||||
|
||||
## 📋 Environment Variables

### Required
- `OPENSEARCH_PASSWORD`: Admin password for the OpenSearch cluster

### Optional (with sensible defaults)
- `OPENSEARCH_HOST`: Cluster hostname (default: localhost)
- `OPENSEARCH_PORT`: Cluster port (default: 9200)
- `OPENSEARCH_USERNAME`: Admin username (default: admin)
- `OPENSEARCH_SSL_VERIFICATION`: Enable SSL verification (default: false for dev)
- `OPENSEARCH_MAX_CONN_TOTAL`: Max connections (default: 30 for dev, 200 for prod)
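A quick way to verify the values before starting the application is to hit the cluster health endpoint with the same credentials; the host and password below are placeholders:

```bash
# Connectivity check using the variables listed above (example values).
export OPENSEARCH_HOST=localhost
export OPENSEARCH_PORT=9200
export OPENSEARCH_USERNAME=admin
export OPENSEARCH_PASSWORD='replace_me'

curl -s -u "$OPENSEARCH_USERNAME:$OPENSEARCH_PASSWORD" \
  "http://$OPENSEARCH_HOST:$OPENSEARCH_PORT/_cluster/health?pretty"
```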
|
||||
|
||||
## 🎯 Index Templates
|
||||
|
||||
Index templates automatically apply configuration to new indexes:
|
||||
|
||||
```json
|
||||
{
|
||||
"index_patterns": ["stories_*"],
|
||||
"template": {
|
||||
"settings": {
|
||||
"number_of_shards": "#{ENV_SPECIFIC}",
|
||||
"analysis": {
|
||||
"analyzer": {
|
||||
"story_analyzer": {
|
||||
"type": "standard",
|
||||
"stopwords": "_english_"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## 🔍 Health Monitoring

Access health information:
- **Application Health**: `/actuator/health`
- **OpenSearch Specific**: `/actuator/health/opensearch`
- **Detailed Metrics**: Available when `enable-metrics: true`
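For example, with the backend on its default port 8080, the endpoints can be queried directly (details may be hidden unless the request is authorized, per `show-details: when-authorized`):

```bash
# Overall application health, including the OpenSearch indicator when enabled
curl -s http://localhost:8080/actuator/health

# OpenSearch component only
curl -s http://localhost:8080/actuator/health/opensearch
```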
|
||||
|
||||
## 🔄 Deployment Strategy
|
||||
|
||||
Recommended deployment approach:
|
||||
|
||||
1. **Development**: Test OpenSearch configuration locally
|
||||
2. **Staging**: Validate performance and accuracy in staging environment
|
||||
3. **Production**: Deploy with proper monitoring and backup procedures
|
||||
|
||||
## 🛠️ Troubleshooting
|
||||
|
||||
### Common Issues
|
||||
|
||||
1. **SSL Certificate Errors**
|
||||
- Development: Set `trust-all-certificates: true`
|
||||
- Production: Provide valid truststore path
|
||||
|
||||
2. **Connection Timeouts**
|
||||
- Increase `connection.timeout` values
|
||||
- Check network connectivity and firewall rules
|
||||
|
||||
3. **Index Creation Failures**
|
||||
- Verify cluster health with `/actuator/health/opensearch`
|
||||
- Check OpenSearch logs for detailed error messages
|
||||
|
||||
4. **Performance Issues**
|
||||
- Monitor slow queries with configurable thresholds
|
||||
- Adjust bulk operation settings
|
||||
- Review shard allocation and replica settings
|
||||
|
||||
## 🔮 Future Enhancements
|
||||
|
||||
- **Multi-Cluster Support**: Connect to multiple OpenSearch clusters
|
||||
- **Advanced Security**: Integration with OpenSearch Security plugin
|
||||
- **Custom Analyzers**: Domain-specific text analysis
|
||||
- **Index Aliases**: Zero-downtime index updates
|
||||
- **Machine Learning**: Integration with OpenSearch ML features
|
||||
|
||||
---
|
||||
|
||||
This configuration provides a solid foundation that scales from development to enterprise production environments while maintaining security, performance, and operational excellence.
|
||||
@@ -0,0 +1,32 @@
|
||||
# OpenSearch Development Configuration
|
||||
opensearch:
|
||||
cluster:
|
||||
name: "storycove-dev"
|
||||
initial_master_nodes: ["opensearch-node"]
|
||||
|
||||
# Development settings - single node, minimal resources
|
||||
indices:
|
||||
default_settings:
|
||||
number_of_shards: 1
|
||||
number_of_replicas: 0
|
||||
refresh_interval: "1s"
|
||||
|
||||
# Security settings for development
|
||||
security:
|
||||
ssl_verification: false
|
||||
trust_all_certificates: true
|
||||
|
||||
# Connection settings
|
||||
connection:
|
||||
timeout: "30s"
|
||||
socket_timeout: "60s"
|
||||
max_connections_per_route: 10
|
||||
max_connections_total: 30
|
||||
|
||||
# Index management
|
||||
index_management:
|
||||
auto_create_templates: true
|
||||
template_patterns:
|
||||
stories: "stories_*"
|
||||
authors: "authors_*"
|
||||
collections: "collections_*"
|
||||
@@ -0,0 +1,60 @@
# OpenSearch Production Configuration
opensearch:
  cluster:
    name: "storycove-prod"

  # Production settings - multi-shard, with replicas
  indices:
    default_settings:
      number_of_shards: 3
      number_of_replicas: 1
      refresh_interval: "30s"
      max_result_window: 50000

    # Index lifecycle policies
    lifecycle:
      hot_phase_duration: "7d"
      warm_phase_duration: "30d"
      cold_phase_duration: "90d"
      delete_after: "1y"

  # Security settings for production
  security:
    ssl_verification: true
    trust_all_certificates: false
    certificate_verification: true
    tls_version: "TLSv1.3"

  # Connection settings
  connection:
    timeout: "10s"
    socket_timeout: "30s"
    max_connections_per_route: 50
    max_connections_total: 200
    retry_on_failure: true
    max_retries: 3
    retry_delay: "1s"

  # Performance tuning
  performance:
    bulk_actions: 1000
    bulk_size: "5MB"
    bulk_timeout: "10s"
    concurrent_requests: 4

  # Monitoring and observability
  monitoring:
    health_check_interval: "30s"
    slow_query_threshold: "5s"
    enable_metrics: true

  # Index management
  index_management:
    auto_create_templates: true
    template_patterns:
      stories: "stories_*"
      authors: "authors_*"
      collections: "collections_*"
    retention_policy:
      enabled: true
      default_retention: "1y"
@@ -0,0 +1,79 @@
|
||||
{
|
||||
"settings": {
|
||||
"number_of_shards": 1,
|
||||
"number_of_replicas": 0,
|
||||
"analysis": {
|
||||
"analyzer": {
|
||||
"name_analyzer": {
|
||||
"type": "standard",
|
||||
"stopwords": "_english_"
|
||||
},
|
||||
"autocomplete_analyzer": {
|
||||
"type": "custom",
|
||||
"tokenizer": "standard",
|
||||
"filter": ["lowercase", "edge_ngram"]
|
||||
}
|
||||
},
|
||||
"filter": {
|
||||
"edge_ngram": {
|
||||
"type": "edge_ngram",
|
||||
"min_gram": 2,
|
||||
"max_gram": 20
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"mappings": {
|
||||
"properties": {
|
||||
"id": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"name": {
|
||||
"type": "text",
|
||||
"analyzer": "name_analyzer",
|
||||
"fields": {
|
||||
"autocomplete": {
|
||||
"type": "text",
|
||||
"analyzer": "autocomplete_analyzer"
|
||||
},
|
||||
"keyword": {
|
||||
"type": "keyword"
|
||||
}
|
||||
}
|
||||
},
|
||||
"bio": {
|
||||
"type": "text",
|
||||
"analyzer": "name_analyzer"
|
||||
},
|
||||
"urls": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"imageUrl": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"storyCount": {
|
||||
"type": "integer"
|
||||
},
|
||||
"averageRating": {
|
||||
"type": "float"
|
||||
},
|
||||
"totalWordCount": {
|
||||
"type": "long"
|
||||
},
|
||||
"totalReadingTime": {
|
||||
"type": "integer"
|
||||
},
|
||||
"createdAt": {
|
||||
"type": "date",
|
||||
"format": "strict_date_optional_time||epoch_millis"
|
||||
},
|
||||
"updatedAt": {
|
||||
"type": "date",
|
||||
"format": "strict_date_optional_time||epoch_millis"
|
||||
},
|
||||
"libraryId": {
|
||||
"type": "keyword"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,73 @@
|
||||
{
|
||||
"settings": {
|
||||
"number_of_shards": 1,
|
||||
"number_of_replicas": 0,
|
||||
"analysis": {
|
||||
"analyzer": {
|
||||
"collection_analyzer": {
|
||||
"type": "standard",
|
||||
"stopwords": "_english_"
|
||||
},
|
||||
"autocomplete_analyzer": {
|
||||
"type": "custom",
|
||||
"tokenizer": "standard",
|
||||
"filter": ["lowercase", "edge_ngram"]
|
||||
}
|
||||
},
|
||||
"filter": {
|
||||
"edge_ngram": {
|
||||
"type": "edge_ngram",
|
||||
"min_gram": 2,
|
||||
"max_gram": 20
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"mappings": {
|
||||
"properties": {
|
||||
"id": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"name": {
|
||||
"type": "text",
|
||||
"analyzer": "collection_analyzer",
|
||||
"fields": {
|
||||
"autocomplete": {
|
||||
"type": "text",
|
||||
"analyzer": "autocomplete_analyzer"
|
||||
},
|
||||
"keyword": {
|
||||
"type": "keyword"
|
||||
}
|
||||
}
|
||||
},
|
||||
"description": {
|
||||
"type": "text",
|
||||
"analyzer": "collection_analyzer"
|
||||
},
|
||||
"storyCount": {
|
||||
"type": "integer"
|
||||
},
|
||||
"totalWordCount": {
|
||||
"type": "long"
|
||||
},
|
||||
"averageRating": {
|
||||
"type": "float"
|
||||
},
|
||||
"isPublic": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"createdAt": {
|
||||
"type": "date",
|
||||
"format": "strict_date_optional_time||epoch_millis"
|
||||
},
|
||||
"updatedAt": {
|
||||
"type": "date",
|
||||
"format": "strict_date_optional_time||epoch_millis"
|
||||
},
|
||||
"libraryId": {
|
||||
"type": "keyword"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,120 @@
|
||||
{
|
||||
"settings": {
|
||||
"number_of_shards": 1,
|
||||
"number_of_replicas": 0,
|
||||
"analysis": {
|
||||
"analyzer": {
|
||||
"story_analyzer": {
|
||||
"type": "standard",
|
||||
"stopwords": "_english_"
|
||||
},
|
||||
"autocomplete_analyzer": {
|
||||
"type": "custom",
|
||||
"tokenizer": "standard",
|
||||
"filter": ["lowercase", "edge_ngram"]
|
||||
}
|
||||
},
|
||||
"filter": {
|
||||
"edge_ngram": {
|
||||
"type": "edge_ngram",
|
||||
"min_gram": 2,
|
||||
"max_gram": 20
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"mappings": {
|
||||
"properties": {
|
||||
"id": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"title": {
|
||||
"type": "text",
|
||||
"analyzer": "story_analyzer",
|
||||
"fields": {
|
||||
"autocomplete": {
|
||||
"type": "text",
|
||||
"analyzer": "autocomplete_analyzer"
|
||||
},
|
||||
"keyword": {
|
||||
"type": "keyword"
|
||||
}
|
||||
}
|
||||
},
|
||||
"content": {
|
||||
"type": "text",
|
||||
"analyzer": "story_analyzer"
|
||||
},
|
||||
"summary": {
|
||||
"type": "text",
|
||||
"analyzer": "story_analyzer"
|
||||
},
|
||||
"authorNames": {
|
||||
"type": "text",
|
||||
"analyzer": "story_analyzer",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword"
|
||||
}
|
||||
}
|
||||
},
|
||||
"authorIds": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"tagNames": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"seriesTitle": {
|
||||
"type": "text",
|
||||
"analyzer": "story_analyzer",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword"
|
||||
}
|
||||
}
|
||||
},
|
||||
"seriesId": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"wordCount": {
|
||||
"type": "integer"
|
||||
},
|
||||
"rating": {
|
||||
"type": "float"
|
||||
},
|
||||
"readingTime": {
|
||||
"type": "integer"
|
||||
},
|
||||
"language": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"status": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"createdAt": {
|
||||
"type": "date",
|
||||
"format": "strict_date_optional_time||epoch_millis"
|
||||
},
|
||||
"updatedAt": {
|
||||
"type": "date",
|
||||
"format": "strict_date_optional_time||epoch_millis"
|
||||
},
|
||||
"publishedAt": {
|
||||
"type": "date",
|
||||
"format": "strict_date_optional_time||epoch_millis"
|
||||
},
|
||||
"isRead": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"isFavorite": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"readingProgress": {
|
||||
"type": "float"
|
||||
},
|
||||
"libraryId": {
|
||||
"type": "keyword"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
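The `autocomplete` subfields in the stories mapping above (edge_ngram tokens from 2 to 20 characters) are what a typeahead endpoint would query. As a hedged illustration, the snippet below sends a simple match query against `title.autocomplete` with the JDK HTTP client; the index name `stories_default`, the host, and the sample text are assumptions, not taken from the migration code.

```java
// Illustrative autocomplete lookup against the stories index defined above.
// Index name, host, and query text are assumptions for the sake of the example.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AutocompleteExample {
    public static void main(String[] args) throws Exception {
        String query = """
                {
                  "size": 5,
                  "query": { "match": { "title.autocomplete": "drag" } },
                  "_source": ["id", "title"]
                }
                """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9200/stories_default/_search"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(query))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // Hits are ranked by edge_ngram matches on the title field
        System.out.println(response.body());
    }
}
```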
@@ -0,0 +1,77 @@
|
||||
{
|
||||
"policy": {
|
||||
"description": "StoryCove index lifecycle policy",
|
||||
"default_state": "hot",
|
||||
"states": [
|
||||
{
|
||||
"name": "hot",
|
||||
"actions": [
|
||||
{
|
||||
"rollover": {
|
||||
"min_size": "50gb",
|
||||
"min_doc_count": 1000000,
|
||||
"min_age": "7d"
|
||||
}
|
||||
}
|
||||
],
|
||||
"transitions": [
|
||||
{
|
||||
"state_name": "warm",
|
||||
"conditions": {
|
||||
"min_age": "7d"
|
||||
}
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"name": "warm",
|
||||
"actions": [
|
||||
{
|
||||
"replica_count": {
|
||||
"number_of_replicas": 0
|
||||
}
|
||||
},
|
||||
{
|
||||
"force_merge": {
|
||||
"max_num_segments": 1
|
||||
}
|
||||
}
|
||||
],
|
||||
"transitions": [
|
||||
{
|
||||
"state_name": "cold",
|
||||
"conditions": {
|
||||
"min_age": "30d"
|
||||
}
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"name": "cold",
|
||||
"actions": [],
|
||||
"transitions": [
|
||||
{
|
||||
"state_name": "delete",
|
||||
"conditions": {
|
||||
"min_age": "365d"
|
||||
}
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"name": "delete",
|
||||
"actions": [
|
||||
{
|
||||
"delete": {}
|
||||
}
|
||||
]
|
||||
}
|
||||
],
|
||||
"ism_template": [
|
||||
{
|
||||
"index_patterns": ["stories_*", "authors_*", "collections_*"],
|
||||
"priority": 100
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,124 @@
|
||||
{
|
||||
"index_patterns": ["stories_*"],
|
||||
"priority": 1,
|
||||
"template": {
|
||||
"settings": {
|
||||
"number_of_shards": 1,
|
||||
"number_of_replicas": 0,
|
||||
"analysis": {
|
||||
"analyzer": {
|
||||
"story_analyzer": {
|
||||
"type": "standard",
|
||||
"stopwords": "_english_"
|
||||
},
|
||||
"autocomplete_analyzer": {
|
||||
"type": "custom",
|
||||
"tokenizer": "standard",
|
||||
"filter": ["lowercase", "edge_ngram"]
|
||||
}
|
||||
},
|
||||
"filter": {
|
||||
"edge_ngram": {
|
||||
"type": "edge_ngram",
|
||||
"min_gram": 2,
|
||||
"max_gram": 20
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"mappings": {
|
||||
"properties": {
|
||||
"id": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"title": {
|
||||
"type": "text",
|
||||
"analyzer": "story_analyzer",
|
||||
"fields": {
|
||||
"autocomplete": {
|
||||
"type": "text",
|
||||
"analyzer": "autocomplete_analyzer"
|
||||
},
|
||||
"keyword": {
|
||||
"type": "keyword"
|
||||
}
|
||||
}
|
||||
},
|
||||
"content": {
|
||||
"type": "text",
|
||||
"analyzer": "story_analyzer"
|
||||
},
|
||||
"summary": {
|
||||
"type": "text",
|
||||
"analyzer": "story_analyzer"
|
||||
},
|
||||
"authorNames": {
|
||||
"type": "text",
|
||||
"analyzer": "story_analyzer",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword"
|
||||
}
|
||||
}
|
||||
},
|
||||
"authorIds": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"tagNames": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"seriesTitle": {
|
||||
"type": "text",
|
||||
"analyzer": "story_analyzer",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword"
|
||||
}
|
||||
}
|
||||
},
|
||||
"seriesId": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"wordCount": {
|
||||
"type": "integer"
|
||||
},
|
||||
"rating": {
|
||||
"type": "float"
|
||||
},
|
||||
"readingTime": {
|
||||
"type": "integer"
|
||||
},
|
||||
"language": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"status": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"createdAt": {
|
||||
"type": "date",
|
||||
"format": "strict_date_optional_time||epoch_millis"
|
||||
},
|
||||
"updatedAt": {
|
||||
"type": "date",
|
||||
"format": "strict_date_optional_time||epoch_millis"
|
||||
},
|
||||
"publishedAt": {
|
||||
"type": "date",
|
||||
"format": "strict_date_optional_time||epoch_millis"
|
||||
},
|
||||
"isRead": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"isFavorite": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"readingProgress": {
|
||||
"type": "float"
|
||||
},
|
||||
"libraryId": {
|
||||
"type": "keyword"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,12 +1,8 @@
|
||||
package com.storycove.config;
|
||||
|
||||
import com.storycove.service.TypesenseService;
|
||||
import org.springframework.boot.test.context.TestConfiguration;
|
||||
import org.springframework.boot.test.mock.mockito.MockBean;
|
||||
|
||||
@TestConfiguration
|
||||
public class TestConfig {
|
||||
|
||||
@MockBean
|
||||
public TypesenseService typesenseService;
|
||||
// Test configuration
|
||||
}
|
||||
@@ -15,10 +15,12 @@ public abstract class BaseRepositoryTest {
|
||||
private static final PostgreSQLContainer<?> postgres;
|
||||
|
||||
static {
|
||||
postgres = new PostgreSQLContainer<>("postgres:15-alpine")
|
||||
@SuppressWarnings("resource") // Container is managed by shutdown hook
|
||||
PostgreSQLContainer<?> container = new PostgreSQLContainer<>("postgres:15-alpine")
|
||||
.withDatabaseName("storycove_test")
|
||||
.withUsername("test")
|
||||
.withPassword("test");
|
||||
postgres = container;
|
||||
postgres.start();
|
||||
|
||||
// Add shutdown hook to properly close the container
|
||||
|
||||
@@ -9,7 +9,6 @@ import org.junit.jupiter.api.BeforeEach;
|
||||
import org.junit.jupiter.api.DisplayName;
|
||||
import org.junit.jupiter.api.Test;
|
||||
import org.junit.jupiter.api.extension.ExtendWith;
|
||||
import org.mockito.InjectMocks;
|
||||
import org.mockito.Mock;
|
||||
import org.mockito.junit.jupiter.MockitoExtension;
|
||||
import org.springframework.data.domain.Page;
|
||||
@@ -23,7 +22,6 @@ import java.util.UUID;
|
||||
|
||||
import static org.junit.jupiter.api.Assertions.*;
|
||||
import static org.mockito.ArgumentMatchers.any;
|
||||
import static org.mockito.ArgumentMatchers.anyString;
|
||||
import static org.mockito.Mockito.*;
|
||||
import static org.mockito.Mockito.times;
|
||||
|
||||
@@ -46,8 +44,9 @@ class AuthorServiceTest {
|
||||
testAuthor.setId(testId);
|
||||
testAuthor.setNotes("Test notes");
|
||||
|
||||
// Initialize service with null TypesenseService (which is allowed)
|
||||
authorService = new AuthorService(authorRepository, null);
|
||||
// Initialize service with mock SearchServiceAdapter
|
||||
SearchServiceAdapter mockSearchServiceAdapter = mock(SearchServiceAdapter.class);
|
||||
authorService = new AuthorService(authorRepository, mockSearchServiceAdapter);
|
||||
}
|
||||
|
||||
@Test
|
||||
@@ -176,7 +175,7 @@ class AuthorServiceTest {
|
||||
when(authorRepository.existsByName("Updated Author")).thenReturn(false);
|
||||
when(authorRepository.save(any(Author.class))).thenReturn(testAuthor);
|
||||
|
||||
Author result = authorService.update(testId, updates);
|
||||
authorService.update(testId, updates);
|
||||
|
||||
assertEquals("Updated Author", testAuthor.getName());
|
||||
assertEquals("Updated notes", testAuthor.getNotes());
|
||||
@@ -318,7 +317,7 @@ class AuthorServiceTest {
|
||||
when(authorRepository.findById(testId)).thenReturn(Optional.of(testAuthor));
|
||||
when(authorRepository.save(any(Author.class))).thenReturn(testAuthor);
|
||||
|
||||
Author result = authorService.setRating(testId, 4);
|
||||
authorService.setRating(testId, 4);
|
||||
|
||||
assertEquals(4, testAuthor.getAuthorRating());
|
||||
verify(authorRepository, times(2)).findById(testId); // Called twice: once initially, once after flush
|
||||
@@ -342,7 +341,7 @@ class AuthorServiceTest {
|
||||
when(authorRepository.findById(testId)).thenReturn(Optional.of(testAuthor));
|
||||
when(authorRepository.save(any(Author.class))).thenReturn(testAuthor);
|
||||
|
||||
Author result = authorService.setRating(testId, null);
|
||||
authorService.setRating(testId, null);
|
||||
|
||||
assertNull(testAuthor.getAuthorRating());
|
||||
verify(authorRepository, times(2)).findById(testId); // Called twice: once initially, once after flush
|
||||
|
||||
@@ -33,6 +33,9 @@ class StoryServiceTest {
|
||||
@Mock
|
||||
private ReadingPositionRepository readingPositionRepository;
|
||||
|
||||
@Mock
|
||||
private SearchServiceAdapter searchServiceAdapter;
|
||||
|
||||
private StoryService storyService;
|
||||
private Story testStory;
|
||||
private UUID testId;
|
||||
@@ -44,16 +47,16 @@ class StoryServiceTest {
|
||||
testStory.setId(testId);
|
||||
testStory.setContentHtml("<p>Test content for reading progress tracking</p>");
|
||||
|
||||
// Create StoryService with only required repositories, all services can be null for these tests
|
||||
// Create StoryService with mocked dependencies
|
||||
storyService = new StoryService(
|
||||
storyRepository,
|
||||
tagRepository,
|
||||
readingPositionRepository, // added for foreign key constraint handling
|
||||
readingPositionRepository,
|
||||
null, // authorService - not needed for reading progress tests
|
||||
null, // tagService - not needed for reading progress tests
|
||||
null, // seriesService - not needed for reading progress tests
|
||||
null, // sanitizationService - not needed for reading progress tests
|
||||
null // typesenseService - will test both with and without
|
||||
searchServiceAdapter
|
||||
);
|
||||
}
|
||||
|
||||
|
||||
@@ -18,11 +18,12 @@ storycove:
|
||||
expiration: 86400000
|
||||
auth:
|
||||
password: test-password
|
||||
typesense:
|
||||
enabled: false
|
||||
api-key: test-key
|
||||
search:
|
||||
engine: opensearch
|
||||
opensearch:
|
||||
host: localhost
|
||||
port: 8108
|
||||
port: 9200
|
||||
scheme: http
|
||||
images:
|
||||
storage-path: /tmp/test-images
|
||||
|
||||
|
||||
backend/test_results.log (new file, 4308 lines — diff suppressed because it is too large)
cookies.txt (new file, 5 lines)
@@ -0,0 +1,5 @@
|
||||
# Netscape HTTP Cookie File
|
||||
# https://curl.se/docs/http-cookies.html
|
||||
# This file was generated by libcurl! Edit at your own risk.
|
||||
|
||||
#HttpOnly_localhost FALSE / FALSE 1758433252 token eyJhbGciOiJIUzUxMiJ9.eyJzdWIiOiJ1c2VyIiwiaWF0IjoxNzU4MzQ2ODUyLCJleHAiOjE3NTg0MzMyNTIsImxpYnJhcnlJZCI6InNlY3JldCJ9.zEAQT5_11-pxPxmIhufSQqE26hvHldde4kFNE2HWWgBa5lT_Wt7jwpoPUMkQGQfShQwDZ9N-hFX3R2ew8jD7WQ
|
||||
@@ -34,23 +34,27 @@ services:
|
||||
- SPRING_DATASOURCE_USERNAME=storycove
|
||||
- SPRING_DATASOURCE_PASSWORD=${DB_PASSWORD}
|
||||
- JWT_SECRET=${JWT_SECRET}
|
||||
- TYPESENSE_API_KEY=${TYPESENSE_API_KEY}
|
||||
- TYPESENSE_HOST=typesense
|
||||
- TYPESENSE_PORT=8108
|
||||
- OPENSEARCH_HOST=opensearch
|
||||
- OPENSEARCH_PORT=9200
|
||||
- OPENSEARCH_SCHEME=http
|
||||
- SEARCH_ENGINE=${SEARCH_ENGINE:-opensearch}
|
||||
- IMAGE_STORAGE_PATH=/app/images
|
||||
- APP_PASSWORD=${APP_PASSWORD}
|
||||
- STORYCOVE_CORS_ALLOWED_ORIGINS=${STORYCOVE_CORS_ALLOWED_ORIGINS:-http://localhost:3000,http://localhost:6925}
|
||||
volumes:
|
||||
- images_data:/app/images
|
||||
- library_config:/app/config
|
||||
depends_on:
|
||||
- postgres
|
||||
- typesense
|
||||
- opensearch
|
||||
networks:
|
||||
- storycove-network
|
||||
|
||||
postgres:
|
||||
image: postgres:15-alpine
|
||||
# No port mapping - only accessible within the Docker network
|
||||
#ports:
|
||||
# - "5432:5432"
|
||||
environment:
|
||||
- POSTGRES_DB=storycove
|
||||
- POSTGRES_USER=storycove
|
||||
@@ -60,21 +64,48 @@ services:
|
||||
networks:
|
||||
- storycove-network
|
||||
|
||||
typesense:
|
||||
image: typesense/typesense:0.25.0
|
||||
|
||||
opensearch:
|
||||
image: opensearchproject/opensearch:3.2.0
|
||||
# No port mapping - only accessible within the Docker network
|
||||
environment:
|
||||
- TYPESENSE_API_KEY=${TYPESENSE_API_KEY}
|
||||
- TYPESENSE_DATA_DIR=/data
|
||||
- cluster.name=storycove-opensearch
|
||||
- node.name=opensearch-node
|
||||
- discovery.type=single-node
|
||||
- bootstrap.memory_lock=false
|
||||
- "OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx512m"
|
||||
- "DISABLE_INSTALL_DEMO_CONFIG=true"
|
||||
- "DISABLE_SECURITY_PLUGIN=true"
|
||||
ulimits:
|
||||
memlock:
|
||||
soft: -1
|
||||
hard: -1
|
||||
nofile:
|
||||
soft: 65536
|
||||
hard: 65536
|
||||
volumes:
|
||||
- typesense_data:/data
|
||||
- opensearch_data:/usr/share/opensearch/data
|
||||
networks:
|
||||
- storycove-network
|
||||
restart: unless-stopped
|
||||
|
||||
opensearch-dashboards:
|
||||
image: opensearchproject/opensearch-dashboards:3.2.0
|
||||
ports:
|
||||
- "5601:5601" # Expose OpenSearch Dashboard
|
||||
environment:
|
||||
- OPENSEARCH_HOSTS=http://opensearch:9200
|
||||
- "DISABLE_SECURITY_DASHBOARDS_PLUGIN=true"
|
||||
depends_on:
|
||||
- opensearch
|
||||
networks:
|
||||
- storycove-network
|
||||
|
||||
volumes:
|
||||
postgres_data:
|
||||
typesense_data:
|
||||
opensearch_data:
|
||||
images_data:
|
||||
library_config:
|
||||
|
||||
configs:
|
||||
nginx_config:
|
||||
@@ -91,7 +122,7 @@ configs:
|
||||
}
|
||||
server {
|
||||
listen 80;
|
||||
client_max_body_size 10M;
|
||||
client_max_body_size 256M;
|
||||
location / {
|
||||
proxy_pass http://frontend;
|
||||
proxy_http_version 1.1;
|
||||
@@ -118,13 +149,5 @@ configs:
|
||||
expires 1y;
|
||||
add_header Cache-Control public;
|
||||
}
|
||||
location /typesense/ {
|
||||
proxy_pass http://typesense:8108/;
|
||||
proxy_set_header Host $$host;
|
||||
proxy_set_header X-Real-IP $$remote_addr;
|
||||
proxy_set_header X-Forwarded-For $$proxy_add_x_forwarded_for;
|
||||
proxy_set_header X-Forwarded-Proto $$scheme;
|
||||
proxy_set_header X-Typesense-API-Key $$http_x_typesense_api_key;
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,40 +1,58 @@
|
||||
# Use node 18 alpine for smaller image size
|
||||
FROM node:18-alpine
|
||||
|
||||
# Multi-stage build for better layer caching and smaller final image
|
||||
FROM node:18-alpine AS deps
|
||||
WORKDIR /app
|
||||
|
||||
# Install dumb-init for proper signal handling
|
||||
# Install dumb-init early
|
||||
RUN apk add --no-cache dumb-init
|
||||
|
||||
# Copy package files
|
||||
# Copy package files first to leverage Docker layer caching
|
||||
COPY package*.json ./
|
||||
|
||||
# Install all dependencies (including devDependencies needed for build)
|
||||
# Set npm config for better CI performance
|
||||
RUN npm ci --prefer-offline --no-audit
|
||||
# Install dependencies with optimized settings
|
||||
RUN npm ci --prefer-offline --no-audit --frozen-lockfile
|
||||
|
||||
# Copy source code
|
||||
# Build stage
|
||||
FROM node:18-alpine AS builder
|
||||
WORKDIR /app
|
||||
|
||||
# Copy dependencies from deps stage
|
||||
COPY --from=deps /app/node_modules ./node_modules
|
||||
COPY . .
|
||||
|
||||
# Set Node.js memory limit for build (helpful in constrained environments)
|
||||
# Set Node.js memory limit for build
|
||||
ENV NODE_OPTIONS="--max-old-space-size=1024"
|
||||
ENV NEXT_TELEMETRY_DISABLED=1
|
||||
|
||||
# Build the application
|
||||
RUN npm run build
|
||||
|
||||
# Remove devDependencies after build to reduce image size
|
||||
RUN npm prune --omit=dev
|
||||
# Production stage
|
||||
FROM node:18-alpine AS runner
|
||||
WORKDIR /app
|
||||
|
||||
ENV NODE_ENV=production
|
||||
ENV NEXT_TELEMETRY_DISABLED=1
|
||||
|
||||
# Install dumb-init for proper signal handling
|
||||
RUN apk add --no-cache dumb-init
|
||||
|
||||
# Create non-root user for security
|
||||
RUN addgroup -g 1001 -S nodejs
|
||||
RUN adduser -S nextjs -u 1001
|
||||
|
||||
# Change ownership of the app directory
|
||||
RUN chown -R nextjs:nodejs /app
|
||||
# Copy necessary files from builder stage
|
||||
COPY --from=builder /app/next.config.js* ./
|
||||
COPY --from=builder /app/public ./public
|
||||
COPY --from=builder /app/package.json ./package.json
|
||||
|
||||
# Copy built application
|
||||
COPY --from=builder --chown=nextjs:nodejs /app/.next/standalone ./
|
||||
COPY --from=builder --chown=nextjs:nodejs /app/.next/static ./.next/static
|
||||
|
||||
USER nextjs
|
||||
|
||||
EXPOSE 3000
|
||||
|
||||
# Use dumb-init to handle signals properly
|
||||
ENTRYPOINT ["dumb-init", "--"]
|
||||
CMD ["npm", "start"]
|
||||
CMD ["node", "server.js"]
|
||||
@@ -1,5 +1,7 @@
|
||||
/** @type {import('next').NextConfig} */
|
||||
const nextConfig = {
|
||||
// Enable standalone output for optimized Docker builds
|
||||
output: 'standalone',
|
||||
// Removed Next.js rewrites since nginx handles all API routing
|
||||
webpack: (config, { isServer }) => {
|
||||
// Exclude cheerio and its dependencies from client-side bundling
|
||||
|
||||
frontend/package-lock.json (generated, 12 lines changed)
@@ -10,9 +10,9 @@
|
||||
"dependencies": {
|
||||
"@heroicons/react": "^2.2.0",
|
||||
"autoprefixer": "^10.4.16",
|
||||
"axios": "^1.6.0",
|
||||
"axios": "^1.11.0",
|
||||
"cheerio": "^1.0.0-rc.12",
|
||||
"dompurify": "^3.0.5",
|
||||
"dompurify": "^3.2.6",
|
||||
"next": "14.0.0",
|
||||
"postcss": "^8.4.31",
|
||||
"react": "^18",
|
||||
@@ -1372,13 +1372,13 @@
|
||||
}
|
||||
},
|
||||
"node_modules/axios": {
|
||||
"version": "1.10.0",
|
||||
"resolved": "https://registry.npmjs.org/axios/-/axios-1.10.0.tgz",
|
||||
"integrity": "sha512-/1xYAC4MP/HEG+3duIhFr4ZQXR4sQXOIe+o6sdqzeykGLx6Upp/1p8MHqhINOvGeP7xyNHe7tsiJByc4SSVUxw==",
|
||||
"version": "1.11.0",
|
||||
"resolved": "https://registry.npmjs.org/axios/-/axios-1.11.0.tgz",
|
||||
"integrity": "sha512-1Lx3WLFQWm3ooKDYZD1eXmoGO9fxYQjrycfHFC8P0sCfQVXyROp0p9PFWBehewBOdCwHc+f/b8I0fMto5eSfwA==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"follow-redirects": "^1.15.6",
|
||||
"form-data": "^4.0.0",
|
||||
"form-data": "^4.0.4",
|
||||
"proxy-from-env": "^1.1.0"
|
||||
}
|
||||
},
|
||||
|
||||
@@ -12,9 +12,9 @@
|
||||
"dependencies": {
|
||||
"@heroicons/react": "^2.2.0",
|
||||
"autoprefixer": "^10.4.16",
|
||||
"axios": "^1.6.0",
|
||||
"axios": "^1.11.0",
|
||||
"cheerio": "^1.0.0-rc.12",
|
||||
"dompurify": "^3.0.5",
|
||||
"dompurify": "^3.2.6",
|
||||
"next": "14.0.0",
|
||||
"postcss": "^8.4.31",
|
||||
"react": "^18",
|
||||
|
||||
@@ -1,39 +1,554 @@
|
||||
'use client';
|
||||
|
||||
import { useEffect } from 'react';
|
||||
import { useState, useEffect } from 'react';
|
||||
import { useRouter, useSearchParams } from 'next/navigation';
|
||||
import { useAuth } from '../../contexts/AuthContext';
|
||||
import ImportLayout from '../../components/layout/ImportLayout';
|
||||
import { Input, Textarea } from '../../components/ui/Input';
|
||||
import Button from '../../components/ui/Button';
|
||||
import TagInput from '../../components/stories/TagInput';
|
||||
import RichTextEditor from '../../components/stories/RichTextEditor';
|
||||
import ImageUpload from '../../components/ui/ImageUpload';
|
||||
import AuthorSelector from '../../components/stories/AuthorSelector';
|
||||
import SeriesSelector from '../../components/stories/SeriesSelector';
|
||||
import { storyApi, authorApi } from '../../lib/api';
|
||||
|
||||
export default function AddStoryPage() {
|
||||
const [formData, setFormData] = useState({
|
||||
title: '',
|
||||
summary: '',
|
||||
authorName: '',
|
||||
authorId: undefined as string | undefined,
|
||||
contentHtml: '',
|
||||
sourceUrl: '',
|
||||
tags: [] as string[],
|
||||
seriesName: '',
|
||||
seriesId: undefined as string | undefined,
|
||||
volume: '',
|
||||
});
|
||||
|
||||
const [coverImage, setCoverImage] = useState<File | null>(null);
|
||||
const [loading, setLoading] = useState(false);
|
||||
const [processingImages, setProcessingImages] = useState(false);
|
||||
const [errors, setErrors] = useState<Record<string, string>>({});
|
||||
const [duplicateWarning, setDuplicateWarning] = useState<{
|
||||
show: boolean;
|
||||
count: number;
|
||||
duplicates: Array<{
|
||||
id: string;
|
||||
title: string;
|
||||
authorName: string;
|
||||
createdAt: string;
|
||||
}>;
|
||||
}>({ show: false, count: 0, duplicates: [] });
|
||||
const [checkingDuplicates, setCheckingDuplicates] = useState(false);
|
||||
|
||||
export default function AddStoryRedirectPage() {
|
||||
const router = useRouter();
|
||||
const searchParams = useSearchParams();
|
||||
const { isAuthenticated } = useAuth();
|
||||
|
||||
// Handle URL parameters
|
||||
useEffect(() => {
|
||||
// Redirect to the new /import route while preserving query parameters
|
||||
const mode = searchParams.get('mode');
|
||||
const authorId = searchParams.get('authorId');
|
||||
const from = searchParams.get('from');
|
||||
|
||||
let redirectUrl = '/import';
|
||||
const queryParams = new URLSearchParams();
|
||||
|
||||
if (mode) queryParams.set('mode', mode);
|
||||
if (authorId) queryParams.set('authorId', authorId);
|
||||
if (from) queryParams.set('from', from);
|
||||
|
||||
const queryString = queryParams.toString();
|
||||
if (queryString) {
|
||||
redirectUrl += '?' + queryString;
|
||||
// Pre-fill author if authorId is provided in URL
|
||||
if (authorId) {
|
||||
const loadAuthor = async () => {
|
||||
try {
|
||||
const author = await authorApi.getAuthor(authorId);
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
authorName: author.name,
|
||||
authorId: author.id
|
||||
}));
|
||||
} catch (error) {
|
||||
console.error('Failed to load author:', error);
|
||||
}
|
||||
};
|
||||
loadAuthor();
|
||||
}
|
||||
|
||||
router.replace(redirectUrl);
|
||||
}, [router, searchParams]);
|
||||
// Handle URL import data
|
||||
if (from === 'url-import') {
|
||||
const title = searchParams.get('title') || '';
|
||||
const summary = searchParams.get('summary') || '';
|
||||
const author = searchParams.get('author') || '';
|
||||
const sourceUrl = searchParams.get('sourceUrl') || '';
|
||||
const tagsParam = searchParams.get('tags');
|
||||
const content = searchParams.get('content') || '';
|
||||
|
||||
let tags: string[] = [];
|
||||
try {
|
||||
tags = tagsParam ? JSON.parse(tagsParam) : [];
|
||||
} catch (error) {
|
||||
console.error('Failed to parse tags:', error);
|
||||
tags = [];
|
||||
}
|
||||
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
title,
|
||||
summary,
|
||||
authorName: author,
|
||||
authorId: undefined, // Reset author ID when importing from URL
|
||||
contentHtml: content,
|
||||
sourceUrl,
|
||||
tags
|
||||
}));
|
||||
|
||||
// Show success message
|
||||
setErrors({ success: 'Story data imported successfully! Review and edit as needed before saving.' });
|
||||
}
|
||||
}, [searchParams]);
|
||||
|
||||
// Load pending story data from bulk combine operation
|
||||
useEffect(() => {
|
||||
const fromBulkCombine = searchParams.get('from') === 'bulk-combine';
|
||||
if (fromBulkCombine) {
|
||||
const pendingStoryData = localStorage.getItem('pendingStory');
|
||||
if (pendingStoryData) {
|
||||
try {
|
||||
const storyData = JSON.parse(pendingStoryData);
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
title: storyData.title || '',
|
||||
authorName: storyData.author || '',
|
||||
authorId: undefined, // Reset author ID for bulk combined stories
|
||||
contentHtml: storyData.content || '',
|
||||
sourceUrl: storyData.sourceUrl || '',
|
||||
summary: storyData.summary || '',
|
||||
tags: storyData.tags || []
|
||||
}));
|
||||
// Clear the pending data
|
||||
localStorage.removeItem('pendingStory');
|
||||
} catch (error) {
|
||||
console.error('Failed to load pending story data:', error);
|
||||
}
|
||||
}
|
||||
}
|
||||
}, [searchParams]);
|
||||
|
||||
// Check for duplicates when title and author are both present
|
||||
useEffect(() => {
|
||||
const checkDuplicates = async () => {
|
||||
const title = formData.title.trim();
|
||||
const authorName = formData.authorName.trim();
|
||||
|
||||
// Don't check if user isn't authenticated or if title/author are empty
|
||||
if (!isAuthenticated || !title || !authorName) {
|
||||
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
|
||||
return;
|
||||
}
|
||||
|
||||
// Debounce the check to avoid too many API calls
|
||||
const timeoutId = setTimeout(async () => {
|
||||
try {
|
||||
setCheckingDuplicates(true);
|
||||
const result = await storyApi.checkDuplicate(title, authorName);
|
||||
|
||||
if (result.hasDuplicates) {
|
||||
setDuplicateWarning({
|
||||
show: true,
|
||||
count: result.count,
|
||||
duplicates: result.duplicates
|
||||
});
|
||||
} else {
|
||||
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Failed to check for duplicates:', error);
|
||||
// Clear any existing duplicate warnings on error
|
||||
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
|
||||
// Don't show error to user as this is just a helpful warning
|
||||
// Authentication errors will be handled by the API interceptor
|
||||
} finally {
|
||||
setCheckingDuplicates(false);
|
||||
}
|
||||
}, 500); // 500ms debounce
|
||||
|
||||
return () => clearTimeout(timeoutId);
|
||||
};
|
||||
|
||||
checkDuplicates();
|
||||
}, [formData.title, formData.authorName, isAuthenticated]);
|
||||
|
||||
const handleInputChange = (field: string) => (
|
||||
e: React.ChangeEvent<HTMLInputElement | HTMLTextAreaElement>
|
||||
) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
[field]: e.target.value
|
||||
}));
|
||||
|
||||
// Clear error when user starts typing
|
||||
if (errors[field]) {
|
||||
setErrors(prev => ({ ...prev, [field]: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const handleContentChange = (html: string) => {
|
||||
setFormData(prev => ({ ...prev, contentHtml: html }));
|
||||
if (errors.contentHtml) {
|
||||
setErrors(prev => ({ ...prev, contentHtml: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const handleTagsChange = (tags: string[]) => {
|
||||
setFormData(prev => ({ ...prev, tags }));
|
||||
};
|
||||
|
||||
const handleAuthorChange = (authorName: string, authorId?: string) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
authorName,
|
||||
authorId: authorId // This will be undefined if creating new author, which clears the existing ID
|
||||
}));
|
||||
|
||||
// Clear error when user changes author
|
||||
if (errors.authorName) {
|
||||
setErrors(prev => ({ ...prev, authorName: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const handleSeriesChange = (seriesName: string, seriesId?: string) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
seriesName,
|
||||
seriesId: seriesId // This will be undefined if creating new series, which clears the existing ID
|
||||
}));
|
||||
|
||||
// Clear error when user changes series
|
||||
if (errors.seriesName) {
|
||||
setErrors(prev => ({ ...prev, seriesName: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const validateForm = () => {
|
||||
const newErrors: Record<string, string> = {};
|
||||
|
||||
if (!formData.title.trim()) {
|
||||
newErrors.title = 'Title is required';
|
||||
}
|
||||
|
||||
if (!formData.authorName.trim()) {
|
||||
newErrors.authorName = 'Author name is required';
|
||||
}
|
||||
|
||||
if (!formData.contentHtml.trim()) {
|
||||
newErrors.contentHtml = 'Story content is required';
|
||||
}
|
||||
|
||||
if (formData.seriesName && !formData.volume) {
|
||||
newErrors.volume = 'Volume number is required when series is specified';
|
||||
}
|
||||
|
||||
if (formData.volume && !formData.seriesName.trim()) {
|
||||
newErrors.seriesName = 'Series name is required when volume is specified';
|
||||
}
|
||||
|
||||
setErrors(newErrors);
|
||||
return Object.keys(newErrors).length === 0;
|
||||
};
|
||||
|
||||
// Helper function to detect external images in HTML content
|
||||
const hasExternalImages = (htmlContent: string): boolean => {
|
||||
if (!htmlContent) return false;
|
||||
|
||||
// Create a temporary DOM element to parse HTML
|
||||
const tempDiv = document.createElement('div');
|
||||
tempDiv.innerHTML = htmlContent;
|
||||
|
||||
const images = tempDiv.querySelectorAll('img');
|
||||
for (let i = 0; i < images.length; i++) {
|
||||
const img = images[i];
|
||||
const src = img.getAttribute('src');
|
||||
if (src && (src.startsWith('http://') || src.startsWith('https://'))) {
|
||||
return true;
|
||||
}
|
||||
}
|
||||
return false;
|
||||
};
|
||||
|
||||
const handleSubmit = async (e: React.FormEvent) => {
|
||||
e.preventDefault();
|
||||
|
||||
if (!validateForm()) {
|
||||
return;
|
||||
}
|
||||
|
||||
setLoading(true);
|
||||
|
||||
try {
|
||||
// First, create the story with JSON data
|
||||
const storyData = {
|
||||
title: formData.title,
|
||||
summary: formData.summary || undefined,
|
||||
contentHtml: formData.contentHtml,
|
||||
sourceUrl: formData.sourceUrl || undefined,
|
||||
volume: formData.seriesName ? parseInt(formData.volume) : undefined,
|
||||
// Send seriesId if we have it (existing series), otherwise send seriesName (new series)
|
||||
...(formData.seriesId ? { seriesId: formData.seriesId } : { seriesName: formData.seriesName || undefined }),
|
||||
// Send authorId if we have it (existing author), otherwise send authorName (new author)
|
||||
...(formData.authorId ? { authorId: formData.authorId } : { authorName: formData.authorName }),
|
||||
tagNames: formData.tags.length > 0 ? formData.tags : undefined,
|
||||
};
|
||||
|
||||
const story = await storyApi.createStory(storyData);
|
||||
|
||||
// Process images if there are external images in the content
|
||||
if (hasExternalImages(formData.contentHtml)) {
|
||||
try {
|
||||
setProcessingImages(true);
|
||||
const imageResult = await storyApi.processContentImages(story.id, formData.contentHtml);
|
||||
|
||||
// If images were processed and content was updated, save the updated content
|
||||
if (imageResult.processedContent !== formData.contentHtml) {
|
||||
await storyApi.updateStory(story.id, {
|
||||
title: formData.title,
|
||||
summary: formData.summary || undefined,
|
||||
contentHtml: imageResult.processedContent,
|
||||
sourceUrl: formData.sourceUrl || undefined,
|
||||
volume: formData.seriesName ? parseInt(formData.volume) : undefined,
|
||||
...(formData.seriesId ? { seriesId: formData.seriesId } : { seriesName: formData.seriesName || undefined }),
|
||||
...(formData.authorId ? { authorId: formData.authorId } : { authorName: formData.authorName }),
|
||||
tagNames: formData.tags.length > 0 ? formData.tags : undefined,
|
||||
});
|
||||
|
||||
// Show success message with image processing info
|
||||
if (imageResult.downloadedImages.length > 0) {
|
||||
console.log(`Successfully processed ${imageResult.downloadedImages.length} images`);
|
||||
}
|
||||
if (imageResult.warnings && imageResult.warnings.length > 0) {
|
||||
console.warn('Image processing warnings:', imageResult.warnings);
|
||||
}
|
||||
}
|
||||
} catch (imageError) {
|
||||
console.error('Failed to process images:', imageError);
|
||||
// Don't fail the entire operation if image processing fails
|
||||
// The story was created successfully, just without processed images
|
||||
} finally {
|
||||
setProcessingImages(false);
|
||||
}
|
||||
}
|
||||
|
||||
// If there's a cover image, upload it separately
|
||||
if (coverImage) {
|
||||
await storyApi.uploadCover(story.id, coverImage);
|
||||
}
|
||||
|
||||
router.push(`/stories/${story.id}/detail`);
|
||||
} catch (error: any) {
|
||||
console.error('Failed to create story:', error);
|
||||
const errorMessage = error.response?.data?.message || 'Failed to create story';
|
||||
setErrors({ submit: errorMessage });
|
||||
} finally {
|
||||
setLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
return (
|
||||
<div className="min-h-screen flex items-center justify-center">
|
||||
<div className="text-center">
|
||||
<div className="animate-spin rounded-full h-8 w-8 border-b-2 border-blue-600 mx-auto mb-4"></div>
|
||||
<p className="text-gray-600">Redirecting...</p>
|
||||
<ImportLayout
|
||||
title="Add New Story"
|
||||
description="Add a story to your personal collection"
|
||||
>
|
||||
{/* Success Message */}
|
||||
{errors.success && (
|
||||
<div className="p-4 bg-green-50 dark:bg-green-900/20 border border-green-200 dark:border-green-800 rounded-lg mb-6">
|
||||
<p className="text-green-800 dark:text-green-200">{errors.success}</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
<form onSubmit={handleSubmit} className="space-y-6">
|
||||
{/* Title */}
|
||||
<Input
|
||||
label="Title *"
|
||||
value={formData.title}
|
||||
onChange={handleInputChange('title')}
|
||||
placeholder="Enter the story title"
|
||||
error={errors.title}
|
||||
required
|
||||
/>
|
||||
|
||||
{/* Author Selector */}
|
||||
<AuthorSelector
|
||||
label="Author *"
|
||||
value={formData.authorName}
|
||||
onChange={handleAuthorChange}
|
||||
placeholder="Select or enter author name"
|
||||
error={errors.authorName}
|
||||
required
|
||||
/>
|
||||
|
||||
{/* Duplicate Warning */}
|
||||
{duplicateWarning.show && (
|
||||
<div className="p-4 bg-yellow-50 dark:bg-yellow-900/20 border border-yellow-200 dark:border-yellow-800 rounded-lg">
|
||||
<div className="flex items-start gap-3">
|
||||
<div className="text-yellow-600 dark:text-yellow-400 mt-0.5">
|
||||
⚠️
|
||||
</div>
|
||||
<div>
|
||||
<h4 className="font-medium text-yellow-800 dark:text-yellow-200">
|
||||
Potential Duplicate Detected
|
||||
</h4>
|
||||
<p className="text-sm text-yellow-700 dark:text-yellow-300 mt-1">
|
||||
Found {duplicateWarning.count} existing {duplicateWarning.count === 1 ? 'story' : 'stories'} with the same title and author:
|
||||
</p>
|
||||
<ul className="mt-2 space-y-1">
|
||||
{duplicateWarning.duplicates.map((duplicate, index) => (
|
||||
<li key={duplicate.id} className="text-sm text-yellow-700 dark:text-yellow-300">
|
||||
• <span className="font-medium">{duplicate.title}</span> by {duplicate.authorName}
|
||||
<span className="text-xs ml-2">
|
||||
(added {new Date(duplicate.createdAt).toLocaleDateString()})
|
||||
</span>
|
||||
</li>
|
||||
))}
|
||||
</ul>
|
||||
<p className="text-xs text-yellow-600 dark:text-yellow-400 mt-2">
|
||||
You can still create this story if it's different from the existing ones.
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Checking indicator */}
|
||||
{checkingDuplicates && (
|
||||
<div className="flex items-center gap-2 text-sm theme-text">
|
||||
<div className="animate-spin w-4 h-4 border-2 border-theme-accent border-t-transparent rounded-full"></div>
|
||||
Checking for duplicates...
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Summary */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Summary
|
||||
</label>
|
||||
<Textarea
|
||||
value={formData.summary}
|
||||
onChange={handleInputChange('summary')}
|
||||
placeholder="Brief summary or description of the story..."
|
||||
rows={3}
|
||||
/>
|
||||
<p className="text-sm theme-text mt-1">
|
||||
Optional summary that will be displayed on the story detail page
|
||||
</p>
|
||||
</div>
|
||||
|
||||
{/* Cover Image Upload */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Cover Image
|
||||
</label>
|
||||
<ImageUpload
|
||||
onImageSelect={setCoverImage}
|
||||
accept="image/jpeg,image/png"
|
||||
maxSizeMB={5}
|
||||
aspectRatio="3:4"
|
||||
placeholder="Drop a cover image here or click to select"
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Content */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Story Content *
|
||||
</label>
|
||||
<RichTextEditor
|
||||
value={formData.contentHtml}
|
||||
onChange={handleContentChange}
|
||||
placeholder="Write or paste your story content here..."
|
||||
error={errors.contentHtml}
|
||||
enableImageProcessing={false}
|
||||
/>
|
||||
<p className="text-sm theme-text mt-2">
|
||||
💡 <strong>Tip:</strong> If you paste content with images, they'll be automatically downloaded and stored locally when you save the story.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
{/* Tags */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Tags
|
||||
</label>
|
||||
<TagInput
|
||||
tags={formData.tags}
|
||||
onChange={handleTagsChange}
|
||||
placeholder="Add tags to categorize your story..."
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Series and Volume */}
|
||||
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
|
||||
<SeriesSelector
|
||||
label="Series (optional)"
|
||||
value={formData.seriesName}
|
||||
onChange={handleSeriesChange}
|
||||
placeholder="Select or enter series name if part of a series"
|
||||
error={errors.seriesName}
|
||||
authorId={formData.authorId}
|
||||
/>
|
||||
|
||||
<Input
|
||||
label="Volume/Part (optional)"
|
||||
type="number"
|
||||
min="1"
|
||||
value={formData.volume}
|
||||
onChange={handleInputChange('volume')}
|
||||
placeholder="Enter volume/part number"
|
||||
error={errors.volume}
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Source URL */}
|
||||
<Input
|
||||
label="Source URL (optional)"
|
||||
type="url"
|
||||
value={formData.sourceUrl}
|
||||
onChange={handleInputChange('sourceUrl')}
|
||||
placeholder="https://example.com/original-story-url"
|
||||
/>
|
||||
|
||||
{/* Image Processing Indicator */}
|
||||
{processingImages && (
|
||||
<div className="p-4 bg-blue-50 dark:bg-blue-900/20 border border-blue-200 dark:border-blue-800 rounded-lg">
|
||||
<div className="flex items-center gap-3">
|
||||
<div className="animate-spin w-4 h-4 border-2 border-blue-500 border-t-transparent rounded-full"></div>
|
||||
<p className="text-blue-800 dark:text-blue-200">
|
||||
Processing and downloading images...
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Submit Error */}
|
||||
{errors.submit && (
|
||||
<div className="p-4 bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-lg">
|
||||
<p className="text-red-800 dark:text-red-200">{errors.submit}</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Actions */}
|
||||
<div className="flex justify-end gap-4 pt-6">
|
||||
<Button
|
||||
type="button"
|
||||
variant="ghost"
|
||||
onClick={() => router.back()}
|
||||
disabled={loading}
|
||||
>
|
||||
Cancel
|
||||
</Button>
|
||||
|
||||
<Button
|
||||
type="submit"
|
||||
loading={loading}
|
||||
disabled={!formData.title || !formData.authorName || !formData.contentHtml}
|
||||
>
|
||||
{processingImages ? 'Processing Images...' : 'Add Story'}
|
||||
</Button>
|
||||
</div>
|
||||
</form>
|
||||
</ImportLayout>
|
||||
);
|
||||
}
|
||||
@@ -211,7 +211,7 @@ export default function AuthorDetailPage() {
|
||||
<p className="theme-text">
|
||||
{stories.length} {stories.length === 1 ? 'story' : 'stories'}
|
||||
</p>
|
||||
<Button href={`/import?authorId=${authorId}`}>
|
||||
<Button href={`/add-story?authorId=${authorId}`}>
|
||||
Add Story
|
||||
</Button>
|
||||
</div>
|
||||
@@ -220,7 +220,7 @@ export default function AuthorDetailPage() {
|
||||
{stories.length === 0 ? (
|
||||
<div className="text-center py-12 theme-card theme-shadow rounded-lg">
|
||||
<p className="theme-text text-lg mb-4">No stories by this author yet.</p>
|
||||
<Button href="/import">Add a Story</Button>
|
||||
<Button href="/add-story">Add a Story</Button>
|
||||
</div>
|
||||
) : (
|
||||
<div className="space-y-4">
|
||||
|
||||
@@ -14,6 +14,7 @@ export default function AuthorsPage() {
|
||||
const [authors, setAuthors] = useState<Author[]>([]);
|
||||
const [filteredAuthors, setFilteredAuthors] = useState<Author[]>([]);
|
||||
const [loading, setLoading] = useState(true);
|
||||
const [searchLoading, setSearchLoading] = useState(false);
|
||||
const [searchQuery, setSearchQuery] = useState('');
|
||||
const [viewMode, setViewMode] = useState<'grid' | 'list'>('grid');
|
||||
const [sortBy, setSortBy] = useState('name');
|
||||
@@ -21,53 +22,53 @@ export default function AuthorsPage() {
|
||||
const [currentPage, setCurrentPage] = useState(0);
|
||||
const [totalHits, setTotalHits] = useState(0);
|
||||
const [hasMore, setHasMore] = useState(false);
|
||||
const ITEMS_PER_PAGE = 50; // Safe limit under Typesense's 250 limit
|
||||
const ITEMS_PER_PAGE = 50;
|
||||
|
||||
useEffect(() => {
|
||||
const debounceTimer = setTimeout(() => {
|
||||
const loadAuthors = async () => {
|
||||
try {
|
||||
// Use searchLoading for background search, loading only for initial load
|
||||
const isInitialLoad = authors.length === 0 && !searchQuery && currentPage === 0;
|
||||
if (isInitialLoad) {
|
||||
setLoading(true);
|
||||
const searchResults = await authorApi.searchAuthorsTypesense({
|
||||
q: searchQuery || '*',
|
||||
} else {
|
||||
setSearchLoading(true);
|
||||
}
|
||||
const searchResults = await authorApi.getAuthors({
|
||||
page: currentPage,
|
||||
size: ITEMS_PER_PAGE,
|
||||
sortBy: sortBy,
|
||||
sortOrder: sortOrder
|
||||
sortDir: sortOrder
|
||||
});
|
||||
|
||||
if (currentPage === 0) {
|
||||
// First page - replace all results
|
||||
setAuthors(searchResults.results || []);
|
||||
setFilteredAuthors(searchResults.results || []);
|
||||
setAuthors(searchResults.content || []);
|
||||
setFilteredAuthors(searchResults.content || []);
|
||||
} else {
|
||||
// Subsequent pages - append results
|
||||
setAuthors(prev => [...prev, ...(searchResults.results || [])]);
|
||||
setFilteredAuthors(prev => [...prev, ...(searchResults.results || [])]);
|
||||
setAuthors(prev => [...prev, ...(searchResults.content || [])]);
|
||||
setFilteredAuthors(prev => [...prev, ...(searchResults.content || [])]);
|
||||
}
|
||||
|
||||
setTotalHits(searchResults.totalHits);
|
||||
setHasMore(searchResults.results.length === ITEMS_PER_PAGE && (currentPage + 1) * ITEMS_PER_PAGE < searchResults.totalHits);
|
||||
setTotalHits(searchResults.totalElements || 0);
|
||||
setHasMore(searchResults.content.length === ITEMS_PER_PAGE && (currentPage + 1) * ITEMS_PER_PAGE < (searchResults.totalElements || 0));
|
||||
|
||||
} catch (error) {
|
||||
console.error('Failed to load authors:', error);
|
||||
// Fallback to regular API if Typesense fails (only for first page)
|
||||
if (currentPage === 0) {
|
||||
try {
|
||||
const authorsResult = await authorApi.getAuthors({ page: 0, size: ITEMS_PER_PAGE });
|
||||
setAuthors(authorsResult.content || []);
|
||||
setFilteredAuthors(authorsResult.content || []);
|
||||
setTotalHits(authorsResult.totalElements || 0);
|
||||
setHasMore(authorsResult.content.length === ITEMS_PER_PAGE);
|
||||
} catch (fallbackError) {
|
||||
console.error('Fallback also failed:', fallbackError);
|
||||
}
|
||||
}
|
||||
// Error handling for API failures
|
||||
console.error('Failed to load authors:', error);
|
||||
} finally {
|
||||
setLoading(false);
|
||||
setSearchLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
loadAuthors();
|
||||
}, searchQuery ? 500 : 0); // 500ms debounce for search, immediate for other changes
|
||||
|
||||
return () => clearTimeout(debounceTimer);
|
||||
}, [searchQuery, sortBy, sortOrder, currentPage]);
|
||||
|
||||
// Reset pagination when search or sort changes
|
||||
@@ -83,7 +84,17 @@ export default function AuthorsPage() {
|
||||
}
|
||||
};
|
||||
|
||||
// Client-side filtering no longer needed since we use Typesense
|
||||
// Client-side filtering for search query when using regular API
|
||||
useEffect(() => {
|
||||
if (searchQuery) {
|
||||
const filtered = authors.filter(author =>
|
||||
author.name.toLowerCase().includes(searchQuery.toLowerCase())
|
||||
);
|
||||
setFilteredAuthors(filtered);
|
||||
} else {
|
||||
setFilteredAuthors(authors);
|
||||
}
|
||||
}, [authors, searchQuery]);
|
||||
|
||||
// Note: We no longer have individual story ratings in the author list
|
||||
// Average rating would need to be calculated on backend if needed
|
||||
@@ -106,9 +117,9 @@ export default function AuthorsPage() {
|
||||
<div>
|
||||
<h1 className="text-3xl font-bold theme-header">Authors</h1>
|
||||
<p className="theme-text mt-1">
|
||||
{filteredAuthors.length} of {totalHits} {totalHits === 1 ? 'author' : 'authors'}
|
||||
{searchQuery ? `${filteredAuthors.length} of ${authors.length}` : filteredAuthors.length} {(searchQuery ? authors.length : filteredAuthors.length) === 1 ? 'author' : 'authors'}
|
||||
{searchQuery ? ` found` : ` in your library`}
|
||||
{hasMore && ` (showing first ${filteredAuthors.length})`}
|
||||
{!searchQuery && hasMore && ` (showing first ${filteredAuthors.length})`}
|
||||
</p>
|
||||
</div>
|
||||
|
||||
@@ -133,13 +144,18 @@ export default function AuthorsPage() {
|
||||
|
||||
{/* Search and Sort Controls */}
|
||||
<div className="flex flex-col md:flex-row gap-4">
|
||||
<div className="flex-1 max-w-md">
|
||||
<div className="flex-1 max-w-md relative">
|
||||
<Input
|
||||
type="search"
|
||||
placeholder="Search authors..."
|
||||
value={searchQuery}
|
||||
onChange={(e) => setSearchQuery(e.target.value)}
|
||||
/>
|
||||
{searchLoading && (
|
||||
<div className="absolute right-3 top-1/2 transform -translate-y-1/2">
|
||||
<div className="animate-spin h-4 w-4 border-2 border-theme-accent border-t-transparent rounded-full"></div>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
<div className="flex gap-2">
|
||||
@@ -201,7 +217,7 @@ export default function AuthorsPage() {
|
||||
)}
|
||||
|
||||
{/* Load More Button */}
|
||||
{hasMore && (
|
||||
{hasMore && !searchQuery && (
|
||||
<div className="flex justify-center pt-8">
|
||||
<Button
|
||||
onClick={loadMore}
|
||||
@@ -210,7 +226,7 @@ export default function AuthorsPage() {
|
||||
className="px-8 py-3"
|
||||
loading={loading}
|
||||
>
|
||||
{loading ? 'Loading...' : `Load More Authors (${totalHits - filteredAuthors.length} remaining)`}
|
||||
{loading ? 'Loading...' : `Load More Authors (${totalHits - authors.length} remaining)`}
|
||||
</Button>
|
||||
</div>
|
||||
)}
|
||||
|
||||
@@ -85,13 +85,28 @@
|
||||
line-height: 1.7;
|
||||
}
|
||||
|
||||
.reading-content h1,
|
||||
.reading-content h2,
|
||||
.reading-content h3,
|
||||
.reading-content h4,
|
||||
.reading-content h5,
|
||||
.reading-content h1 {
|
||||
@apply text-2xl font-bold mt-8 mb-4 theme-header;
|
||||
}
|
||||
|
||||
.reading-content h2 {
|
||||
@apply text-xl font-bold mt-6 mb-3 theme-header;
|
||||
}
|
||||
|
||||
.reading-content h3 {
|
||||
@apply text-lg font-semibold mt-6 mb-3 theme-header;
|
||||
}
|
||||
|
||||
.reading-content h4 {
|
||||
@apply text-base font-semibold mt-4 mb-2 theme-header;
|
||||
}
|
||||
|
||||
.reading-content h5 {
|
||||
@apply text-sm font-semibold mt-4 mb-2 theme-header;
|
||||
}
|
||||
|
||||
.reading-content h6 {
|
||||
@apply font-bold mt-8 mb-4 theme-header;
|
||||
@apply text-xs font-semibold mt-4 mb-2 theme-header uppercase tracking-wide;
|
||||
}
|
||||
|
||||
.reading-content p {
|
||||
@@ -118,4 +133,107 @@
|
||||
.reading-content em {
|
||||
@apply italic;
|
||||
}
|
||||
|
||||
/* Image styling for story content */
|
||||
.reading-content img {
|
||||
@apply max-w-full h-auto mx-auto my-6 rounded-lg shadow-sm;
|
||||
max-height: 80vh; /* Prevent images from being too tall */
|
||||
display: block;
|
||||
}
|
||||
|
||||
.reading-content img[align="left"] {
|
||||
@apply float-left mr-4 mb-4 ml-0;
|
||||
max-width: 50%;
|
||||
}
|
||||
|
||||
.reading-content img[align="right"] {
|
||||
@apply float-right ml-4 mb-4 mr-0;
|
||||
max-width: 50%;
|
||||
}
|
||||
|
||||
.reading-content img[align="center"] {
|
||||
@apply block mx-auto;
|
||||
}
|
||||
|
||||
/* Editor content styling - same as reading content but for the rich text editor */
|
||||
.editor-content h1 {
|
||||
@apply text-2xl font-bold mt-8 mb-4 theme-header;
|
||||
}
|
||||
|
||||
.editor-content h2 {
|
||||
@apply text-xl font-bold mt-6 mb-3 theme-header;
|
||||
}
|
||||
|
||||
.editor-content h3 {
|
||||
@apply text-lg font-semibold mt-6 mb-3 theme-header;
|
||||
}
|
||||
|
||||
.editor-content h4 {
|
||||
@apply text-base font-semibold mt-4 mb-2 theme-header;
|
||||
}
|
||||
|
||||
.editor-content h5 {
|
||||
@apply text-sm font-semibold mt-4 mb-2 theme-header;
|
||||
}
|
||||
|
||||
.editor-content h6 {
|
||||
@apply text-xs font-semibold mt-4 mb-2 theme-header uppercase tracking-wide;
|
||||
}
|
||||
|
||||
.editor-content p {
|
||||
@apply mb-4 theme-text;
|
||||
}
|
||||
|
||||
.editor-content blockquote {
|
||||
@apply border-l-4 pl-4 italic my-6 theme-border theme-text;
|
||||
}
|
||||
|
||||
.editor-content ul,
|
||||
.editor-content ol {
|
||||
@apply mb-4 pl-6 theme-text;
|
||||
}
|
||||
|
||||
.editor-content li {
|
||||
@apply mb-2;
|
||||
}
|
||||
|
||||
.editor-content strong {
|
||||
@apply font-semibold theme-header;
|
||||
}
|
||||
|
||||
.editor-content em {
|
||||
@apply italic;
|
||||
}
|
||||
|
||||
/* Image styling for editor content */
|
||||
.editor-content img {
|
||||
@apply max-w-full h-auto mx-auto my-4 rounded border;
|
||||
max-height: 60vh; /* Slightly smaller for editor */
|
||||
display: block;
|
||||
}
|
||||
|
||||
.editor-content img[align="left"] {
|
||||
@apply float-left mr-4 mb-4 ml-0;
|
||||
max-width: 50%;
|
||||
}
|
||||
|
||||
.editor-content img[align="right"] {
|
||||
@apply float-right ml-4 mb-4 mr-0;
|
||||
max-width: 50%;
|
||||
}
|
||||
|
||||
.editor-content img[align="center"] {
|
||||
@apply block mx-auto;
|
||||
}
|
||||
|
||||
/* Loading placeholder for images being processed */
|
||||
.image-processing-placeholder {
|
||||
@apply bg-gray-100 dark:bg-gray-800 animate-pulse rounded border-2 border-dashed border-gray-300 dark:border-gray-600 flex items-center justify-center;
|
||||
min-height: 200px;
|
||||
}
|
||||
|
||||
.image-processing-placeholder::before {
|
||||
content: "🖼️ Processing image...";
|
||||
@apply text-gray-500 dark:text-gray-400 text-sm;
|
||||
}
|
||||
}
|
||||
@@ -131,7 +131,7 @@ export default function BulkImportPage() {
|
||||
if (data.combinedStory && combineIntoOne) {
|
||||
// For combine mode, redirect to import page with the combined content
|
||||
localStorage.setItem('pendingStory', JSON.stringify(data.combinedStory));
|
||||
router.push('/import?from=bulk-combine');
|
||||
router.push('/add-story?from=bulk-combine');
|
||||
return;
|
||||
} else if (data.results && data.summary) {
|
||||
// For individual mode, show the results
|
||||
|
||||
@@ -1,188 +1,17 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useRef, useEffect } from 'react';
|
||||
import { useRouter, useSearchParams } from 'next/navigation';
|
||||
import { useAuth } from '../../contexts/AuthContext';
|
||||
import { useState } from 'react';
|
||||
import { useRouter } from 'next/navigation';
|
||||
import ImportLayout from '../../components/layout/ImportLayout';
|
||||
import { Input, Textarea } from '../../components/ui/Input';
|
||||
import { Input } from '../../components/ui/Input';
|
||||
import Button from '../../components/ui/Button';
|
||||
import TagInput from '../../components/stories/TagInput';
|
||||
import RichTextEditor from '../../components/stories/RichTextEditor';
|
||||
import ImageUpload from '../../components/ui/ImageUpload';
|
||||
import AuthorSelector from '../../components/stories/AuthorSelector';
|
||||
import { storyApi, authorApi } from '../../lib/api';
|
||||
|
||||
export default function AddStoryPage() {
|
||||
const [importMode, setImportMode] = useState<'manual' | 'url'>('manual');
|
||||
export default function ImportFromUrlPage() {
|
||||
const [importUrl, setImportUrl] = useState('');
|
||||
const [scraping, setScraping] = useState(false);
|
||||
const [formData, setFormData] = useState({
|
||||
title: '',
|
||||
summary: '',
|
||||
authorName: '',
|
||||
authorId: undefined as string | undefined,
|
||||
contentHtml: '',
|
||||
sourceUrl: '',
|
||||
tags: [] as string[],
|
||||
seriesName: '',
|
||||
volume: '',
|
||||
});
|
||||
|
||||
const [coverImage, setCoverImage] = useState<File | null>(null);
|
||||
const [loading, setLoading] = useState(false);
|
||||
const [errors, setErrors] = useState<Record<string, string>>({});
|
||||
const [duplicateWarning, setDuplicateWarning] = useState<{
|
||||
show: boolean;
|
||||
count: number;
|
||||
duplicates: Array<{
|
||||
id: string;
|
||||
title: string;
|
||||
authorName: string;
|
||||
createdAt: string;
|
||||
}>;
|
||||
}>({ show: false, count: 0, duplicates: [] });
|
||||
const [checkingDuplicates, setCheckingDuplicates] = useState(false);
|
||||
|
||||
const router = useRouter();
|
||||
const searchParams = useSearchParams();
|
||||
const { isAuthenticated } = useAuth();
|
||||
|
||||
// Handle URL parameters
|
||||
useEffect(() => {
|
||||
const authorId = searchParams.get('authorId');
|
||||
const mode = searchParams.get('mode');
|
||||
|
||||
// Set import mode if specified in URL
|
||||
if (mode === 'url') {
|
||||
setImportMode('url');
|
||||
}
|
||||
|
||||
// Pre-fill author if authorId is provided in URL
|
||||
if (authorId) {
|
||||
const loadAuthor = async () => {
|
||||
try {
|
||||
const author = await authorApi.getAuthor(authorId);
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
authorName: author.name,
|
||||
authorId: author.id
|
||||
}));
|
||||
} catch (error) {
|
||||
console.error('Failed to load author:', error);
|
||||
}
|
||||
};
|
||||
loadAuthor();
|
||||
}
|
||||
}, [searchParams]);
|
||||
|
||||
// Load pending story data from bulk combine operation
|
||||
useEffect(() => {
|
||||
const fromBulkCombine = searchParams.get('from') === 'bulk-combine';
|
||||
if (fromBulkCombine) {
|
||||
const pendingStoryData = localStorage.getItem('pendingStory');
|
||||
if (pendingStoryData) {
|
||||
try {
|
||||
const storyData = JSON.parse(pendingStoryData);
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
title: storyData.title || '',
|
||||
authorName: storyData.author || '',
|
||||
authorId: undefined, // Reset author ID for bulk combined stories
|
||||
contentHtml: storyData.content || '',
|
||||
sourceUrl: storyData.sourceUrl || '',
|
||||
summary: storyData.summary || '',
|
||||
tags: storyData.tags || []
|
||||
}));
|
||||
// Clear the pending data
|
||||
localStorage.removeItem('pendingStory');
|
||||
} catch (error) {
|
||||
console.error('Failed to load pending story data:', error);
|
||||
}
|
||||
}
|
||||
}
|
||||
}, [searchParams]);
|
||||
|
||||
// Check for duplicates when title and author are both present
|
||||
useEffect(() => {
|
||||
const checkDuplicates = async () => {
|
||||
const title = formData.title.trim();
|
||||
const authorName = formData.authorName.trim();
|
||||
|
||||
// Don't check if user isn't authenticated or if title/author are empty
|
||||
if (!isAuthenticated || !title || !authorName) {
|
||||
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
|
||||
return;
|
||||
}
|
||||
|
||||
// Debounce the check to avoid too many API calls
|
||||
const timeoutId = setTimeout(async () => {
|
||||
try {
|
||||
setCheckingDuplicates(true);
|
||||
const result = await storyApi.checkDuplicate(title, authorName);
|
||||
|
||||
if (result.hasDuplicates) {
|
||||
setDuplicateWarning({
|
||||
show: true,
|
||||
count: result.count,
|
||||
duplicates: result.duplicates
|
||||
});
|
||||
} else {
|
||||
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Failed to check for duplicates:', error);
|
||||
// Clear any existing duplicate warnings on error
|
||||
setDuplicateWarning({ show: false, count: 0, duplicates: [] });
|
||||
// Don't show error to user as this is just a helpful warning
|
||||
// Authentication errors will be handled by the API interceptor
|
||||
} finally {
|
||||
setCheckingDuplicates(false);
|
||||
}
|
||||
}, 500); // 500ms debounce
|
||||
|
||||
return () => clearTimeout(timeoutId);
|
||||
};
|
||||
|
||||
checkDuplicates();
|
||||
}, [formData.title, formData.authorName, isAuthenticated]);
|
||||
|
||||
const handleInputChange = (field: string) => (
|
||||
e: React.ChangeEvent<HTMLInputElement | HTMLTextAreaElement>
|
||||
) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
[field]: e.target.value
|
||||
}));
|
||||
|
||||
// Clear error when user starts typing
|
||||
if (errors[field]) {
|
||||
setErrors(prev => ({ ...prev, [field]: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const handleContentChange = (html: string) => {
|
||||
setFormData(prev => ({ ...prev, contentHtml: html }));
|
||||
if (errors.contentHtml) {
|
||||
setErrors(prev => ({ ...prev, contentHtml: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const handleTagsChange = (tags: string[]) => {
|
||||
setFormData(prev => ({ ...prev, tags }));
|
||||
};
|
||||
|
||||
const handleAuthorChange = (authorName: string, authorId?: string) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
authorName,
|
||||
authorId: authorId // This will be undefined if creating new author, which clears the existing ID
|
||||
}));
|
||||
|
||||
// Clear error when user changes author
|
||||
if (errors.authorName) {
|
||||
setErrors(prev => ({ ...prev, authorName: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const handleImportFromUrl = async () => {
|
||||
if (!importUrl.trim()) {
|
||||
@@ -209,25 +38,18 @@ export default function AddStoryPage() {
|
||||
|
||||
const scrapedStory = await response.json();
|
||||
|
||||
// Pre-fill the form with scraped data
|
||||
setFormData({
|
||||
// Redirect to add-story page with pre-filled data
|
||||
const queryParams = new URLSearchParams({
|
||||
from: 'url-import',
|
||||
title: scrapedStory.title || '',
|
||||
summary: scrapedStory.summary || '',
|
||||
authorName: scrapedStory.author || '',
|
||||
authorId: undefined, // Reset author ID when importing from URL (likely new author)
|
||||
contentHtml: scrapedStory.content || '',
|
||||
author: scrapedStory.author || '',
|
||||
sourceUrl: scrapedStory.sourceUrl || importUrl,
|
||||
tags: scrapedStory.tags || [],
|
||||
seriesName: '',
|
||||
volume: '',
|
||||
tags: JSON.stringify(scrapedStory.tags || []),
|
||||
content: scrapedStory.content || ''
|
||||
});
|
||||
|
||||
// Switch to manual mode so user can edit the pre-filled data
|
||||
setImportMode('manual');
|
||||
setImportUrl('');
|
||||
|
||||
// Show success message
|
||||
setErrors({ success: 'Story data imported successfully! Review and edit as needed before saving.' });
|
||||
router.push(`/add-story?${queryParams.toString()}`);
|
||||
} catch (error: any) {
|
||||
console.error('Failed to import story:', error);
|
||||
setErrors({ importUrl: error.message });
|
||||
@@ -236,85 +58,16 @@ export default function AddStoryPage() {
|
||||
}
|
||||
};
|
||||
|
||||
const validateForm = () => {
|
||||
const newErrors: Record<string, string> = {};
|
||||
|
||||
if (!formData.title.trim()) {
|
||||
newErrors.title = 'Title is required';
|
||||
}
|
||||
|
||||
if (!formData.authorName.trim()) {
|
||||
newErrors.authorName = 'Author name is required';
|
||||
}
|
||||
|
||||
if (!formData.contentHtml.trim()) {
|
||||
newErrors.contentHtml = 'Story content is required';
|
||||
}
|
||||
|
||||
if (formData.seriesName && !formData.volume) {
|
||||
newErrors.volume = 'Volume number is required when series is specified';
|
||||
}
|
||||
|
||||
if (formData.volume && !formData.seriesName.trim()) {
|
||||
newErrors.seriesName = 'Series name is required when volume is specified';
|
||||
}
|
||||
|
||||
setErrors(newErrors);
|
||||
return Object.keys(newErrors).length === 0;
|
||||
};
|
||||
|
||||
const handleSubmit = async (e: React.FormEvent) => {
|
||||
e.preventDefault();
|
||||
|
||||
if (!validateForm()) {
|
||||
return;
|
||||
}
|
||||
|
||||
setLoading(true);
|
||||
|
||||
try {
|
||||
// First, create the story with JSON data
|
||||
const storyData = {
|
||||
title: formData.title,
|
||||
summary: formData.summary || undefined,
|
||||
contentHtml: formData.contentHtml,
|
||||
sourceUrl: formData.sourceUrl || undefined,
|
||||
volume: formData.seriesName ? parseInt(formData.volume) : undefined,
|
||||
seriesName: formData.seriesName || undefined,
|
||||
// Send authorId if we have it (existing author), otherwise send authorName (new author)
|
||||
...(formData.authorId ? { authorId: formData.authorId } : { authorName: formData.authorName }),
|
||||
tagNames: formData.tags.length > 0 ? formData.tags : undefined,
|
||||
};
|
||||
|
||||
const story = await storyApi.createStory(storyData);
|
||||
|
||||
// If there's a cover image, upload it separately
|
||||
if (coverImage) {
|
||||
await storyApi.uploadCover(story.id, coverImage);
|
||||
}
|
||||
|
||||
router.push(`/stories/${story.id}`);
|
||||
} catch (error: any) {
|
||||
console.error('Failed to create story:', error);
|
||||
const errorMessage = error.response?.data?.message || 'Failed to create story';
|
||||
setErrors({ submit: errorMessage });
|
||||
} finally {
|
||||
setLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
return (
|
||||
<ImportLayout
|
||||
title="Add New Story"
|
||||
description="Add a story to your personal collection"
|
||||
title="Import Story from URL"
|
||||
description="Import a single story from a website"
|
||||
>
|
||||
{/* URL Import Section */}
|
||||
{importMode === 'url' && (
|
||||
<div className="space-y-6">
|
||||
<div className="bg-gray-50 dark:bg-gray-800/50 rounded-lg p-6">
|
||||
<h3 className="text-lg font-medium theme-header mb-4">Import Story from URL</h3>
|
||||
<p className="theme-text text-sm mb-4">
|
||||
Enter a URL from a supported story site to automatically extract the story content, title, author, and other metadata.
|
||||
Enter a URL from a supported story site to automatically extract the story content, title, author, and other metadata. After importing, you'll be able to review and edit the data before saving.
|
||||
</p>
|
||||
|
||||
<div className="space-y-4">
|
||||
@@ -341,7 +94,7 @@ export default function AddStoryPage() {
|
||||
<Button
|
||||
type="button"
|
||||
variant="ghost"
|
||||
onClick={() => setImportMode('manual')}
|
||||
href="/add-story"
|
||||
disabled={scraping}
|
||||
>
|
||||
Enter Manually Instead
|
||||
@@ -355,191 +108,6 @@ export default function AddStoryPage() {
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Success Message */}
|
||||
{errors.success && (
|
||||
<div className="p-4 bg-green-50 dark:bg-green-900/20 border border-green-200 dark:border-green-800 rounded-lg mb-6">
|
||||
<p className="text-green-800 dark:text-green-200">{errors.success}</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Manual Entry Form */}
|
||||
{importMode === 'manual' && (
|
||||
<form onSubmit={handleSubmit} className="space-y-6">
|
||||
{/* Title */}
|
||||
<Input
|
||||
label="Title *"
|
||||
value={formData.title}
|
||||
onChange={handleInputChange('title')}
|
||||
placeholder="Enter the story title"
|
||||
error={errors.title}
|
||||
required
|
||||
/>
|
||||
|
||||
{/* Author Selector */}
|
||||
<AuthorSelector
|
||||
label="Author *"
|
||||
value={formData.authorName}
|
||||
onChange={handleAuthorChange}
|
||||
placeholder="Select or enter author name"
|
||||
error={errors.authorName}
|
||||
required
|
||||
/>
|
||||
|
||||
{/* Duplicate Warning */}
|
||||
{duplicateWarning.show && (
|
||||
<div className="p-4 bg-yellow-50 dark:bg-yellow-900/20 border border-yellow-200 dark:border-yellow-800 rounded-lg">
|
||||
<div className="flex items-start gap-3">
|
||||
<div className="text-yellow-600 dark:text-yellow-400 mt-0.5">
|
||||
⚠️
|
||||
</div>
|
||||
<div>
|
||||
<h4 className="font-medium text-yellow-800 dark:text-yellow-200">
|
||||
Potential Duplicate Detected
|
||||
</h4>
|
||||
<p className="text-sm text-yellow-700 dark:text-yellow-300 mt-1">
|
||||
Found {duplicateWarning.count} existing {duplicateWarning.count === 1 ? 'story' : 'stories'} with the same title and author:
|
||||
</p>
|
||||
<ul className="mt-2 space-y-1">
|
||||
{duplicateWarning.duplicates.map((duplicate, index) => (
|
||||
<li key={duplicate.id} className="text-sm text-yellow-700 dark:text-yellow-300">
|
||||
• <span className="font-medium">{duplicate.title}</span> by {duplicate.authorName}
|
||||
<span className="text-xs ml-2">
|
||||
(added {new Date(duplicate.createdAt).toLocaleDateString()})
|
||||
</span>
|
||||
</li>
|
||||
))}
|
||||
</ul>
|
||||
<p className="text-xs text-yellow-600 dark:text-yellow-400 mt-2">
|
||||
You can still create this story if it's different from the existing ones.
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Checking indicator */}
|
||||
{checkingDuplicates && (
|
||||
<div className="flex items-center gap-2 text-sm theme-text">
|
||||
<div className="animate-spin w-4 h-4 border-2 border-theme-accent border-t-transparent rounded-full"></div>
|
||||
Checking for duplicates...
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Summary */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Summary
|
||||
</label>
|
||||
<Textarea
|
||||
value={formData.summary}
|
||||
onChange={handleInputChange('summary')}
|
||||
placeholder="Brief summary or description of the story..."
|
||||
rows={3}
|
||||
/>
|
||||
<p className="text-sm theme-text mt-1">
|
||||
Optional summary that will be displayed on the story detail page
|
||||
</p>
|
||||
</div>
|
||||
|
||||
{/* Cover Image Upload */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Cover Image
|
||||
</label>
|
||||
<ImageUpload
|
||||
onImageSelect={setCoverImage}
|
||||
accept="image/jpeg,image/png"
|
||||
maxSizeMB={5}
|
||||
aspectRatio="3:4"
|
||||
placeholder="Drop a cover image here or click to select"
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Content */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Story Content *
|
||||
</label>
|
||||
<RichTextEditor
|
||||
value={formData.contentHtml}
|
||||
onChange={handleContentChange}
|
||||
placeholder="Write or paste your story content here..."
|
||||
error={errors.contentHtml}
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Tags */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Tags
|
||||
</label>
|
||||
<TagInput
|
||||
tags={formData.tags}
|
||||
onChange={handleTagsChange}
|
||||
placeholder="Add tags to categorize your story..."
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Series and Volume */}
|
||||
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
|
||||
<Input
|
||||
label="Series (optional)"
|
||||
value={formData.seriesName}
|
||||
onChange={handleInputChange('seriesName')}
|
||||
placeholder="Enter series name if part of a series"
|
||||
error={errors.seriesName}
|
||||
/>
|
||||
|
||||
<Input
|
||||
label="Volume/Part (optional)"
|
||||
type="number"
|
||||
min="1"
|
||||
value={formData.volume}
|
||||
onChange={handleInputChange('volume')}
|
||||
placeholder="Enter volume/part number"
|
||||
error={errors.volume}
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Source URL */}
|
||||
<Input
|
||||
label="Source URL (optional)"
|
||||
type="url"
|
||||
value={formData.sourceUrl}
|
||||
onChange={handleInputChange('sourceUrl')}
|
||||
placeholder="https://example.com/original-story-url"
|
||||
/>
|
||||
|
||||
{/* Submit Error */}
|
||||
{errors.submit && (
|
||||
<div className="p-4 bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-lg">
|
||||
<p className="text-red-800 dark:text-red-200">{errors.submit}</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Actions */}
|
||||
<div className="flex justify-end gap-4 pt-6">
|
||||
<Button
|
||||
type="button"
|
||||
variant="ghost"
|
||||
onClick={() => router.back()}
|
||||
disabled={loading}
|
||||
>
|
||||
Cancel
|
||||
</Button>
|
||||
|
||||
<Button
|
||||
type="submit"
|
||||
loading={loading}
|
||||
disabled={!formData.title || !formData.authorName || !formData.contentHtml}
|
||||
>
|
||||
Add Story
|
||||
</Button>
|
||||
</div>
|
||||
</form>
|
||||
)}
|
||||
</ImportLayout>
|
||||
);
|
||||
}
|
||||
@@ -1,21 +1,27 @@
'use client';

import { useState, useEffect } from 'react';
import { useRouter } from 'next/navigation';
import { searchApi, storyApi } from '../../lib/api';
import { Story, Tag, FacetCount } from '../../types/api';
import { useRouter, useSearchParams } from 'next/navigation';
import { searchApi, storyApi, tagApi } from '../../lib/api';
import { Story, Tag, FacetCount, AdvancedFilters } from '../../types/api';
import AppLayout from '../../components/layout/AppLayout';
import { Input } from '../../components/ui/Input';
import Button from '../../components/ui/Button';
import StoryMultiSelect from '../../components/stories/StoryMultiSelect';
import TagFilter from '../../components/stories/TagFilter';
import LoadingSpinner from '../../components/ui/LoadingSpinner';
import SidebarLayout from '../../components/library/SidebarLayout';
import ToolbarLayout from '../../components/library/ToolbarLayout';
import MinimalLayout from '../../components/library/MinimalLayout';
import { useLibraryLayout } from '../../hooks/useLibraryLayout';

type ViewMode = 'grid' | 'list';
type SortOption = 'createdAt' | 'title' | 'authorName' | 'rating' | 'wordCount' | 'lastRead';

export default function LibraryPage() {
const router = useRouter();
const searchParams = useSearchParams();
const { layout } = useLibraryLayout();
const [stories, setStories] = useState<Story[]>([]);
const [tags, setTags] = useState<Tag[]>([]);
const [loading, setLoading] = useState(false);
@@ -30,29 +36,101 @@ export default function LibraryPage() {
|
||||
const [totalPages, setTotalPages] = useState(1);
|
||||
const [totalElements, setTotalElements] = useState(0);
|
||||
const [refreshTrigger, setRefreshTrigger] = useState(0);
|
||||
const [urlParamsProcessed, setUrlParamsProcessed] = useState(false);
|
||||
const [advancedFilters, setAdvancedFilters] = useState<AdvancedFilters>({});
|
||||
|
||||
// Initialize filters from URL parameters
|
||||
useEffect(() => {
|
||||
const tagsParam = searchParams.get('tags');
|
||||
if (tagsParam) {
|
||||
console.log('URL tag filter detected:', tagsParam);
|
||||
// Use functional updates to ensure all state changes happen together
|
||||
setSelectedTags([tagsParam]);
|
||||
setPage(0); // Reset to first page when applying URL filter
|
||||
}
|
||||
setUrlParamsProcessed(true);
|
||||
}, [searchParams]);
|
||||
|
||||
// Convert facet counts to Tag objects for the UI, enriched with full tag data
|
||||
const [fullTags, setFullTags] = useState<Tag[]>([]);
|
||||
|
||||
// Fetch full tag data for enrichment
|
||||
useEffect(() => {
|
||||
const fetchFullTags = async () => {
|
||||
try {
|
||||
const result = await tagApi.getTags({ size: 1000 }); // Get all tags
|
||||
setFullTags(result.content || []);
|
||||
} catch (error) {
|
||||
console.error('Failed to fetch full tag data:', error);
|
||||
setFullTags([]);
|
||||
}
|
||||
};
|
||||
|
||||
fetchFullTags();
|
||||
}, []);
|
||||
|
||||
// Convert facet counts to Tag objects for the UI
|
||||
const convertFacetsToTags = (facets?: Record<string, FacetCount[]>): Tag[] => {
|
||||
if (!facets || !facets.tagNames) {
|
||||
return [];
|
||||
}
|
||||
|
||||
return facets.tagNames.map(facet => ({
|
||||
id: facet.value, // Use tag name as ID since we don't have actual IDs from search results
|
||||
return facets.tagNames.map(facet => {
|
||||
// Find the full tag data by name
|
||||
const fullTag = fullTags.find(tag => tag.name.toLowerCase() === facet.value.toLowerCase());
|
||||
|
||||
return {
|
||||
id: fullTag?.id || facet.value, // Use actual ID if available, fallback to name
|
||||
name: facet.value,
|
||||
storyCount: facet.count
|
||||
}));
|
||||
storyCount: facet.count,
|
||||
// Include color and other metadata from the full tag data
|
||||
color: fullTag?.color,
|
||||
description: fullTag?.description,
|
||||
aliasCount: fullTag?.aliasCount,
|
||||
createdAt: fullTag?.createdAt,
|
||||
aliases: fullTag?.aliases
|
||||
};
|
||||
});
|
||||
};
|
||||
|
||||
// Enrich existing tags when fullTags are loaded
|
||||
useEffect(() => {
|
||||
if (fullTags.length > 0) {
|
||||
// Use functional update to get the current tags state
|
||||
setTags(currentTags => {
|
||||
if (currentTags.length > 0) {
|
||||
// Check if tags already have color data to avoid infinite loops
|
||||
const hasColors = currentTags.some(tag => tag.color);
|
||||
if (!hasColors) {
|
||||
// Re-enrich existing tags with color data
|
||||
return currentTags.map(tag => {
|
||||
const fullTag = fullTags.find(ft => ft.name.toLowerCase() === tag.name.toLowerCase());
|
||||
return {
|
||||
...tag,
|
||||
color: fullTag?.color,
|
||||
description: fullTag?.description,
|
||||
aliasCount: fullTag?.aliasCount,
|
||||
createdAt: fullTag?.createdAt,
|
||||
aliases: fullTag?.aliases,
|
||||
id: fullTag?.id || tag.id
|
||||
};
|
||||
});
|
||||
}
|
||||
}
|
||||
return currentTags; // Return unchanged if no enrichment needed
|
||||
});
|
||||
}
|
||||
}, [fullTags]); // Only run when fullTags change
|
||||
|
||||
// Debounce search to avoid too many API calls
|
||||
useEffect(() => {
|
||||
// Don't run search until URL parameters have been processed
|
||||
if (!urlParamsProcessed) return;
|
||||
|
||||
const debounceTimer = setTimeout(() => {
|
||||
const performSearch = async () => {
|
||||
try {
|
||||
// Use searchLoading for background search, loading only for initial load
|
||||
const isInitialLoad = stories.length === 0 && !searchQuery && selectedTags.length === 0;
|
||||
const isInitialLoad = stories.length === 0 && !searchQuery;
|
||||
if (isInitialLoad) {
|
||||
setLoading(true);
|
||||
} else {
|
||||
@@ -60,7 +138,7 @@ export default function LibraryPage() {
}

// Always use search API for consistency - use '*' for match-all when no query
const result = await searchApi.search({
const apiParams = {
query: searchQuery.trim() || '*',
page: page, // Use 0-based pagination consistently
size: 20,
@@ -68,7 +146,12 @@ export default function LibraryPage() {
sortBy: sortOption,
sortDir: sortDirection,
facetBy: ['tagNames'], // Request tag facets for the filter UI
});
// Advanced filters
...advancedFilters
};

console.log('Performing search with params:', apiParams);
const result = await searchApi.search(apiParams);

const currentStories = result?.results || [];
setStories(currentStories);
@@ -78,9 +161,11 @@ export default function LibraryPage() {
// Update tags from facets - these represent all matching stories, not just current page
const resultTags = convertFacetsToTags(result?.facets);
setTags(resultTags);

} catch (error) {
console.error('Failed to load stories:', error);
setStories([]);
setTags([]);
} finally {
setLoading(false);
setSearchLoading(false);
@@ -88,90 +173,35 @@ export default function LibraryPage() {
};

performSearch();
}, searchQuery ? 500 : 0); // 500ms debounce for search, immediate for other changes
}, searchQuery ? 500 : 0); // Debounce search queries, but load immediately for filters/pagination

return () => clearTimeout(debounceTimer);
}, [searchQuery, selectedTags, page, sortOption, sortDirection, refreshTrigger]);

// Reset page when search or filters change
const resetPage = () => {
if (page !== 0) {
setPage(0);
}
};

const handleTagToggle = (tagName: string) => {
setSelectedTags(prev => {
const newTags = prev.includes(tagName)
? prev.filter(t => t !== tagName)
: [...prev, tagName];
resetPage();
return newTags;
});
};
}, [searchQuery, selectedTags, sortOption, sortDirection, page, refreshTrigger, urlParamsProcessed, advancedFilters]);

const handleSearchChange = (e: React.ChangeEvent<HTMLInputElement>) => {
setSearchQuery(e.target.value);
resetPage();
};

const handleSortChange = (newSortOption: SortOption) => {
setSortOption(newSortOption);
// Set appropriate default direction for the sort option
if (newSortOption === 'title' || newSortOption === 'authorName') {
setSortDirection('asc'); // Alphabetical fields default to ascending
} else {
setSortDirection('desc'); // Numeric/date fields default to descending
}
resetPage();
};

const toggleSortDirection = () => {
setSortDirection(prev => prev === 'asc' ? 'desc' : 'asc');
resetPage();
};

const clearFilters = () => {
setSearchQuery('');
setSelectedTags([]);
resetPage();
setPage(0);
};

const handleStoryUpdate = () => {
// Trigger reload by incrementing refresh trigger
setRefreshTrigger(prev => prev + 1);
};

const handleRandomStory = async () => {
if (totalElements === 0) return;

try {
setRandomLoading(true);

// Build filter parameters based on current UI state
const filters: Record<string, any> = {};

// Include search query if present
if (searchQuery && searchQuery.trim() !== '' && searchQuery !== '*') {
filters.searchQuery = searchQuery.trim();
const randomStory = await storyApi.getRandomStory({
searchQuery: searchQuery || undefined,
tags: selectedTags.length > 0 ? selectedTags : undefined,
...advancedFilters
});
if (randomStory) {
router.push(`/stories/${randomStory.id}`);
} else {
alert('No stories available. Please add some stories first.');
}

// Include all selected tags
if (selectedTags.length > 0) {
filters.tags = selectedTags;
}

console.log('Getting random story with filters:', filters);

const randomStory = await storyApi.getRandomStory(filters);

if (!randomStory) {
// No stories match the current filters
alert('No stories match your current filters. Try clearing some filters or adding more stories to your library.');
return;
}

// Navigate to the random story's reading page
router.push(`/stories/${randomStory.id}/read`);

} catch (error) {
console.error('Failed to get random story:', error);
alert('Failed to get a random story. Please try again.');
@@ -180,6 +210,34 @@ export default function LibraryPage() {
|
||||
}
|
||||
};
|
||||
|
||||
const clearFilters = () => {
|
||||
setSearchQuery('');
|
||||
setSelectedTags([]);
|
||||
setAdvancedFilters({});
|
||||
setPage(0);
|
||||
setRefreshTrigger(prev => prev + 1);
|
||||
};
|
||||
|
||||
const handleTagToggle = (tagName: string) => {
|
||||
setSelectedTags(prev =>
|
||||
prev.includes(tagName)
|
||||
? prev.filter(t => t !== tagName)
|
||||
: [...prev, tagName]
|
||||
);
|
||||
setPage(0);
|
||||
setRefreshTrigger(prev => prev + 1);
|
||||
};
|
||||
|
||||
const handleSortDirectionToggle = () => {
|
||||
setSortDirection(prev => prev === 'asc' ? 'desc' : 'asc');
|
||||
};
|
||||
|
||||
const handleAdvancedFiltersChange = (filters: AdvancedFilters) => {
|
||||
setAdvancedFilters(filters);
|
||||
setPage(0);
|
||||
setRefreshTrigger(prev => prev + 1);
|
||||
};
|
||||
|
||||
if (loading) {
|
||||
return (
|
||||
<AppLayout>
|
||||
@@ -190,154 +248,61 @@ export default function LibraryPage() {
|
||||
);
|
||||
}
|
||||
|
||||
const handleSortChange = (option: string) => {
|
||||
setSortOption(option as SortOption);
|
||||
};
|
||||
|
||||
const layoutProps = {
|
||||
stories,
|
||||
tags,
|
||||
totalElements,
|
||||
searchQuery,
|
||||
selectedTags,
|
||||
viewMode,
|
||||
sortOption,
|
||||
sortDirection,
|
||||
advancedFilters,
|
||||
onSearchChange: handleSearchChange,
|
||||
onTagToggle: handleTagToggle,
|
||||
onViewModeChange: setViewMode,
|
||||
onSortChange: handleSortChange,
|
||||
onSortDirectionToggle: handleSortDirectionToggle,
|
||||
onAdvancedFiltersChange: handleAdvancedFiltersChange,
|
||||
onRandomStory: handleRandomStory,
|
||||
onClearFilters: clearFilters,
|
||||
};
|
||||
|
||||
const renderContent = () => {
|
||||
if (stories.length === 0 && !loading) {
|
||||
return (
|
||||
<AppLayout>
|
||||
<div className="space-y-6">
|
||||
{/* Header */}
|
||||
<div className="flex flex-col sm:flex-row sm:items-center sm:justify-between gap-4">
|
||||
<div>
|
||||
<h1 className="text-3xl font-bold theme-header">Your Story Library</h1>
|
||||
<p className="theme-text mt-1">
|
||||
{totalElements} {totalElements === 1 ? 'story' : 'stories'}
|
||||
{searchQuery || selectedTags.length > 0 ? ` found` : ` total`}
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<div className="flex gap-2">
|
||||
<Button
|
||||
onClick={handleRandomStory}
|
||||
disabled={randomLoading || totalElements === 0}
|
||||
variant="secondary"
|
||||
>
|
||||
{randomLoading ? '🎲 ...' : '🎲 Random Story'}
|
||||
</Button>
|
||||
<Button href="/import">
|
||||
Add New Story
|
||||
</Button>
|
||||
<Button href="/import/epub" variant="secondary">
|
||||
📖 Import EPUB
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Search and Filters */}
|
||||
<div className="space-y-4">
|
||||
{/* Search Bar */}
|
||||
<div className="flex flex-col sm:flex-row gap-4">
|
||||
<div className="flex-1 relative">
|
||||
<Input
|
||||
type="search"
|
||||
placeholder="Search by title, author, or tags..."
|
||||
value={searchQuery}
|
||||
onChange={handleSearchChange}
|
||||
className="w-full"
|
||||
/>
|
||||
{searchLoading && (
|
||||
<div className="absolute right-3 top-1/2 transform -translate-y-1/2">
|
||||
<div className="animate-spin h-4 w-4 border-2 border-theme-accent border-t-transparent rounded-full"></div>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* View Mode Toggle */}
|
||||
<div className="flex items-center gap-2">
|
||||
<button
|
||||
onClick={() => setViewMode('grid')}
|
||||
className={`p-2 rounded-lg transition-colors ${
|
||||
viewMode === 'grid'
|
||||
? 'theme-accent-bg text-white'
|
||||
: 'theme-card theme-text hover:bg-opacity-80'
|
||||
}`}
|
||||
aria-label="Grid view"
|
||||
>
|
||||
⊞
|
||||
</button>
|
||||
<button
|
||||
onClick={() => setViewMode('list')}
|
||||
className={`p-2 rounded-lg transition-colors ${
|
||||
viewMode === 'list'
|
||||
? 'theme-accent-bg text-white'
|
||||
: 'theme-card theme-text hover:bg-opacity-80'
|
||||
}`}
|
||||
aria-label="List view"
|
||||
>
|
||||
☰
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Sort and Tag Filters */}
|
||||
<div className="flex flex-col sm:flex-row gap-4">
|
||||
{/* Sort Options */}
|
||||
<div className="flex items-center gap-2">
|
||||
<label className="theme-text font-medium text-sm">Sort by:</label>
|
||||
<select
|
||||
value={sortOption}
|
||||
onChange={(e) => handleSortChange(e.target.value as SortOption)}
|
||||
className="px-3 py-1 rounded-lg theme-card theme-text theme-border border focus:outline-none focus:ring-2 focus:ring-theme-accent"
|
||||
>
|
||||
<option value="createdAt">Date Added</option>
|
||||
<option value="title">Title</option>
|
||||
<option value="authorName">Author</option>
|
||||
<option value="rating">Rating</option>
|
||||
<option value="wordCount">Word Count</option>
|
||||
<option value="lastRead">Last Read</option>
|
||||
</select>
|
||||
|
||||
{/* Sort Direction Toggle */}
|
||||
<button
|
||||
onClick={toggleSortDirection}
|
||||
className="p-2 rounded-lg theme-card theme-text hover:bg-opacity-80 transition-colors border theme-border"
|
||||
title={`Sort ${sortDirection === 'asc' ? 'Ascending' : 'Descending'}`}
|
||||
aria-label={`Toggle sort direction - currently ${sortDirection === 'asc' ? 'ascending' : 'descending'}`}
|
||||
>
|
||||
{sortDirection === 'asc' ? '↑' : '↓'}
|
||||
</button>
|
||||
</div>
|
||||
|
||||
{/* Clear Filters */}
|
||||
{(searchQuery || selectedTags.length > 0) && (
|
||||
<Button variant="ghost" size="sm" onClick={clearFilters}>
|
||||
Clear Filters
|
||||
</Button>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Tag Filter */}
|
||||
<TagFilter
|
||||
tags={tags}
|
||||
selectedTags={selectedTags}
|
||||
onTagToggle={handleTagToggle}
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Stories Display */}
|
||||
{stories.length === 0 && !loading ? (
|
||||
<div className="text-center py-20">
|
||||
<div className="theme-text text-lg mb-4">
|
||||
{searchQuery || selectedTags.length > 0
|
||||
? 'No stories match your filters'
|
||||
: 'No stories in your library yet'
|
||||
<div className="text-center py-12 theme-card theme-shadow rounded-lg">
|
||||
<p className="theme-text text-lg mb-4">
|
||||
{searchQuery || selectedTags.length > 0 || Object.values(advancedFilters).some(v => v !== undefined && v !== '' && v !== 'all' && v !== false)
|
||||
? 'No stories match your search criteria.'
|
||||
: 'Your library is empty.'
|
||||
}
|
||||
</div>
|
||||
{searchQuery || selectedTags.length > 0 ? (
|
||||
</p>
|
||||
{searchQuery || selectedTags.length > 0 || Object.values(advancedFilters).some(v => v !== undefined && v !== '' && v !== 'all' && v !== false) ? (
|
||||
<Button variant="ghost" onClick={clearFilters}>
|
||||
Clear Filters
|
||||
</Button>
|
||||
) : (
|
||||
<Button href="/import">
|
||||
<Button href="/add-story">
|
||||
Add Your First Story
|
||||
</Button>
|
||||
)}
|
||||
</div>
|
||||
) : (
|
||||
);
|
||||
}
|
||||
|
||||
return (
|
||||
<>
|
||||
<StoryMultiSelect
|
||||
stories={stories}
|
||||
viewMode={viewMode}
|
||||
onUpdate={handleStoryUpdate}
|
||||
allowMultiSelect={true}
|
||||
/>
|
||||
)}
|
||||
|
||||
{/* Pagination */}
|
||||
{totalPages > 1 && (
|
||||
@@ -363,7 +328,19 @@ export default function LibraryPage() {
|
||||
</Button>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
</>
|
||||
);
|
||||
};
|
||||
|
||||
const LayoutComponent = layout === 'sidebar' ? SidebarLayout :
|
||||
layout === 'toolbar' ? ToolbarLayout :
|
||||
MinimalLayout;
|
||||
|
||||
return (
|
||||
<AppLayout>
|
||||
<LayoutComponent {...layoutProps}>
|
||||
{renderContent()}
|
||||
</LayoutComponent>
|
||||
</AppLayout>
|
||||
);
|
||||
}
|
||||
@@ -193,24 +193,42 @@ async function processCombinedMode(
console.log(`Combined content character length: ${combinedContentString.length}`);
console.log(`Combined content parts count: ${combinedContent.length}`);

// Handle content truncation if needed
let finalContent = contentSizeInMB > 10 ?
combinedContentString.substring(0, Math.floor(combinedContentString.length * (10 / contentSizeInMB))) + '\n\n<!-- Content truncated due to size limit -->' :
combinedContentString;

let finalSummary = contentSizeInMB > 10 ? baseSummary + ' (Content truncated due to size limit)' : baseSummary;

// Check if combined content has images and mark for processing
const hasImages = /<img[^>]+src=['"'][^'"']*['"][^>]*>/i.test(finalContent);
if (hasImages) {
finalSummary += ' (Contains embedded images - will be processed after story creation)';
console.log(`Combined story contains embedded images - will need processing after creation`);
}

// Return the combined story data via progress update
const combinedStory = {
title: baseTitle,
author: baseAuthor,
content: contentSizeInMB > 10 ?
combinedContentString.substring(0, Math.floor(combinedContentString.length * (10 / contentSizeInMB))) + '\n\n<!-- Content truncated due to size limit -->' :
combinedContentString,
summary: contentSizeInMB > 10 ? baseSummary + ' (Content truncated due to size limit)' : baseSummary,
content: finalContent,
summary: finalSummary,
sourceUrl: baseSourceUrl,
tags: Array.from(combinedTags)
tags: Array.from(combinedTags),
hasImages: hasImages
};

// Send completion notification for combine mode
let completionMessage = `Combined scraping completed: ${totalWordCount.toLocaleString()} words from ${importedCount} stories`;
if (hasImages) {
completionMessage += ` (embedded images will be processed when story is created)`;
}

await sendProgressUpdate(sessionId, {
type: 'completed',
current: urls.length,
total: urls.length,
message: `Combined scraping completed: ${totalWordCount.toLocaleString()} words from ${importedCount} stories`,
message: completionMessage,
totalWordCount: totalWordCount,
combinedStory: combinedStory
});
@@ -347,6 +365,61 @@ async function processIndividualMode(

const createdStory = await createResponse.json();

// Process embedded images if content contains images
let imageProcessingWarnings: string[] = [];
const hasImages = /<img[^>]+src=['"'][^'"']*['"][^>]*>/i.test(scrapedStory.content);

if (hasImages) {
try {
console.log(`Processing embedded images for story: ${createdStory.id}`);
const imageProcessUrl = `http://backend:8080/api/stories/${createdStory.id}/process-content-images`;
const imageProcessResponse = await fetch(imageProcessUrl, {
method: 'POST',
headers: {
'Authorization': authorization,
'Content-Type': 'application/json',
},
body: JSON.stringify({ htmlContent: scrapedStory.content }),
});

if (imageProcessResponse.ok) {
const imageResult = await imageProcessResponse.json();
if (imageResult.hasWarnings && imageResult.warnings) {
imageProcessingWarnings = imageResult.warnings;
console.log(`Image processing completed with warnings for story ${createdStory.id}:`, imageResult.warnings);
} else {
console.log(`Image processing completed successfully for story ${createdStory.id}. Downloaded ${imageResult.downloadedImages?.length || 0} images.`);
}

// Update story content with processed images
if (imageResult.processedContent && imageResult.processedContent !== scrapedStory.content) {
const updateUrl = `http://backend:8080/api/stories/${createdStory.id}`;
const updateResponse = await fetch(updateUrl, {
method: 'PUT',
headers: {
'Authorization': authorization,
'Content-Type': 'application/json',
},
body: JSON.stringify({
contentHtml: imageResult.processedContent
}),
});

if (!updateResponse.ok) {
console.warn(`Failed to update story content after image processing for ${createdStory.id}`);
imageProcessingWarnings.push('Failed to update story content with processed images');
}
}
} else {
console.warn(`Image processing failed for story ${createdStory.id}:`, imageProcessResponse.status);
imageProcessingWarnings.push('Image processing failed');
}
} catch (error) {
console.error(`Error processing images for story ${createdStory.id}:`, error);
imageProcessingWarnings.push(`Image processing error: ${error instanceof Error ? error.message : 'Unknown error'}`);
}
}

results.push({
url: trimmedUrl,
status: 'imported',
@@ -356,17 +429,24 @@ async function processIndividualMode(
});
importedCount++;

console.log(`Successfully imported: ${scrapedStory.title} by ${scrapedStory.author} (ID: ${createdStory.id})`);
console.log(`Successfully imported: ${scrapedStory.title} by ${scrapedStory.author} (ID: ${createdStory.id})${hasImages ? ` with ${imageProcessingWarnings.length > 0 ? 'warnings' : 'successful image processing'}` : ''}`);

// Send progress update for successful import
let progressMessage = `Imported "${scrapedStory.title}" by ${scrapedStory.author}`;
if (hasImages) {
progressMessage += imageProcessingWarnings.length > 0 ? ' (with image warnings)' : ' (with images)';
}

await sendProgressUpdate(sessionId, {
type: 'progress',
current: i + 1,
total: urls.length,
message: `Imported "${scrapedStory.title}" by ${scrapedStory.author}`,
message: progressMessage,
url: trimmedUrl,
title: scrapedStory.title,
author: scrapedStory.author
author: scrapedStory.author,
hasImages: hasImages,
imageWarnings: imageProcessingWarnings
});

} catch (error) {
@@ -421,11 +501,11 @@ async function processIndividualMode(

console.log(`Bulk import completed: ${importedCount} imported, ${skippedCount} skipped, ${errorCount} errors`);

// Trigger Typesense reindex if any stories were imported
// Trigger OpenSearch reindex if any stories were imported
if (importedCount > 0) {
try {
console.log('Triggering Typesense reindex after bulk import...');
const reindexUrl = `http://backend:8080/api/stories/reindex-typesense`;
console.log('Triggering OpenSearch reindex after bulk import...');
const reindexUrl = `http://backend:8080/api/admin/search/opensearch/reindex`;
const reindexResponse = await fetch(reindexUrl, {
method: 'POST',
headers: {
@@ -436,12 +516,12 @@ async function processIndividualMode(

if (reindexResponse.ok) {
const reindexResult = await reindexResponse.json();
console.log('Typesense reindex completed:', reindexResult);
console.log('OpenSearch reindex completed:', reindexResult);
} else {
console.warn('Typesense reindex failed:', reindexResponse.status);
console.warn('OpenSearch reindex failed:', reindexResponse.status);
}
} catch (error) {
console.warn('Failed to trigger Typesense reindex:', error);
console.warn('Failed to trigger OpenSearch reindex:', error);
// Don't fail the whole request if reindex fails
}
}

@@ -19,6 +19,9 @@ export async function POST(request: NextRequest) {
const scraper = new StoryScraper();
const story = await scraper.scrapeStory(url);

// Check if scraped content contains embedded images
const hasImages = story.content ? /<img[^>]+src=['"'][^'"']*['"][^>]*>/i.test(story.content) : false;

// Debug logging
console.log('Scraped story data:', {
url: url,
@@ -28,10 +31,15 @@ export async function POST(request: NextRequest) {
contentLength: story.content?.length || 0,
contentPreview: story.content?.substring(0, 200) + '...',
tags: story.tags,
coverImage: story.coverImage
coverImage: story.coverImage,
hasEmbeddedImages: hasImages
});

return NextResponse.json(story);
// Add image processing flag to response for frontend handling
return NextResponse.json({
...story,
hasEmbeddedImages: hasImages
});
} catch (error) {
console.error('Story scraping error:', error);

@@ -1,10 +1,14 @@
'use client';

import { useState, useEffect } from 'react';
import { useRouter, useSearchParams } from 'next/navigation';
import AppLayout from '../../components/layout/AppLayout';
import { useTheme } from '../../lib/theme';
import TabNavigation from '../../components/ui/TabNavigation';
import AppearanceSettings from '../../components/settings/AppearanceSettings';
import ContentSettings from '../../components/settings/ContentSettings';
import SystemSettings from '../../components/settings/SystemSettings';
import Button from '../../components/ui/Button';
import { storyApi, authorApi, databaseApi } from '../../lib/api';
import { useTheme } from '../../lib/theme';

type FontFamily = 'serif' | 'sans' | 'mono';
type FontSize = 'small' | 'medium' | 'large' | 'extra-large';
@@ -26,28 +30,27 @@ const defaultSettings: Settings = {
readingSpeed: 200,
};

const tabs = [
{ id: 'appearance', label: 'Appearance', icon: '🎨' },
{ id: 'content', label: 'Content', icon: '🏷️' },
{ id: 'system', label: 'System', icon: '🔧' },
];

export default function SettingsPage() {
const router = useRouter();
const searchParams = useSearchParams();
const { theme, setTheme } = useTheme();
const [settings, setSettings] = useState<Settings>(defaultSettings);
const [saved, setSaved] = useState(false);
const [typesenseStatus, setTypesenseStatus] = useState<{
stories: { loading: boolean; message: string; success?: boolean };
authors: { loading: boolean; message: string; success?: boolean };
}>({
stories: { loading: false, message: '' },
authors: { loading: false, message: '' }
});
const [authorsSchema, setAuthorsSchema] = useState<any>(null);
const [showSchema, setShowSchema] = useState(false);
const [databaseStatus, setDatabaseStatus] = useState<{
completeBackup: { loading: boolean; message: string; success?: boolean };
completeRestore: { loading: boolean; message: string; success?: boolean };
completeClear: { loading: boolean; message: string; success?: boolean };
}>({
completeBackup: { loading: false, message: '' },
completeRestore: { loading: false, message: '' },
completeClear: { loading: false, message: '' }
});
const [activeTab, setActiveTab] = useState('appearance');

// Initialize tab from URL parameter
useEffect(() => {
const tabFromUrl = searchParams.get('tab');
if (tabFromUrl && tabs.some(tab => tab.id === tabFromUrl)) {
setActiveTab(tabFromUrl);
}
}, [searchParams]);

// Load settings from localStorage on mount
useEffect(() => {
@@ -65,6 +68,13 @@ export default function SettingsPage() {
}
}, [theme]);

// Update URL when tab changes
const handleTabChange = (tabId: string) => {
setActiveTab(tabId);
const newUrl = `/settings?tab=${tabId}`;
router.replace(newUrl, { scroll: false });
};

// Save settings to localStorage
const saveSettings = () => {
localStorage.setItem('storycove-settings', JSON.stringify(settings));
@@ -106,625 +116,58 @@ export default function SettingsPage() {
|
||||
setSettings(prev => ({ ...prev, [key]: value }));
|
||||
};
|
||||
|
||||
const handleTypesenseOperation = async (
|
||||
type: 'stories' | 'authors',
|
||||
operation: 'reindex' | 'recreate',
|
||||
apiCall: () => Promise<{ success: boolean; message: string; count?: number; error?: string }>
|
||||
) => {
|
||||
setTypesenseStatus(prev => ({
|
||||
...prev,
|
||||
[type]: { loading: true, message: 'Processing...', success: undefined }
|
||||
}));
|
||||
|
||||
try {
|
||||
const result = await apiCall();
|
||||
setTypesenseStatus(prev => ({
|
||||
...prev,
|
||||
[type]: {
|
||||
loading: false,
|
||||
message: result.success ? result.message : result.error || 'Operation failed',
|
||||
success: result.success
|
||||
}
|
||||
}));
|
||||
|
||||
// Clear message after 5 seconds
|
||||
setTimeout(() => {
|
||||
setTypesenseStatus(prev => ({
|
||||
...prev,
|
||||
[type]: { loading: false, message: '', success: undefined }
|
||||
}));
|
||||
}, 5000);
|
||||
} catch (error) {
|
||||
setTypesenseStatus(prev => ({
|
||||
...prev,
|
||||
[type]: {
|
||||
loading: false,
|
||||
message: 'Network error occurred',
|
||||
success: false
|
||||
}
|
||||
}));
|
||||
|
||||
setTimeout(() => {
|
||||
setTypesenseStatus(prev => ({
|
||||
...prev,
|
||||
[type]: { loading: false, message: '', success: undefined }
|
||||
}));
|
||||
}, 5000);
|
||||
}
|
||||
const resetToDefaults = () => {
|
||||
setSettings({ ...defaultSettings, theme });
|
||||
};
|
||||
|
||||
const fetchAuthorsSchema = async () => {
|
||||
try {
|
||||
const result = await authorApi.getTypesenseSchema();
|
||||
if (result.success) {
|
||||
setAuthorsSchema(result.schema);
|
||||
} else {
|
||||
setAuthorsSchema({ error: result.error });
|
||||
}
|
||||
} catch (error) {
|
||||
setAuthorsSchema({ error: 'Failed to fetch schema' });
|
||||
}
|
||||
};
|
||||
|
||||
|
||||
const handleCompleteBackup = async () => {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeBackup: { loading: true, message: 'Creating complete backup...', success: undefined }
|
||||
}));
|
||||
|
||||
try {
|
||||
const backupBlob = await databaseApi.backupComplete();
|
||||
|
||||
// Create download link
|
||||
const url = window.URL.createObjectURL(backupBlob);
|
||||
const link = document.createElement('a');
|
||||
link.href = url;
|
||||
|
||||
const timestamp = new Date().toISOString().replace(/[:.]/g, '-').slice(0, 19);
|
||||
link.download = `storycove_complete_backup_${timestamp}.zip`;
|
||||
|
||||
document.body.appendChild(link);
|
||||
link.click();
|
||||
document.body.removeChild(link);
|
||||
window.URL.revokeObjectURL(url);
|
||||
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeBackup: { loading: false, message: 'Complete backup downloaded successfully', success: true }
|
||||
}));
|
||||
} catch (error: any) {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeBackup: { loading: false, message: error.message || 'Complete backup failed', success: false }
|
||||
}));
|
||||
}
|
||||
|
||||
// Clear message after 5 seconds
|
||||
setTimeout(() => {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeBackup: { loading: false, message: '', success: undefined }
|
||||
}));
|
||||
}, 5000);
|
||||
};
|
||||
|
||||
const handleCompleteRestore = async (event: React.ChangeEvent<HTMLInputElement>) => {
|
||||
const file = event.target.files?.[0];
|
||||
if (!file) return;
|
||||
|
||||
// Reset the input so the same file can be selected again
|
||||
event.target.value = '';
|
||||
|
||||
if (!file.name.endsWith('.zip')) {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeRestore: { loading: false, message: 'Please select a .zip file', success: false }
|
||||
}));
|
||||
return;
|
||||
}
|
||||
|
||||
const confirmed = window.confirm(
|
||||
'Are you sure you want to restore the complete backup? This will PERMANENTLY DELETE all current data AND files (cover images, avatars) and replace them with the backup data. This action cannot be undone!'
|
||||
const renderTabContent = () => {
|
||||
switch (activeTab) {
|
||||
case 'appearance':
|
||||
return (
|
||||
<AppearanceSettings
|
||||
settings={settings}
|
||||
onSettingChange={updateSetting}
|
||||
/>
|
||||
);
|
||||
|
||||
if (!confirmed) return;
|
||||
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeRestore: { loading: true, message: 'Restoring complete backup...', success: undefined }
|
||||
}));
|
||||
|
||||
try {
|
||||
const result = await databaseApi.restoreComplete(file);
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeRestore: {
|
||||
loading: false,
|
||||
message: result.success ? result.message : result.message,
|
||||
success: result.success
|
||||
case 'content':
|
||||
return <ContentSettings />;
|
||||
case 'system':
|
||||
return <SystemSettings />;
|
||||
default:
|
||||
return <AppearanceSettings settings={settings} onSettingChange={updateSetting} />;
|
||||
}
|
||||
}));
|
||||
} catch (error: any) {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeRestore: { loading: false, message: error.message || 'Complete restore failed', success: false }
|
||||
}));
|
||||
}
|
||||
|
||||
// Clear message after 10 seconds for restore (longer because it's important)
|
||||
setTimeout(() => {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeRestore: { loading: false, message: '', success: undefined }
|
||||
}));
|
||||
}, 10000);
|
||||
};
|
||||
|
||||
const handleCompleteClear = async () => {
|
||||
const confirmed = window.confirm(
|
||||
'Are you ABSOLUTELY SURE you want to clear the entire database AND all files? This will PERMANENTLY DELETE ALL stories, authors, series, tags, collections, AND all uploaded images (covers, avatars). This action cannot be undone!'
|
||||
);
|
||||
|
||||
if (!confirmed) return;
|
||||
|
||||
const doubleConfirmed = window.confirm(
|
||||
'This is your final warning! Clicking OK will DELETE EVERYTHING in your StoryCove database AND all uploaded files. Are you completely certain you want to proceed?'
|
||||
);
|
||||
|
||||
if (!doubleConfirmed) return;
|
||||
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeClear: { loading: true, message: 'Clearing database and files...', success: undefined }
|
||||
}));
|
||||
|
||||
try {
|
||||
const result = await databaseApi.clearComplete();
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeClear: {
|
||||
loading: false,
|
||||
message: result.success
|
||||
? `Database and files cleared successfully. Deleted ${result.deletedRecords} records.`
|
||||
: result.message,
|
||||
success: result.success
|
||||
}
|
||||
}));
|
||||
} catch (error: any) {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeClear: { loading: false, message: error.message || 'Clear operation failed', success: false }
|
||||
}));
|
||||
}
|
||||
|
||||
// Clear message after 10 seconds for clear (longer because it's important)
|
||||
setTimeout(() => {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeClear: { loading: false, message: '', success: undefined }
|
||||
}));
|
||||
}, 10000);
|
||||
};
|
||||
|
||||
return (
|
||||
<AppLayout>
|
||||
<div className="max-w-2xl mx-auto space-y-8">
|
||||
<div className="max-w-4xl mx-auto space-y-6">
|
||||
{/* Header */}
|
||||
<div>
|
||||
<h1 className="text-3xl font-bold theme-header">Settings</h1>
|
||||
<p className="theme-text mt-2">
|
||||
Customize your StoryCove reading experience
|
||||
Customize your StoryCove experience and manage system settings
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<div className="space-y-6">
|
||||
{/* Theme Settings */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<h2 className="text-xl font-semibold theme-header mb-4">Appearance</h2>
|
||||
|
||||
<div className="space-y-4">
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Theme
|
||||
</label>
|
||||
<div className="flex gap-4">
|
||||
<button
|
||||
onClick={() => updateSetting('theme', 'light')}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors ${
|
||||
settings.theme === 'light'
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
☀️ Light
|
||||
</button>
|
||||
<button
|
||||
onClick={() => updateSetting('theme', 'dark')}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors ${
|
||||
settings.theme === 'dark'
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
🌙 Dark
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Reading Settings */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<h2 className="text-xl font-semibold theme-header mb-4">Reading Experience</h2>
|
||||
|
||||
<div className="space-y-6">
|
||||
{/* Font Family */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Font Family
|
||||
</label>
|
||||
<div className="flex gap-4 flex-wrap">
|
||||
<button
|
||||
onClick={() => updateSetting('fontFamily', 'serif')}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors font-serif ${
|
||||
settings.fontFamily === 'serif'
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
Serif
|
||||
</button>
|
||||
<button
|
||||
onClick={() => updateSetting('fontFamily', 'sans')}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors font-sans ${
|
||||
settings.fontFamily === 'sans'
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
Sans Serif
|
||||
</button>
|
||||
<button
|
||||
onClick={() => updateSetting('fontFamily', 'mono')}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors font-mono ${
|
||||
settings.fontFamily === 'mono'
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
Monospace
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Font Size */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Font Size
|
||||
</label>
|
||||
<div className="flex gap-4 flex-wrap">
|
||||
{(['small', 'medium', 'large', 'extra-large'] as FontSize[]).map((size) => (
|
||||
<button
|
||||
key={size}
|
||||
onClick={() => updateSetting('fontSize', size)}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors capitalize ${
|
||||
settings.fontSize === size
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
{size.replace('-', ' ')}
|
||||
</button>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Reading Width */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Reading Width
|
||||
</label>
|
||||
<div className="flex gap-4">
|
||||
{(['narrow', 'medium', 'wide'] as ReadingWidth[]).map((width) => (
|
||||
<button
|
||||
key={width}
|
||||
onClick={() => updateSetting('readingWidth', width)}
|
||||
className={`px-4 py-2 rounded-lg border transition-colors capitalize ${
|
||||
settings.readingWidth === width
|
||||
? 'theme-accent-bg text-white border-transparent'
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
{width}
|
||||
</button>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Reading Speed */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Reading Speed (words per minute)
|
||||
</label>
|
||||
<div className="flex items-center gap-4">
|
||||
<input
|
||||
type="range"
|
||||
min="100"
|
||||
max="400"
|
||||
step="25"
|
||||
value={settings.readingSpeed}
|
||||
onChange={(e) => updateSetting('readingSpeed', parseInt(e.target.value))}
|
||||
className="flex-1 h-2 bg-gray-200 rounded-lg appearance-none cursor-pointer dark:bg-gray-700"
|
||||
{/* Tab Navigation */}
|
||||
<TabNavigation
|
||||
tabs={tabs}
|
||||
activeTab={activeTab}
|
||||
onTabChange={handleTabChange}
|
||||
className="mb-6"
|
||||
/>
|
||||
<div className="min-w-[80px] text-center">
|
||||
<span className="text-lg font-medium theme-header">{settings.readingSpeed}</span>
|
||||
<div className="text-xs theme-text">WPM</div>
|
||||
</div>
|
||||
</div>
|
||||
<div className="flex justify-between text-xs theme-text mt-1">
|
||||
<span>Slow (100)</span>
|
||||
<span>Average (200)</span>
|
||||
<span>Fast (400)</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Tab Content */}
|
||||
<div className="min-h-[400px]">
|
||||
{renderTabContent()}
|
||||
</div>
|
||||
|
||||
{/* Preview */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<h2 className="text-xl font-semibold theme-header mb-4">Preview</h2>
|
||||
|
||||
<div
|
||||
className="p-4 theme-card border theme-border rounded-lg"
|
||||
style={{
|
||||
fontFamily: settings.fontFamily === 'serif' ? 'Georgia, Times, serif'
|
||||
: settings.fontFamily === 'sans' ? 'Inter, system-ui, sans-serif'
|
||||
: 'Monaco, Consolas, monospace',
|
||||
fontSize: settings.fontSize === 'small' ? '14px'
|
||||
: settings.fontSize === 'medium' ? '16px'
|
||||
: settings.fontSize === 'large' ? '18px'
|
||||
: '20px',
|
||||
maxWidth: settings.readingWidth === 'narrow' ? '600px'
|
||||
: settings.readingWidth === 'medium' ? '800px'
|
||||
: '1000px',
|
||||
}}
|
||||
>
|
||||
<h3 className="text-xl font-bold theme-header mb-2">Sample Story Title</h3>
|
||||
<p className="theme-text mb-4">by Sample Author</p>
|
||||
<p className="theme-text leading-relaxed">
|
||||
This is how your story text will look with the current settings.
|
||||
The quick brown fox jumps over the lazy dog. Lorem ipsum dolor sit amet,
|
||||
consectetur adipiscing elit. Sed do eiusmod tempor incididunt ut labore
|
||||
et dolore magna aliqua.
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Typesense Search Management */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<h2 className="text-xl font-semibold theme-header mb-4">Search Index Management</h2>
|
||||
<p className="theme-text mb-6">
|
||||
Manage the Typesense search indexes for stories and authors. Use these tools if search functionality isn't working properly.
|
||||
</p>
|
||||
|
||||
<div className="space-y-6">
|
||||
{/* Stories Section */}
|
||||
<div className="border theme-border rounded-lg p-4">
|
||||
<h3 className="text-lg font-semibold theme-header mb-3">Stories Index</h3>
|
||||
<div className="flex flex-col sm:flex-row gap-3 mb-3">
|
||||
<Button
|
||||
onClick={() => handleTypesenseOperation('stories', 'reindex', storyApi.reindexTypesense)}
|
||||
disabled={typesenseStatus.stories.loading}
|
||||
loading={typesenseStatus.stories.loading}
|
||||
variant="ghost"
|
||||
className="flex-1"
|
||||
>
|
||||
{typesenseStatus.stories.loading ? 'Reindexing...' : 'Reindex Stories'}
|
||||
</Button>
|
||||
<Button
|
||||
onClick={() => handleTypesenseOperation('stories', 'recreate', storyApi.recreateTypesenseCollection)}
|
||||
disabled={typesenseStatus.stories.loading}
|
||||
loading={typesenseStatus.stories.loading}
|
||||
variant="secondary"
|
||||
className="flex-1"
|
||||
>
|
||||
{typesenseStatus.stories.loading ? 'Recreating...' : 'Recreate Collection'}
|
||||
</Button>
|
||||
</div>
|
||||
{typesenseStatus.stories.message && (
|
||||
<div className={`text-sm p-2 rounded ${
|
||||
typesenseStatus.stories.success
|
||||
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
||||
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
||||
}`}>
|
||||
{typesenseStatus.stories.message}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Authors Section */}
|
||||
<div className="border theme-border rounded-lg p-4">
|
||||
<h3 className="text-lg font-semibold theme-header mb-3">Authors Index</h3>
|
||||
<div className="flex flex-col sm:flex-row gap-3 mb-3">
|
||||
<Button
|
||||
onClick={() => handleTypesenseOperation('authors', 'reindex', authorApi.reindexTypesense)}
|
||||
disabled={typesenseStatus.authors.loading}
|
||||
loading={typesenseStatus.authors.loading}
|
||||
variant="ghost"
|
||||
className="flex-1"
|
||||
>
|
||||
{typesenseStatus.authors.loading ? 'Reindexing...' : 'Reindex Authors'}
|
||||
</Button>
|
||||
<Button
|
||||
onClick={() => handleTypesenseOperation('authors', 'recreate', authorApi.recreateTypesenseCollection)}
|
||||
disabled={typesenseStatus.authors.loading}
|
||||
loading={typesenseStatus.authors.loading}
|
||||
variant="secondary"
|
||||
className="flex-1"
|
||||
>
|
||||
{typesenseStatus.authors.loading ? 'Recreating...' : 'Recreate Collection'}
|
||||
</Button>
|
||||
</div>
|
||||
{typesenseStatus.authors.message && (
|
||||
<div className={`text-sm p-2 rounded ${
|
||||
typesenseStatus.authors.success
|
||||
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
||||
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
||||
}`}>
|
||||
{typesenseStatus.authors.message}
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Debug Schema Section */}
|
||||
<div className="border-t theme-border pt-3">
|
||||
<div className="flex items-center gap-2 mb-2">
|
||||
<Button
|
||||
onClick={fetchAuthorsSchema}
|
||||
variant="ghost"
|
||||
className="text-xs"
|
||||
>
|
||||
Inspect Schema
|
||||
</Button>
|
||||
<Button
|
||||
onClick={() => setShowSchema(!showSchema)}
|
||||
variant="ghost"
|
||||
className="text-xs"
|
||||
disabled={!authorsSchema}
|
||||
>
|
||||
{showSchema ? 'Hide' : 'Show'} Schema
|
||||
</Button>
|
||||
</div>
|
||||
|
||||
{showSchema && authorsSchema && (
|
||||
<div className="text-xs theme-text bg-gray-50 dark:bg-gray-800 p-3 rounded border overflow-auto max-h-48">
|
||||
<pre>{JSON.stringify(authorsSchema, null, 2)}</pre>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div className="text-sm theme-text bg-blue-50 dark:bg-blue-900/20 p-3 rounded-lg">
|
||||
<p className="font-medium mb-1">When to use these tools:</p>
|
||||
<ul className="text-xs space-y-1 ml-4">
|
||||
<li>• <strong>Reindex:</strong> Refresh search data while keeping the existing schema</li>
|
||||
<li>• <strong>Recreate Collection:</strong> Delete and rebuild the entire search index (fixes schema issues)</li>
|
||||
</ul>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Database Management */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<h2 className="text-xl font-semibold theme-header mb-4">Database Management</h2>
|
||||
<p className="theme-text mb-6">
|
||||
Backup, restore, or clear your StoryCove database and files. These comprehensive operations include both your data and uploaded images.
|
||||
</p>
|
||||
|
||||
<div className="space-y-6">
|
||||
{/* Complete Backup Section */}
|
||||
<div className="border theme-border rounded-lg p-4 border-blue-200 dark:border-blue-800">
|
||||
<h3 className="text-lg font-semibold theme-header mb-3">📦 Create Backup</h3>
|
||||
<p className="text-sm theme-text mb-3">
|
||||
Download a complete backup as a ZIP file. This includes your database AND all uploaded files (cover images, avatars). This is a comprehensive backup of your entire StoryCove installation.
|
||||
</p>
|
||||
<Button
|
||||
onClick={handleCompleteBackup}
|
||||
disabled={databaseStatus.completeBackup.loading}
|
||||
loading={databaseStatus.completeBackup.loading}
|
||||
variant="primary"
|
||||
className="w-full sm:w-auto"
|
||||
>
|
||||
{databaseStatus.completeBackup.loading ? 'Creating Backup...' : 'Download Backup'}
|
||||
</Button>
|
||||
{databaseStatus.completeBackup.message && (
|
||||
<div className={`text-sm p-2 rounded mt-3 ${
|
||||
databaseStatus.completeBackup.success
|
||||
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
||||
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
||||
}`}>
|
||||
{databaseStatus.completeBackup.message}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Restore Section */}
|
||||
<div className="border theme-border rounded-lg p-4 border-orange-200 dark:border-orange-800">
|
||||
<h3 className="text-lg font-semibold theme-header mb-3">📥 Restore Backup</h3>
|
||||
<p className="text-sm theme-text mb-3">
|
||||
<strong className="text-orange-600 dark:text-orange-400">⚠️ Warning:</strong> This will completely replace your current database AND all files with the backup. All existing data and uploaded files will be permanently deleted.
|
||||
</p>
|
||||
<div className="flex items-center gap-3">
|
||||
<input
|
||||
type="file"
|
||||
accept=".zip"
|
||||
onChange={handleCompleteRestore}
|
||||
disabled={databaseStatus.completeRestore.loading}
|
||||
className="flex-1 text-sm theme-text file:mr-4 file:py-2 file:px-4 file:rounded-lg file:border-0 file:text-sm file:font-medium file:theme-accent-bg file:text-white hover:file:bg-opacity-90 file:cursor-pointer"
|
||||
/>
|
||||
</div>
|
||||
{databaseStatus.completeRestore.message && (
|
||||
<div className={`text-sm p-2 rounded mt-3 ${
|
||||
databaseStatus.completeRestore.success
|
||||
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
||||
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
||||
}`}>
|
||||
{databaseStatus.completeRestore.message}
|
||||
</div>
|
||||
)}
|
||||
{databaseStatus.completeRestore.loading && (
|
||||
<div className="text-sm theme-text mt-3 flex items-center gap-2">
|
||||
<div className="animate-spin w-4 h-4 border-2 border-blue-500 border-t-transparent rounded-full"></div>
|
||||
Restoring backup...
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Clear Everything Section */}
|
||||
<div className="border theme-border rounded-lg p-4 border-red-200 dark:border-red-800 bg-red-50 dark:bg-red-900/10">
|
||||
<h3 className="text-lg font-semibold theme-header mb-3">🗑️ Clear Everything</h3>
|
||||
<p className="text-sm theme-text mb-3">
|
||||
<strong className="text-red-600 dark:text-red-400">⚠️ Danger Zone:</strong> This will permanently delete ALL data from your database AND all uploaded files (cover images, avatars). Everything will be completely removed. This action cannot be undone!
|
||||
</p>
|
||||
<Button
|
||||
onClick={handleCompleteClear}
|
||||
disabled={databaseStatus.completeClear.loading}
|
||||
loading={databaseStatus.completeClear.loading}
|
||||
variant="secondary"
|
||||
className="w-full sm:w-auto bg-red-700 hover:bg-red-800 text-white border-red-700"
|
||||
>
|
||||
{databaseStatus.completeClear.loading ? 'Clearing Everything...' : 'Clear Everything'}
|
||||
</Button>
|
||||
{databaseStatus.completeClear.message && (
|
||||
<div className={`text-sm p-2 rounded mt-3 ${
|
||||
databaseStatus.completeClear.success
|
||||
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
||||
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
||||
}`}>
|
||||
{databaseStatus.completeClear.message}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
<div className="text-sm theme-text bg-blue-50 dark:bg-blue-900/20 p-3 rounded-lg">
|
||||
<p className="font-medium mb-1">💡 Best Practices:</p>
|
||||
<ul className="text-xs space-y-1 ml-4">
|
||||
<li>• <strong>Always backup</strong> before performing restore or clear operations</li>
|
||||
<li>• <strong>Store backups safely</strong> in multiple locations for important data</li>
|
||||
<li>• <strong>Test restores</strong> in a development environment when possible</li>
|
||||
<li>• <strong>Backup files (.zip)</strong> contain both database and all uploaded files</li>
|
||||
<li>• <strong>Verify backup files</strong> are complete before relying on them</li>
|
||||
</ul>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Actions */}
|
||||
<div className="flex justify-end gap-4">
|
||||
{/* Save Actions - Only show for Appearance tab */}
|
||||
{activeTab === 'appearance' && (
|
||||
<div className="flex justify-end gap-4 pt-6 border-t theme-border">
|
||||
<Button
|
||||
variant="ghost"
|
||||
onClick={() => {
|
||||
setSettings({ ...defaultSettings, theme });
|
||||
}}
|
||||
onClick={resetToDefaults}
|
||||
>
|
||||
Reset to Defaults
|
||||
</Button>
|
||||
@@ -736,7 +179,7 @@ export default function SettingsPage() {
|
||||
{saved ? '✓ Saved!' : 'Save Settings'}
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
</AppLayout>
|
||||
);
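For context, the Reindex and Recreate Collection buttons in the Search Index Management card above all delegate to handleTypesenseOperation(index, operation, apiCall). Its body is not part of this hunk, so the sketch below only infers a plausible shape from the call sites and from the status fields the JSX reads (loading, message, success); names such as SearchOperationResult and the setter callback are assumptions, not code from this change.

```typescript
// Hedged sketch of the generic search-index operation handler; inferred from usage, not from the PR.
type SearchIndex = 'stories' | 'authors';
type IndexStatus = { loading: boolean; message: string; success?: boolean };
type SearchOperationResult = { success: boolean; message: string };

function makeSearchIndexHandler(
  setStatus: (index: SearchIndex, status: IndexStatus) => void
) {
  return async (
    index: SearchIndex,
    operation: 'reindex' | 'recreate',
    apiCall: () => Promise<SearchOperationResult>
  ): Promise<void> => {
    // Mark the targeted index as busy so the button shows its loading state
    setStatus(index, { loading: true, message: `Running ${operation}...`, success: undefined });
    try {
      const result = await apiCall();
      setStatus(index, { loading: false, message: result.message, success: result.success });
    } catch (error: any) {
      setStatus(index, { loading: false, message: error.message || `${operation} failed`, success: false });
    }
  };
}
```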
|
||||
|
||||
799
frontend/src/app/settings/tag-maintenance/page.tsx
Normal file
@@ -0,0 +1,799 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useEffect } from 'react';
|
||||
import AppLayout from '../../../components/layout/AppLayout';
|
||||
import { tagApi } from '../../../lib/api';
|
||||
import { Tag } from '../../../types/api';
|
||||
import Button from '../../../components/ui/Button';
|
||||
import { Input } from '../../../components/ui/Input';
|
||||
import LoadingSpinner from '../../../components/ui/LoadingSpinner';
|
||||
import TagDisplay from '../../../components/tags/TagDisplay';
|
||||
import TagEditModal from '../../../components/tags/TagEditModal';
|
||||
|
||||
export default function TagMaintenancePage() {
|
||||
const [tags, setTags] = useState<Tag[]>([]);
|
||||
const [loading, setLoading] = useState(true);
|
||||
const [searchQuery, setSearchQuery] = useState('');
|
||||
const [sortBy, setSortBy] = useState<'name' | 'storyCount' | 'createdAt'>('name');
|
||||
const [sortDirection, setSortDirection] = useState<'asc' | 'desc'>('asc');
|
||||
const [selectedTag, setSelectedTag] = useState<Tag | null>(null);
|
||||
const [isEditModalOpen, setIsEditModalOpen] = useState(false);
|
||||
const [isCreateModalOpen, setIsCreateModalOpen] = useState(false);
|
||||
const [selectedTagIds, setSelectedTagIds] = useState<Set<string>>(new Set());
|
||||
const [isMergeModalOpen, setIsMergeModalOpen] = useState(false);
|
||||
const [mergeTargetTagId, setMergeTargetTagId] = useState<string>('');
|
||||
const [mergePreview, setMergePreview] = useState<any>(null);
|
||||
const [merging, setMerging] = useState(false);
|
||||
const [isMergeSuggestionsModalOpen, setIsMergeSuggestionsModalOpen] = useState(false);
|
||||
const [mergeSuggestions, setMergeSuggestions] = useState<Array<{
|
||||
group: Tag[];
|
||||
similarity: number;
|
||||
reason: string;
|
||||
}>>([]);
|
||||
|
||||
useEffect(() => {
|
||||
loadTags();
|
||||
}, []);
|
||||
|
||||
const loadTags = async () => {
|
||||
try {
|
||||
setLoading(true);
|
||||
const result = await tagApi.getTags({
|
||||
page: 0,
|
||||
size: 1000, // Load all tags for maintenance
|
||||
sortBy,
|
||||
sortDir: sortDirection
|
||||
});
|
||||
setTags(result.content || []);
|
||||
} catch (error) {
|
||||
console.error('Failed to load tags:', error);
|
||||
setTags([]);
|
||||
} finally {
|
||||
setLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
const handleTagSave = (updatedTag: Tag) => {
|
||||
if (selectedTag) {
|
||||
// Update existing tag
|
||||
setTags(prev => prev.map(tag =>
|
||||
tag.id === updatedTag.id ? updatedTag : tag
|
||||
));
|
||||
} else {
|
||||
// Add new tag
|
||||
setTags(prev => [...prev, updatedTag]);
|
||||
}
|
||||
setSelectedTag(null);
|
||||
setIsEditModalOpen(false);
|
||||
setIsCreateModalOpen(false);
|
||||
};
|
||||
|
||||
const handleTagDelete = (deletedTag: Tag) => {
|
||||
setTags(prev => prev.filter(tag => tag.id !== deletedTag.id));
|
||||
setSelectedTag(null);
|
||||
setIsEditModalOpen(false);
|
||||
};
|
||||
|
||||
const handleEditTag = (tag: Tag) => {
|
||||
setSelectedTag(tag);
|
||||
setIsEditModalOpen(true);
|
||||
};
|
||||
|
||||
const handleCreateTag = () => {
|
||||
setSelectedTag(null);
|
||||
setIsCreateModalOpen(true);
|
||||
};
|
||||
|
||||
const handleSortChange = (newSortBy: typeof sortBy) => {
|
||||
if (newSortBy === sortBy) {
|
||||
setSortDirection(prev => prev === 'asc' ? 'desc' : 'asc');
|
||||
} else {
|
||||
setSortBy(newSortBy);
|
||||
setSortDirection('asc');
|
||||
}
|
||||
};
|
||||
|
||||
const handleTagSelection = (tagId: string, selected: boolean) => {
|
||||
setSelectedTagIds(prev => {
|
||||
const newSet = new Set(prev);
|
||||
if (selected) {
|
||||
newSet.add(tagId);
|
||||
} else {
|
||||
newSet.delete(tagId);
|
||||
}
|
||||
return newSet;
|
||||
});
|
||||
};
|
||||
|
||||
const handleSelectAll = (selected: boolean) => {
|
||||
if (selected) {
|
||||
setSelectedTagIds(new Set(filteredTags.map(tag => tag.id)));
|
||||
} else {
|
||||
setSelectedTagIds(new Set());
|
||||
}
|
||||
};
|
||||
|
||||
const handleSelectUnused = () => {
|
||||
const unusedTags = filteredTags.filter(tag => !tag.storyCount || tag.storyCount === 0);
|
||||
setSelectedTagIds(new Set(unusedTags.map(tag => tag.id)));
|
||||
};
|
||||
|
||||
const handleDeleteSelected = async () => {
|
||||
if (selectedTagIds.size === 0) return;
|
||||
|
||||
const confirmation = confirm(
|
||||
`Are you sure you want to delete ${selectedTagIds.size} selected tag(s)? This action cannot be undone.`
|
||||
);
|
||||
|
||||
if (!confirmation) return;
|
||||
|
||||
try {
|
||||
const deletePromises = Array.from(selectedTagIds).map(tagId =>
|
||||
tagApi.deleteTag(tagId)
|
||||
);
|
||||
|
||||
await Promise.all(deletePromises);
|
||||
|
||||
// Reload tags and reset selection
|
||||
await loadTags();
|
||||
setSelectedTagIds(new Set());
|
||||
} catch (error) {
|
||||
console.error('Failed to delete tags:', error);
|
||||
alert('Failed to delete some tags. Please try again.');
|
||||
}
|
||||
};
|
||||
|
||||
const generateMergeSuggestions = () => {
|
||||
const suggestions: Array<{
|
||||
group: Tag[];
|
||||
similarity: number;
|
||||
reason: string;
|
||||
}> = [];
|
||||
|
||||
// Helper function to calculate similarity between two strings
|
||||
const calculateSimilarity = (str1: string, str2: string): number => {
|
||||
const s1 = str1.toLowerCase();
|
||||
const s2 = str2.toLowerCase();
|
||||
|
||||
// Exact match
|
||||
if (s1 === s2) return 1.0;
|
||||
|
||||
// Check for common patterns
|
||||
const patterns = [
|
||||
// Plural vs singular
|
||||
{ regex: /(.+)s$/, match: (a: string, b: string) => a === b + 's' || b === a + 's' },
|
||||
// Hyphen vs underscore vs space
|
||||
{ regex: /[-_\s]/, match: (a: string, b: string) =>
|
||||
a.replace(/[-_\s]/g, '') === b.replace(/[-_\s]/g, '') },
|
||||
// Common abbreviations
|
||||
{ regex: /\b(and|&)\b/, match: (a: string, b: string) =>
|
||||
a.replace(/\band\b/g, '&') === b || a === b.replace(/\band\b/g, '&') },
|
||||
];
|
||||
|
||||
for (const pattern of patterns) {
|
||||
if (pattern.match(s1, s2)) return 0.9;
|
||||
}
|
||||
|
||||
// Levenshtein distance for similar words
|
||||
const distance = levenshteinDistance(s1, s2);
|
||||
const maxLength = Math.max(s1.length, s2.length);
|
||||
const similarity = 1 - (distance / maxLength);
|
||||
|
||||
return similarity > 0.8 ? similarity : 0;
|
||||
};
|
||||
|
||||
// Simple Levenshtein distance implementation
|
||||
const levenshteinDistance = (str1: string, str2: string): number => {
|
||||
const matrix = Array(str2.length + 1).fill(null).map(() => Array(str1.length + 1).fill(null));
|
||||
|
||||
for (let i = 0; i <= str1.length; i++) matrix[0][i] = i;
|
||||
for (let j = 0; j <= str2.length; j++) matrix[j][0] = j;
|
||||
|
||||
for (let j = 1; j <= str2.length; j++) {
|
||||
for (let i = 1; i <= str1.length; i++) {
|
||||
const indicator = str1[i - 1] === str2[j - 1] ? 0 : 1;
|
||||
matrix[j][i] = Math.min(
|
||||
matrix[j][i - 1] + 1,
|
||||
matrix[j - 1][i] + 1,
|
||||
matrix[j - 1][i - 1] + indicator
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
return matrix[str2.length][str1.length];
|
||||
};
|
||||
|
||||
// Find similar tags
|
||||
const processedTags = new Set<string>();
|
||||
|
||||
for (let i = 0; i < tags.length; i++) {
|
||||
if (processedTags.has(tags[i].id)) continue;
|
||||
|
||||
const similarTags = [tags[i]];
|
||||
processedTags.add(tags[i].id);
|
||||
|
||||
for (let j = i + 1; j < tags.length; j++) {
|
||||
if (processedTags.has(tags[j].id)) continue;
|
||||
|
||||
const similarity = calculateSimilarity(tags[i].name, tags[j].name);
|
||||
if (similarity > 0.8) {
|
||||
similarTags.push(tags[j]);
|
||||
processedTags.add(tags[j].id);
|
||||
}
|
||||
}
|
||||
|
||||
if (similarTags.length > 1) {
|
||||
const maxSimilarity = Math.max(...similarTags.slice(1).map(tag =>
|
||||
calculateSimilarity(similarTags[0].name, tag.name)
|
||||
));
|
||||
|
||||
let reason = 'Similar names detected';
|
||||
if (maxSimilarity === 0.9) {
|
||||
reason = 'Likely plural/singular or formatting variations';
|
||||
} else if (maxSimilarity > 0.95) {
|
||||
reason = 'Very similar names, possible duplicates';
|
||||
}
|
||||
|
||||
suggestions.push({
|
||||
group: similarTags,
|
||||
similarity: maxSimilarity,
|
||||
reason
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
// Sort by similarity descending
|
||||
suggestions.sort((a, b) => b.similarity - a.similarity);
|
||||
|
||||
setMergeSuggestions(suggestions);
|
||||
setIsMergeSuggestionsModalOpen(true);
|
||||
};
|
||||
|
||||
const handleMergeSelected = () => {
|
||||
if (selectedTagIds.size < 2) {
|
||||
alert('Please select at least 2 tags to merge');
|
||||
return;
|
||||
}
|
||||
setIsMergeModalOpen(true);
|
||||
};
|
||||
|
||||
const handleMergePreview = async () => {
|
||||
if (!mergeTargetTagId || selectedTagIds.size < 2) return;
|
||||
|
||||
try {
|
||||
const sourceTagIds = Array.from(selectedTagIds).filter(id => id !== mergeTargetTagId);
|
||||
const preview = await tagApi.previewMerge(sourceTagIds, mergeTargetTagId);
|
||||
setMergePreview(preview);
|
||||
} catch (error) {
|
||||
console.error('Failed to preview merge:', error);
|
||||
alert('Failed to preview merge');
|
||||
}
|
||||
};
|
||||
|
||||
const handleConfirmMerge = async () => {
|
||||
if (!mergeTargetTagId || selectedTagIds.size < 2) return;
|
||||
|
||||
try {
|
||||
setMerging(true);
|
||||
const sourceTagIds = Array.from(selectedTagIds).filter(id => id !== mergeTargetTagId);
|
||||
await tagApi.mergeTags(sourceTagIds, mergeTargetTagId);
|
||||
|
||||
// Reload tags and reset state
|
||||
await loadTags();
|
||||
setSelectedTagIds(new Set());
|
||||
setMergeTargetTagId('');
|
||||
setMergePreview(null);
|
||||
setIsMergeModalOpen(false);
|
||||
} catch (error) {
|
||||
console.error('Failed to merge tags:', error);
|
||||
alert('Failed to merge tags');
|
||||
} finally {
|
||||
setMerging(false);
|
||||
}
|
||||
};
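Only the fields actually read in this file are certain about the merge API consumed by handleMergePreview and handleConfirmMerge above, so the interfaces below are a best-effort reconstruction of the client-side contract rather than the backend schema.

```typescript
// Inferred from usage above (handleMergePreview, handleConfirmMerge, and the preview JSX);
// fields beyond the ones referenced in this file are not guaranteed.
interface TagMergePreview {
  targetTagName: string;
  totalResultStoryCount: number;
  aliasesToCreate?: string[];
}

interface TagMergeApi {
  previewMerge(sourceTagIds: string[], targetTagId: string): Promise<TagMergePreview>;
  mergeTags(sourceTagIds: string[], targetTagId: string): Promise<void>;
}
```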
|
||||
|
||||
// Filter and sort tags
|
||||
const filteredTags = tags
|
||||
.filter(tag =>
|
||||
tag.name.toLowerCase().includes(searchQuery.toLowerCase()) ||
|
||||
(tag.description && tag.description.toLowerCase().includes(searchQuery.toLowerCase()))
|
||||
)
|
||||
.sort((a, b) => {
|
||||
let aValue, bValue;
|
||||
|
||||
switch (sortBy) {
|
||||
case 'name':
|
||||
aValue = a.name.toLowerCase();
|
||||
bValue = b.name.toLowerCase();
|
||||
break;
|
||||
case 'storyCount':
|
||||
aValue = a.storyCount || 0;
|
||||
bValue = b.storyCount || 0;
|
||||
break;
|
||||
case 'createdAt':
|
||||
aValue = new Date(a.createdAt || 0).getTime();
|
||||
bValue = new Date(b.createdAt || 0).getTime();
|
||||
break;
|
||||
default:
|
||||
return 0;
|
||||
}
|
||||
|
||||
if (sortDirection === 'asc') {
|
||||
return aValue < bValue ? -1 : aValue > bValue ? 1 : 0;
|
||||
} else {
|
||||
return aValue > bValue ? -1 : aValue < bValue ? 1 : 0;
|
||||
}
|
||||
});
|
||||
|
||||
const getSortIcon = (column: typeof sortBy) => {
|
||||
if (sortBy !== column) return '↕️';
|
||||
return sortDirection === 'asc' ? '↑' : '↓';
|
||||
};
|
||||
|
||||
const tagStats = {
|
||||
total: tags.length,
|
||||
withColors: tags.filter(tag => tag.color).length,
|
||||
withDescriptions: tags.filter(tag => tag.description).length,
|
||||
withAliases: tags.filter(tag => tag.aliasCount && tag.aliasCount > 0).length,
|
||||
unused: tags.filter(tag => !tag.storyCount || tag.storyCount === 0).length
|
||||
};
|
||||
|
||||
if (loading) {
|
||||
return (
|
||||
<AppLayout>
|
||||
<div className="flex items-center justify-center py-20">
|
||||
<LoadingSpinner size="lg" />
|
||||
</div>
|
||||
</AppLayout>
|
||||
);
|
||||
}
|
||||
|
||||
return (
|
||||
<AppLayout>
|
||||
<div className="max-w-6xl mx-auto space-y-6">
|
||||
{/* Header */}
|
||||
<div className="flex justify-between items-start">
|
||||
<div>
|
||||
<h1 className="text-3xl font-bold theme-header">Tag Maintenance</h1>
|
||||
<p className="theme-text mt-2">
|
||||
Manage tag colors, descriptions, and aliases for better organization
|
||||
</p>
|
||||
</div>
|
||||
<div className="flex gap-3">
|
||||
<Button href="/settings" variant="ghost">
|
||||
← Back to Settings
|
||||
</Button>
|
||||
<Button onClick={handleCreateTag} variant="primary">
|
||||
+ Create Tag
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Statistics */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<h2 className="text-lg font-semibold theme-header mb-4">Tag Statistics</h2>
|
||||
<div className="grid grid-cols-2 md:grid-cols-5 gap-4 text-center">
|
||||
<div>
|
||||
<div className="text-2xl font-bold theme-accent">{tagStats.total}</div>
|
||||
<div className="text-sm theme-text">Total Tags</div>
|
||||
</div>
|
||||
<div>
|
||||
<div className="text-2xl font-bold text-blue-600">{tagStats.withColors}</div>
|
||||
<div className="text-sm theme-text">With Colors</div>
|
||||
</div>
|
||||
<div>
|
||||
<div className="text-2xl font-bold text-green-600">{tagStats.withDescriptions}</div>
|
||||
<div className="text-sm theme-text">With Descriptions</div>
|
||||
</div>
|
||||
<div>
|
||||
<div className="text-2xl font-bold text-purple-600">{tagStats.withAliases}</div>
|
||||
<div className="text-sm theme-text">With Aliases</div>
|
||||
</div>
|
||||
<div>
|
||||
<div className="text-2xl font-bold text-gray-500">{tagStats.unused}</div>
|
||||
<div className="text-sm theme-text">Unused</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Controls */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<div className="flex flex-col md:flex-row gap-4 items-center">
|
||||
<div className="flex-1">
|
||||
<Input
|
||||
type="search"
|
||||
placeholder="Search tags by name or description..."
|
||||
value={searchQuery}
|
||||
onChange={(e) => setSearchQuery(e.target.value)}
|
||||
className="w-full"
|
||||
/>
|
||||
</div>
|
||||
<div className="flex gap-2">
|
||||
<button
|
||||
onClick={() => handleSortChange('name')}
|
||||
className="px-3 py-2 text-sm border theme-border rounded-lg theme-card theme-text hover:theme-accent transition-colors"
|
||||
>
|
||||
Name {getSortIcon('name')}
|
||||
</button>
|
||||
<button
|
||||
onClick={() => handleSortChange('storyCount')}
|
||||
className="px-3 py-2 text-sm border theme-border rounded-lg theme-card theme-text hover:theme-accent transition-colors"
|
||||
>
|
||||
Usage {getSortIcon('storyCount')}
|
||||
</button>
|
||||
<button
|
||||
onClick={() => handleSortChange('createdAt')}
|
||||
className="px-3 py-2 text-sm border theme-border rounded-lg theme-card theme-text hover:theme-accent transition-colors"
|
||||
>
|
||||
Date {getSortIcon('createdAt')}
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Tags List */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<div className="flex justify-between items-center mb-4">
|
||||
<h2 className="text-lg font-semibold theme-header">
|
||||
Tags ({filteredTags.length})
|
||||
</h2>
|
||||
<div className="flex gap-2">
|
||||
<Button
|
||||
variant="secondary"
|
||||
size="sm"
|
||||
onClick={generateMergeSuggestions}
|
||||
>
|
||||
🔍 Merge Suggestions
|
||||
</Button>
|
||||
{selectedTagIds.size > 0 && (
|
||||
<>
|
||||
<Button
|
||||
variant="ghost"
|
||||
size="sm"
|
||||
onClick={() => setSelectedTagIds(new Set())}
|
||||
>
|
||||
Clear Selection ({selectedTagIds.size})
|
||||
</Button>
|
||||
<Button
|
||||
variant="danger"
|
||||
size="sm"
|
||||
onClick={handleDeleteSelected}
|
||||
>
|
||||
🗑️ Delete Selected
|
||||
</Button>
|
||||
<Button
|
||||
variant="primary"
|
||||
size="sm"
|
||||
onClick={handleMergeSelected}
|
||||
disabled={selectedTagIds.size < 2}
|
||||
>
|
||||
Merge Selected
|
||||
</Button>
|
||||
</>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{filteredTags.length > 0 && (
|
||||
<div className="mb-4 flex items-center gap-4">
|
||||
<div className="flex items-center gap-2">
|
||||
<input
|
||||
type="checkbox"
|
||||
checked={filteredTags.length > 0 && selectedTagIds.size === filteredTags.length}
|
||||
onChange={(e) => handleSelectAll(e.target.checked)}
|
||||
className="rounded"
|
||||
/>
|
||||
<label className="text-sm theme-text">Select All</label>
|
||||
</div>
|
||||
<Button
|
||||
variant="ghost"
|
||||
size="sm"
|
||||
onClick={handleSelectUnused}
|
||||
disabled={tagStats.unused === 0}
|
||||
>
|
||||
Select Unused ({tagStats.unused})
|
||||
</Button>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{filteredTags.length === 0 ? (
|
||||
<div className="text-center py-12">
|
||||
<p className="theme-text text-lg mb-4">
|
||||
{searchQuery ? 'No tags match your search.' : 'No tags found.'}
|
||||
</p>
|
||||
{!searchQuery && (
|
||||
<Button onClick={handleCreateTag} variant="primary">
|
||||
Create Your First Tag
|
||||
</Button>
|
||||
)}
|
||||
</div>
|
||||
) : (
|
||||
<div className="space-y-3">
|
||||
{filteredTags.map((tag) => (
|
||||
<div
|
||||
key={tag.id}
|
||||
className="flex items-center justify-between p-4 border theme-border rounded-lg hover:bg-gray-50 dark:hover:bg-gray-800 transition-colors"
|
||||
>
|
||||
<div className="flex items-center gap-4 min-w-0 flex-1">
|
||||
<input
|
||||
type="checkbox"
|
||||
checked={selectedTagIds.has(tag.id)}
|
||||
onChange={(e) => handleTagSelection(tag.id, e.target.checked)}
|
||||
className="rounded"
|
||||
/>
|
||||
<TagDisplay
|
||||
tag={tag}
|
||||
size="md"
|
||||
showAliasesTooltip={true}
|
||||
clickable={false}
|
||||
/>
|
||||
<div className="min-w-0 flex-1">
|
||||
{tag.description && (
|
||||
<p className="text-sm theme-text-muted mt-1 truncate">
|
||||
{tag.description}
|
||||
</p>
|
||||
)}
|
||||
<div className="flex gap-4 text-xs theme-text-muted mt-1">
|
||||
<a
|
||||
href={`/library?tags=${encodeURIComponent(tag.name)}`}
|
||||
className="hover:theme-accent hover:underline cursor-pointer"
|
||||
title={`View ${tag.storyCount || 0} stories with tag "${tag.name}"`}
|
||||
>
|
||||
{tag.storyCount || 0} stories
|
||||
</a>
|
||||
{tag.aliasCount && tag.aliasCount > 0 && (
|
||||
<span>{tag.aliasCount} aliases</span>
|
||||
)}
|
||||
{tag.createdAt && (
|
||||
<span>Created {new Date(tag.createdAt).toLocaleDateString()}</span>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div className="flex gap-2 ml-4">
|
||||
<Button
|
||||
variant="ghost"
|
||||
size="sm"
|
||||
onClick={() => handleEditTag(tag)}
|
||||
>
|
||||
Edit
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Edit Modal */}
|
||||
<TagEditModal
|
||||
tag={selectedTag || undefined}
|
||||
isOpen={isEditModalOpen}
|
||||
onClose={() => {
|
||||
setIsEditModalOpen(false);
|
||||
setSelectedTag(null);
|
||||
}}
|
||||
onSave={handleTagSave}
|
||||
onDelete={handleTagDelete}
|
||||
/>
|
||||
|
||||
{/* Create Modal */}
|
||||
<TagEditModal
|
||||
tag={undefined}
|
||||
isOpen={isCreateModalOpen}
|
||||
onClose={() => setIsCreateModalOpen(false)}
|
||||
onSave={handleTagSave}
|
||||
/>
|
||||
|
||||
{/* Merge Modal */}
|
||||
{isMergeModalOpen && (
|
||||
<div className="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50">
|
||||
<div className="bg-white dark:bg-gray-800 rounded-lg p-6 max-w-2xl w-full mx-4 max-h-[80vh] overflow-y-auto">
|
||||
<h2 className="text-2xl font-bold theme-header mb-4">Merge Tags</h2>
|
||||
|
||||
<div className="space-y-4">
|
||||
<div>
|
||||
<p className="theme-text mb-2">
|
||||
You have selected {selectedTagIds.size} tags to merge.
|
||||
</p>
|
||||
<p className="text-sm theme-text-muted mb-4">
|
||||
Choose which tag should become the canonical name. All other tags will become aliases.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
{/* Target Tag Selection */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-text mb-2">
|
||||
Canonical Tag (keep this name):
|
||||
</label>
|
||||
<select
|
||||
value={mergeTargetTagId}
|
||||
onChange={(e) => {
|
||||
setMergeTargetTagId(e.target.value);
|
||||
setMergePreview(null);
|
||||
}}
|
||||
className="w-full p-2 border theme-border rounded-lg theme-card theme-text"
|
||||
>
|
||||
<option value="">Select canonical tag...</option>
|
||||
{Array.from(selectedTagIds).map(tagId => {
|
||||
const tag = tags.find(t => t.id === tagId);
|
||||
return tag ? (
|
||||
<option key={tagId} value={tagId}>
|
||||
{tag.name} ({tag.storyCount || 0} stories)
|
||||
</option>
|
||||
) : null;
|
||||
})}
|
||||
</select>
|
||||
</div>
|
||||
|
||||
{/* Preview Button */}
|
||||
{mergeTargetTagId && (
|
||||
<Button
|
||||
onClick={handleMergePreview}
|
||||
variant="secondary"
|
||||
className="w-full"
|
||||
>
|
||||
Preview Merge
|
||||
</Button>
|
||||
)}
|
||||
|
||||
{/* Merge Preview */}
|
||||
{mergePreview && (
|
||||
<div className="bg-blue-50 dark:bg-blue-900/20 border border-blue-200 dark:border-blue-700 rounded-lg p-4">
|
||||
<h3 className="font-medium theme-header mb-2">Merge Preview</h3>
|
||||
<div className="space-y-2 text-sm theme-text">
|
||||
<p>
|
||||
<strong>Result:</strong> "{mergePreview.targetTagName}" with {mergePreview.totalResultStoryCount} stories
|
||||
</p>
|
||||
{mergePreview.aliasesToCreate && mergePreview.aliasesToCreate.length > 0 && (
|
||||
<div>
|
||||
<strong>Aliases to create:</strong>
|
||||
<ul className="ml-4 mt-1 list-disc">
|
||||
{mergePreview.aliasesToCreate.map((alias: string) => (
|
||||
<li key={alias}>{alias}</li>
|
||||
))}
|
||||
</ul>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Actions */}
|
||||
<div className="flex gap-3 pt-4">
|
||||
<Button
|
||||
onClick={() => {
|
||||
setIsMergeModalOpen(false);
|
||||
setMergeTargetTagId('');
|
||||
setMergePreview(null);
|
||||
}}
|
||||
variant="ghost"
|
||||
className="flex-1"
|
||||
>
|
||||
Cancel
|
||||
</Button>
|
||||
<Button
|
||||
onClick={handleConfirmMerge}
|
||||
variant="primary"
|
||||
disabled={!mergeTargetTagId || merging}
|
||||
className="flex-1"
|
||||
>
|
||||
{merging ? 'Merging...' : 'Confirm Merge'}
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Merge Suggestions Modal */}
|
||||
{isMergeSuggestionsModalOpen && (
|
||||
<div className="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50">
|
||||
<div className="bg-white dark:bg-gray-800 rounded-lg p-6 max-w-4xl w-full mx-4 max-h-[80vh] overflow-y-auto">
|
||||
<h2 className="text-2xl font-bold theme-header mb-4">Merge Suggestions</h2>
|
||||
|
||||
<div className="space-y-4">
|
||||
<p className="theme-text">
|
||||
Found {mergeSuggestions.length} potential merge opportunities based on similar tag names.
|
||||
</p>
|
||||
|
||||
{mergeSuggestions.length === 0 ? (
|
||||
<div className="text-center py-8">
|
||||
<p className="theme-text text-lg">No similar tags found.</p>
|
||||
<p className="theme-text-muted text-sm mt-2">
|
||||
All your tags appear to have unique names.
|
||||
</p>
|
||||
</div>
|
||||
) : (
|
||||
<div className="space-y-4">
|
||||
{mergeSuggestions.map((suggestion, index) => (
|
||||
<div
|
||||
key={index}
|
||||
className="border theme-border rounded-lg p-4 bg-yellow-50 dark:bg-yellow-900/20"
|
||||
>
|
||||
<div className="flex justify-between items-start mb-3">
|
||||
<div>
|
||||
<h3 className="font-medium theme-header">
|
||||
Suggestion {index + 1}
|
||||
</h3>
|
||||
<p className="text-sm theme-text-muted">
|
||||
{suggestion.reason} (Similarity: {(suggestion.similarity * 100).toFixed(1)}%)
|
||||
</p>
|
||||
</div>
|
||||
<Button
|
||||
variant="primary"
|
||||
size="sm"
|
||||
onClick={() => {
|
||||
// Pre-select these tags for merging and go directly to merge modal
|
||||
const suggestedTagIds = new Set(suggestion.group.map(tag => tag.id));
|
||||
setSelectedTagIds(suggestedTagIds);
|
||||
setIsMergeSuggestionsModalOpen(false);
|
||||
|
||||
// Open merge modal directly
|
||||
setIsMergeModalOpen(true);
|
||||
setMergeTargetTagId('');
|
||||
setMergePreview(null);
|
||||
}}
|
||||
>
|
||||
Merge These
|
||||
</Button>
|
||||
</div>
|
||||
|
||||
<div className="flex flex-wrap gap-2">
|
||||
{suggestion.group.map((tag, tagIndex) => (
|
||||
<div key={tag.id} className="flex items-center gap-2">
|
||||
<TagDisplay
|
||||
tag={tag}
|
||||
size="sm"
|
||||
showAliasesTooltip={true}
|
||||
clickable={false}
|
||||
/>
|
||||
<span className="text-xs theme-text-muted">
|
||||
({tag.storyCount || 0} stories)
|
||||
</span>
|
||||
{tagIndex < suggestion.group.length - 1 && (
|
||||
<span className="text-gray-400">→</span>
|
||||
)}
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Actions */}
|
||||
<div className="flex gap-3 pt-4 border-t theme-border">
|
||||
<Button
|
||||
onClick={() => setIsMergeSuggestionsModalOpen(false)}
|
||||
variant="ghost"
|
||||
className="flex-1"
|
||||
>
|
||||
Close
|
||||
</Button>
|
||||
{mergeSuggestions.length > 0 && (
|
||||
<Button
|
||||
onClick={() => {
|
||||
// Select all suggested tags for batch processing
|
||||
const allSuggestedTagIds = new Set<string>();
|
||||
mergeSuggestions.forEach(suggestion => {
|
||||
suggestion.group.forEach(tag => allSuggestedTagIds.add(tag.id));
|
||||
});
|
||||
setSelectedTagIds(allSuggestedTagIds);
|
||||
setIsMergeSuggestionsModalOpen(false);
|
||||
}}
|
||||
variant="secondary"
|
||||
className="flex-1"
|
||||
>
|
||||
Select All Suggested ({mergeSuggestions.reduce((acc, s) => acc + s.group.length, 0)} tags)
|
||||
</Button>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</AppLayout>
|
||||
);
|
||||
}
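As a standalone reference, the grouping in generateMergeSuggestions above boils down to a few pattern shortcuts (plural/singular, separator differences, "and" vs "&") that score 0.9, plus a normalized Levenshtein ratio with a 0.8 cutoff. A minimal sketch of just the Levenshtein part, with illustrative sample values:

```typescript
// Minimal sketch of the similarity scoring used by generateMergeSuggestions above.
function levenshtein(a: string, b: string): number {
  // DP matrix: first row/column hold edit distances against the empty string
  const m = Array.from({ length: b.length + 1 }, (_, j) =>
    Array.from({ length: a.length + 1 }, (_, i) => (j === 0 ? i : i === 0 ? j : 0))
  );
  for (let j = 1; j <= b.length; j++) {
    for (let i = 1; i <= a.length; i++) {
      const cost = a[i - 1] === b[j - 1] ? 0 : 1;
      m[j][i] = Math.min(m[j][i - 1] + 1, m[j - 1][i] + 1, m[j - 1][i - 1] + cost);
    }
  }
  return m[b.length][a.length];
}

function nameSimilarity(a: string, b: string): number {
  const s1 = a.toLowerCase();
  const s2 = b.toLowerCase();
  if (s1 === s2) return 1.0;
  const ratio = 1 - levenshtein(s1, s2) / Math.max(s1.length, s2.length);
  return ratio > 0.8 ? ratio : 0; // below the 0.8 cutoff the pair is never suggested
}

// nameSimilarity('dragons', 'dragon') ≈ 0.857 → grouped; nameSimilarity('romance', 'horror') === 0
```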
|
||||
@@ -9,6 +9,8 @@ import { Story, Collection } from '../../../../types/api';
|
||||
import AppLayout from '../../../../components/layout/AppLayout';
|
||||
import Button from '../../../../components/ui/Button';
|
||||
import LoadingSpinner from '../../../../components/ui/LoadingSpinner';
|
||||
import TagDisplay from '../../../../components/tags/TagDisplay';
|
||||
import TableOfContents from '../../../../components/stories/TableOfContents';
|
||||
import { calculateReadingTime } from '../../../../lib/settings';
|
||||
|
||||
export default function StoryDetailPage() {
|
||||
@@ -365,18 +367,27 @@ export default function StoryDetailPage() {
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Table of Contents */}
|
||||
<TableOfContents
|
||||
htmlContent={story.contentHtml || ''}
|
||||
onItemClick={(item) => {
|
||||
// Scroll to the story reading view with the specific heading
|
||||
window.location.href = `/stories/${story.id}#${item.id}`;
|
||||
}}
|
||||
/>
|
||||
|
||||
{/* Tags */}
|
||||
{story.tags && story.tags.length > 0 && (
|
||||
<div className="theme-card theme-shadow rounded-lg p-4">
|
||||
<h3 className="font-semibold theme-header mb-3">Tags</h3>
|
||||
<div className="flex flex-wrap gap-2">
|
||||
{story.tags.map((tag) => (
|
||||
<span
|
||||
<TagDisplay
|
||||
key={tag.id}
|
||||
className="px-3 py-1 text-sm rounded-full theme-accent-bg text-white"
|
||||
>
|
||||
{tag.name}
|
||||
</span>
|
||||
tag={tag}
|
||||
size="md"
|
||||
clickable={false}
|
||||
/>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
@@ -6,9 +6,11 @@ import AppLayout from '../../../../components/layout/AppLayout';
|
||||
import { Input, Textarea } from '../../../../components/ui/Input';
|
||||
import Button from '../../../../components/ui/Button';
|
||||
import TagInput from '../../../../components/stories/TagInput';
|
||||
import TagSuggestions from '../../../../components/tags/TagSuggestions';
|
||||
import RichTextEditor from '../../../../components/stories/RichTextEditor';
|
||||
import ImageUpload from '../../../../components/ui/ImageUpload';
|
||||
import AuthorSelector from '../../../../components/stories/AuthorSelector';
|
||||
import SeriesSelector from '../../../../components/stories/SeriesSelector';
|
||||
import LoadingSpinner from '../../../../components/ui/LoadingSpinner';
|
||||
import { storyApi } from '../../../../lib/api';
|
||||
import { Story } from '../../../../types/api';
|
||||
@@ -21,6 +23,7 @@ export default function EditStoryPage() {
|
||||
const [story, setStory] = useState<Story | null>(null);
|
||||
const [loading, setLoading] = useState(true);
|
||||
const [saving, setSaving] = useState(false);
|
||||
const [resetingPosition, setResetingPosition] = useState(false);
|
||||
const [errors, setErrors] = useState<Record<string, string>>({});
|
||||
|
||||
const [formData, setFormData] = useState({
|
||||
@@ -32,6 +35,7 @@ export default function EditStoryPage() {
|
||||
sourceUrl: '',
|
||||
tags: [] as string[],
|
||||
seriesName: '',
|
||||
seriesId: undefined as string | undefined,
|
||||
volume: '',
|
||||
});
|
||||
|
||||
@@ -54,6 +58,7 @@ export default function EditStoryPage() {
|
||||
sourceUrl: storyData.sourceUrl || '',
|
||||
tags: storyData.tags?.map(tag => tag.name) || [],
|
||||
seriesName: storyData.seriesName || '',
|
||||
seriesId: storyData.seriesId,
|
||||
volume: storyData.volume?.toString() || '',
|
||||
});
|
||||
} catch (error) {
|
||||
@@ -94,6 +99,15 @@ export default function EditStoryPage() {
|
||||
setFormData(prev => ({ ...prev, tags }));
|
||||
};
|
||||
|
||||
const handleAddSuggestedTag = (tagName: string) => {
|
||||
if (!formData.tags.includes(tagName.toLowerCase())) {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
tags: [...prev.tags, tagName.toLowerCase()]
|
||||
}));
|
||||
}
|
||||
};
|
||||
|
||||
const handleAuthorChange = (authorName: string, authorId?: string) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
@@ -107,6 +121,19 @@ export default function EditStoryPage() {
|
||||
}
|
||||
};
|
||||
|
||||
const handleSeriesChange = (seriesName: string, seriesId?: string) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
seriesName,
|
||||
seriesId: seriesId // This will be undefined if creating new series, which clears the existing ID
|
||||
}));
|
||||
|
||||
// Clear error when user changes series
|
||||
if (errors.seriesName) {
|
||||
setErrors(prev => ({ ...prev, seriesName: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const validateForm = () => {
|
||||
const newErrors: Record<string, string> = {};
|
||||
|
||||
@@ -150,8 +177,9 @@ export default function EditStoryPage() {
|
||||
summary: formData.summary || undefined,
|
||||
contentHtml: formData.contentHtml,
|
||||
sourceUrl: formData.sourceUrl || undefined,
|
||||
volume: formData.seriesName ? parseInt(formData.volume) : undefined,
|
||||
seriesName: formData.seriesName || undefined,
|
||||
volume: formData.seriesName && formData.volume ? parseInt(formData.volume) : undefined,
|
||||
// Send seriesId if we have it (existing series), otherwise send seriesName (new/changed series)
|
||||
...(formData.seriesId ? { seriesId: formData.seriesId } : { seriesName: formData.seriesName }),
|
||||
// Send authorId if we have it (existing author), otherwise send authorName (new/changed author)
|
||||
...(formData.authorId ? { authorId: formData.authorId } : { authorName: formData.authorName }),
|
||||
tagNames: formData.tags,
|
||||
@@ -174,6 +202,32 @@ export default function EditStoryPage() {
|
||||
}
|
||||
};
|
||||
|
||||
const handleResetReadingPosition = async () => {
|
||||
if (!story || !confirm('Are you sure you want to reset the reading position to the beginning? This will remove your current place in the story.')) {
|
||||
return;
|
||||
}
|
||||
|
||||
try {
|
||||
setResetingPosition(true);
|
||||
await storyApi.updateReadingProgress(storyId, 0);
|
||||
setStory(prev => prev ? { ...prev, readingPosition: 0 } : null);
|
||||
// Show success feedback
|
||||
setErrors({ resetSuccess: 'Reading position reset! The story will start from the beginning next time you read it.' });
|
||||
// Clear success message after 4 seconds
|
||||
setTimeout(() => {
|
||||
setErrors(prev => {
|
||||
const { resetSuccess, ...rest } = prev;
|
||||
return rest;
|
||||
});
|
||||
}, 4000);
|
||||
} catch (error) {
|
||||
console.error('Failed to reset reading position:', error);
|
||||
setErrors({ submit: 'Failed to reset reading position' });
|
||||
} finally {
|
||||
setResetingPosition(false);
|
||||
}
|
||||
};
|
||||
|
||||
const handleDelete = async () => {
|
||||
if (!story || !confirm('Are you sure you want to delete this story? This action cannot be undone.')) {
|
||||
return;
|
||||
@@ -288,6 +342,8 @@ export default function EditStoryPage() {
|
||||
onChange={handleContentChange}
|
||||
placeholder="Edit your story content here..."
|
||||
error={errors.contentHtml}
|
||||
storyId={storyId}
|
||||
enableImageProcessing={true}
|
||||
/>
|
||||
</div>
|
||||
|
||||
@@ -301,17 +357,28 @@ export default function EditStoryPage() {
|
||||
onChange={handleTagsChange}
|
||||
placeholder="Edit tags to categorize your story..."
|
||||
/>
|
||||
|
||||
{/* Tag Suggestions */}
|
||||
<TagSuggestions
|
||||
title={formData.title}
|
||||
content={formData.contentHtml}
|
||||
summary={formData.summary}
|
||||
currentTags={formData.tags}
|
||||
onAddTag={handleAddSuggestedTag}
|
||||
disabled={saving}
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Series and Volume */}
|
||||
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
|
||||
<div>
|
||||
<Input
|
||||
<SeriesSelector
|
||||
label="Series (optional)"
|
||||
value={formData.seriesName}
|
||||
onChange={handleInputChange('seriesName')}
|
||||
placeholder="Enter series name if part of a series"
|
||||
onChange={handleSeriesChange}
|
||||
placeholder="Select or enter series name if part of a series"
|
||||
error={errors.seriesName}
|
||||
authorId={formData.authorId}
|
||||
/>
|
||||
</div>
|
||||
|
||||
@@ -336,6 +403,38 @@ export default function EditStoryPage() {
|
||||
placeholder="https://example.com/original-story-url"
|
||||
/>
|
||||
|
||||
{/* Reading Position Reset Section */}
|
||||
<div className="theme-card p-4 rounded-lg border theme-border">
|
||||
<div className="flex items-center justify-between">
|
||||
<div>
|
||||
<h3 className="text-sm font-medium theme-header">Reading Position</h3>
|
||||
<p className="text-sm theme-text mt-1">
|
||||
{story?.readingPosition && story.readingPosition > 0
|
||||
? `Currently saved at position ${story.readingPosition.toLocaleString()}`
|
||||
: 'No reading position saved (story will start from the beginning)'
|
||||
}
|
||||
</p>
|
||||
</div>
|
||||
<Button
|
||||
type="button"
|
||||
variant="ghost"
|
||||
onClick={handleResetReadingPosition}
|
||||
loading={resetingPosition}
|
||||
disabled={saving || !story?.readingPosition || story.readingPosition === 0}
|
||||
className="text-orange-600 hover:text-orange-700 dark:text-orange-400 dark:hover:text-orange-300"
|
||||
>
|
||||
Reset to Beginning
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Success Message */}
|
||||
{errors.resetSuccess && (
|
||||
<div className="p-4 bg-green-50 dark:bg-green-900/20 border border-green-200 dark:border-green-800 rounded-lg">
|
||||
<p className="text-green-800 dark:text-green-200">{errors.resetSuccess}</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Submit Error */}
|
||||
{errors.submit && (
|
||||
<div className="p-4 bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-lg">
|
||||
|
||||
@@ -8,6 +8,8 @@ import { Story } from '../../../types/api';
|
||||
import LoadingSpinner from '../../../components/ui/LoadingSpinner';
|
||||
import Button from '../../../components/ui/Button';
|
||||
import StoryRating from '../../../components/stories/StoryRating';
|
||||
import TagDisplay from '../../../components/tags/TagDisplay';
|
||||
import TableOfContents from '../../../components/stories/TableOfContents';
|
||||
import { sanitizeHtml, preloadSanitizationConfig } from '../../../lib/sanitization';
|
||||
|
||||
export default function StoryReadingPage() {
|
||||
@@ -18,8 +20,14 @@ export default function StoryReadingPage() {
|
||||
const [loading, setLoading] = useState(true);
|
||||
const [error, setError] = useState<string | null>(null);
|
||||
const [readingProgress, setReadingProgress] = useState(0);
|
||||
const [readingPercentage, setReadingPercentage] = useState(0);
|
||||
const [sanitizedContent, setSanitizedContent] = useState<string>('');
|
||||
const [hasScrolledToPosition, setHasScrolledToPosition] = useState(false);
|
||||
const [showToc, setShowToc] = useState(false);
|
||||
const [hasHeadings, setHasHeadings] = useState(false);
|
||||
const [showEndOfStoryPopup, setShowEndOfStoryPopup] = useState(false);
|
||||
const [hasReachedEnd, setHasReachedEnd] = useState(false);
|
||||
const [resettingPosition, setResettingPosition] = useState(false);
|
||||
const contentRef = useRef<HTMLDivElement>(null);
|
||||
const saveTimeoutRef = useRef<NodeJS.Timeout | null>(null);
|
||||
|
||||
@@ -41,15 +49,25 @@ export default function StoryReadingPage() {
|
||||
));
|
||||
|
||||
// Convert to character position in the plain text content
|
||||
const textLength = story.contentPlain?.length || story.contentHtml.length;
|
||||
const textLength = story.contentPlain?.length || story.contentHtml?.length || 0;
|
||||
return Math.floor(scrollRatio * textLength);
|
||||
}, [story]);
|
||||
|
||||
// Calculate reading percentage from character position
|
||||
const calculateReadingPercentage = useCallback((currentPosition: number): number => {
|
||||
if (!story) return 0;
|
||||
|
||||
const totalLength = story.contentPlain?.length || story.contentHtml?.length || 0;
|
||||
if (totalLength === 0) return 0;
|
||||
|
||||
return Math.round((currentPosition / totalLength) * 100);
|
||||
}, [story]);
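The two callbacks above map a scroll offset to a character position and back to a reading percentage. Pulled out of React, the arithmetic is roughly the following; the parameter names are illustrative, since the component derives these values from the DOM rather than taking them as arguments.

```typescript
// Standalone sketch of the arithmetic behind getCharacterPositionFromScroll and
// calculateReadingPercentage above; in the component the inputs come from the DOM.
function characterPositionFromScroll(scrollTop: number, maxScroll: number, textLength: number): number {
  const scrollRatio = maxScroll > 0 ? Math.min(1, Math.max(0, scrollTop / maxScroll)) : 0;
  return Math.floor(scrollRatio * textLength);
}

function readingPercentage(position: number, textLength: number): number {
  if (textLength === 0) return 0;
  return Math.round((position / textLength) * 100);
}

// Halfway down a 12,000-character story:
// characterPositionFromScroll(500, 1000, 12000) === 6000; readingPercentage(6000, 12000) === 50
```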
|
||||
|
||||
// Convert character position back to scroll position for auto-scroll
|
||||
const scrollToCharacterPosition = useCallback((position: number) => {
|
||||
if (!contentRef.current || !story || hasScrolledToPosition) return;
|
||||
|
||||
const textLength = story.contentPlain?.length || story.contentHtml.length;
|
||||
const textLength = story.contentPlain?.length || story.contentHtml?.length || 0;
|
||||
if (textLength === 0 || position === 0) return;
|
||||
|
||||
const ratio = position / textLength;
|
||||
@@ -111,9 +129,32 @@ export default function StoryReadingPage() {
|
||||
|
||||
setStory(storyData);
|
||||
|
||||
// Sanitize story content
|
||||
// Sanitize story content and add IDs to headings
|
||||
const sanitized = await sanitizeHtml(storyData.contentHtml || '');
|
||||
setSanitizedContent(sanitized);
|
||||
|
||||
// Add IDs to headings for TOC functionality using regex instead of DOMParser
|
||||
// This avoids potential browser-specific sanitization that might strip src attributes
|
||||
let processedContent = sanitized;
|
||||
const headingMatches = processedContent.match(/<h[1-6][^>]*>/gi);
|
||||
let headingCount = 0;
|
||||
|
||||
if (headingMatches) {
|
||||
processedContent = processedContent.replace(/<h([1-6])([^>]*)>/gi, (match, level, attrs) => {
|
||||
const headingId = `heading-${headingCount++}`;
|
||||
|
||||
// Check if id attribute already exists
|
||||
if (attrs.includes('id=')) {
|
||||
// Replace existing id
|
||||
return match.replace(/id=['"][^'"]*['"]/, `id="${headingId}"`);
|
||||
} else {
|
||||
// Add id attribute
|
||||
return `<h${level}${attrs} id="${headingId}">`;
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
setSanitizedContent(processedContent);
|
||||
setHasHeadings(headingCount > 0);
|
||||
|
||||
// Load series stories if part of a series
|
||||
if (storyData.seriesId) {
|
||||
@@ -133,25 +174,45 @@ export default function StoryReadingPage() {
|
||||
}
|
||||
}, [storyId]);
|
||||
|
||||
// Auto-scroll to saved reading position when story content is loaded
|
||||
// Auto-scroll to saved reading position or URL hash when story content is loaded
|
||||
useEffect(() => {
|
||||
if (story && sanitizedContent && !hasScrolledToPosition) {
|
||||
// Use a small delay to ensure content is rendered
|
||||
const timeout = setTimeout(() => {
|
||||
console.log('Initializing reading position tracking, saved position:', story.readingPosition);
|
||||
|
||||
// Check if there's a hash in the URL (for TOC navigation)
|
||||
const hash = window.location.hash.substring(1);
|
||||
if (hash && hash.startsWith('heading-')) {
|
||||
console.log('Auto-scrolling to heading from URL hash:', hash);
|
||||
const element = document.getElementById(hash);
|
||||
if (element) {
|
||||
element.scrollIntoView({
|
||||
behavior: 'smooth',
|
||||
block: 'start'
|
||||
});
|
||||
setHasScrolledToPosition(true);
|
||||
return;
|
||||
}
|
||||
}
|
||||
|
||||
// Otherwise, use saved reading position
|
||||
if (story.readingPosition && story.readingPosition > 0) {
|
||||
console.log('Auto-scrolling to saved position:', story.readingPosition);
|
||||
const initialPercentage = calculateReadingPercentage(story.readingPosition);
|
||||
setReadingPercentage(initialPercentage);
|
||||
scrollToCharacterPosition(story.readingPosition);
|
||||
} else {
|
||||
// Even if there's no saved position, mark as ready for tracking
|
||||
console.log('No saved position, starting fresh tracking');
|
||||
setReadingPercentage(0);
|
||||
setHasScrolledToPosition(true);
|
||||
}
|
||||
}, 500);
|
||||
|
||||
return () => clearTimeout(timeout);
|
||||
}
|
||||
- }, [story, sanitizedContent, scrollToCharacterPosition, hasScrolledToPosition]);
+ }, [story, sanitizedContent, scrollToCharacterPosition, calculateReadingPercentage, hasScrolledToPosition]);
|
||||
// Track reading progress and save position
|
||||
useEffect(() => {
|
||||
@@ -169,10 +230,40 @@ export default function StoryReadingPage() {
|
||||
|
||||
setReadingProgress(progress);
|
||||
|
||||
- // Save reading position (debounced)
+ // Multi-method end-of-story detection
const documentHeight = document.documentElement.scrollHeight;
|
||||
const windowBottom = scrolled + windowHeight;
|
||||
const distanceFromBottom = documentHeight - windowBottom;
|
||||
|
||||
// Method 1: Distance from bottom (most reliable)
|
||||
const nearBottom = distanceFromBottom <= 200;
|
||||
|
||||
// Method 2: High progress but only as secondary check
|
||||
const highProgress = progress >= 98;
|
||||
|
||||
// Method 3: Check if story content itself is fully visible
|
||||
const storyContentElement = contentRef.current;
|
||||
let storyContentFullyVisible = false;
|
||||
if (storyContentElement) {
|
||||
const contentRect = storyContentElement.getBoundingClientRect();
|
||||
const contentBottom = scrolled + contentRect.bottom;
|
||||
const documentContentHeight = Math.max(documentHeight - 300, contentBottom); // Account for footer padding
|
||||
storyContentFullyVisible = windowBottom >= documentContentHeight;
|
||||
}
|
||||
|
||||
// Trigger end detection if user is near bottom AND (has high progress OR story content is fully visible)
|
||||
if (nearBottom && (highProgress || storyContentFullyVisible) && !hasReachedEnd && hasScrolledToPosition) {
|
||||
console.log('End of story detected:', { nearBottom, highProgress, storyContentFullyVisible, distanceFromBottom, progress });
|
||||
setHasReachedEnd(true);
|
||||
setShowEndOfStoryPopup(true);
|
||||
}
|
||||
|
||||
// Save reading position and update percentage (debounced)
|
||||
if (hasScrolledToPosition) { // Only save after initial auto-scroll
|
||||
const characterPosition = getCharacterPositionFromScroll();
|
||||
- console.log('Scroll detected, character position:', characterPosition);
+ const percentage = calculateReadingPercentage(characterPosition);
+ console.log('Scroll detected, character position:', characterPosition, 'percentage:', percentage);
setReadingPercentage(percentage);
|
||||
debouncedSavePosition(characterPosition);
|
||||
} else {
|
||||
console.log('Scroll detected but not ready for tracking yet');
|
||||
@@ -188,7 +279,7 @@ export default function StoryReadingPage() {
|
||||
clearTimeout(saveTimeoutRef.current);
|
||||
}
|
||||
};
|
||||
- }, [story, hasScrolledToPosition, getCharacterPositionFromScroll, debouncedSavePosition]);
+ }, [story, hasScrolledToPosition, getCharacterPositionFromScroll, calculateReadingPercentage, debouncedSavePosition, hasReachedEnd]);
|
||||
const handleRatingUpdate = async (newRating: number) => {
|
||||
if (!story) return;
|
||||
@@ -201,6 +292,25 @@ export default function StoryReadingPage() {
|
||||
}
|
||||
};
|
||||
|
||||
const handleResetReadingPosition = async () => {
|
||||
if (!story) return;
|
||||
|
||||
try {
|
||||
setResettingPosition(true);
|
||||
await storyApi.updateReadingProgress(story.id, 0);
|
||||
setStory(prev => prev ? { ...prev, readingPosition: 0 } : null);
|
||||
setShowEndOfStoryPopup(false);
|
||||
setHasReachedEnd(false);
|
||||
|
||||
// DON'T scroll immediately - let user stay at current position
|
||||
// The reset will take effect when they next open the story
|
||||
} catch (error) {
|
||||
console.error('Failed to reset reading position:', error);
|
||||
} finally {
|
||||
setResettingPosition(false);
|
||||
}
|
||||
};
|
||||
|
||||
|
||||
const findNextStory = (): Story | null => {
|
||||
if (!story?.seriesId || seriesStories.length <= 1) return null;
|
||||
@@ -265,6 +375,21 @@ export default function StoryReadingPage() {
|
||||
</div>
|
||||
|
||||
<div className="flex items-center gap-4">
|
||||
{/* Reading percentage indicator */}
|
||||
<div className="text-sm theme-text font-mono bg-gray-100 dark:bg-gray-800 px-2 py-1 rounded">
|
||||
{readingPercentage}%
|
||||
</div>
|
||||
|
||||
{hasHeadings && (
|
||||
<button
|
||||
onClick={() => setShowToc(!showToc)}
|
||||
className="text-sm theme-text hover:theme-accent transition-colors"
|
||||
title="Table of Contents"
|
||||
>
|
||||
📋 TOC
|
||||
</button>
|
||||
)}
|
||||
|
||||
<StoryRating
|
||||
rating={story.rating || 0}
|
||||
onRatingChange={handleRatingUpdate}
|
||||
@@ -279,6 +404,76 @@ export default function StoryReadingPage() {
|
||||
</div>
|
||||
</header>
|
||||
|
||||
{/* Table of Contents Modal */}
|
||||
{showToc && (
|
||||
<>
|
||||
{/* Backdrop */}
|
||||
<div
|
||||
className="fixed inset-0 bg-black bg-opacity-50 z-50"
|
||||
onClick={() => setShowToc(false)}
|
||||
/>
|
||||
|
||||
{/* TOC Modal */}
|
||||
<div className="fixed top-20 right-4 left-4 md:left-auto md:w-80 max-h-96 z-50">
|
||||
<TableOfContents
|
||||
htmlContent={sanitizedContent}
|
||||
collapsible={false}
|
||||
onItemClick={(item) => {
|
||||
const element = document.getElementById(item.id);
|
||||
if (element) {
|
||||
element.scrollIntoView({
|
||||
behavior: 'smooth',
|
||||
block: 'start'
|
||||
});
|
||||
setShowToc(false); // Close TOC after navigation
|
||||
}
|
||||
}}
|
||||
/>
|
||||
</div>
|
||||
</>
|
||||
)}
|
||||
|
||||
{/* End of Story Popup */}
|
||||
{showEndOfStoryPopup && (
|
||||
<>
|
||||
{/* Backdrop */}
|
||||
<div
|
||||
className="fixed inset-0 bg-black bg-opacity-50 z-50"
|
||||
onClick={() => setShowEndOfStoryPopup(false)}
|
||||
/>
|
||||
|
||||
{/* Popup Modal */}
|
||||
<div className="fixed top-1/2 left-1/2 transform -translate-x-1/2 -translate-y-1/2 z-50 max-w-md w-full mx-4">
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<div className="text-center">
|
||||
<h3 className="text-lg font-semibold theme-header mb-3">
|
||||
🎉 Story Complete!
|
||||
</h3>
|
||||
<p className="theme-text mb-6">
|
||||
You've reached the end of "{story?.title}". Would you like to reset your reading position so the story starts from the beginning next time you open it?
|
||||
</p>
|
||||
|
||||
<div className="flex gap-3 justify-center">
|
||||
<Button
|
||||
variant="ghost"
|
||||
onClick={() => setShowEndOfStoryPopup(false)}
|
||||
>
|
||||
Keep Current Position
|
||||
</Button>
|
||||
<Button
|
||||
variant="primary"
|
||||
onClick={handleResetReadingPosition}
|
||||
loading={resettingPosition}
|
||||
>
|
||||
Reset for Next Time
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</>
|
||||
)}
|
||||
|
||||
{/* Story Content */}
|
||||
<main className="max-w-4xl mx-auto px-4 py-8">
|
||||
<article data-reading-content>
|
||||
@@ -314,12 +509,12 @@ export default function StoryReadingPage() {
|
||||
{story.tags && story.tags.length > 0 && (
|
||||
<div className="flex flex-wrap justify-center gap-2 mt-4">
|
||||
{story.tags.map((tag) => (
|
||||
- <span
+ <TagDisplay
key={tag.id}
- className="px-3 py-1 text-sm theme-accent-bg text-white rounded-full"
- >
- {tag.name}
- </span>
+ tag={tag}
+ size="md"
+ clickable={false}
+ />
))}
|
||||
</div>
|
||||
)}
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useEffect } from 'react';
|
||||
- import { searchApi, tagApi } from '../../lib/api';
+ import { searchApi, tagApi, getImageUrl } from '../../lib/api';
import { Story, Tag } from '../../types/api';
|
||||
import { Input } from '../ui/Input';
|
||||
import Button from '../ui/Button';
|
||||
@@ -239,7 +239,7 @@ export default function CollectionForm({
|
||||
{(coverImagePreview || initialData?.coverImagePath) && (
|
||||
<div className="w-20 h-24 rounded overflow-hidden bg-gray-100">
|
||||
<img
|
||||
- src={coverImagePreview || (initialData?.coverImagePath ? `/images/${initialData.coverImagePath}` : '')}
+ src={coverImagePreview || (initialData?.coverImagePath ? getImageUrl(initialData.coverImagePath) : '')}
alt="Cover preview"
|
||||
className="w-full h-full object-cover"
|
||||
/>
|
||||
|
||||
@@ -2,8 +2,9 @@
|
||||
|
||||
import { useState, useEffect, useRef, useCallback } from 'react';
|
||||
import { StoryWithCollectionContext } from '../../types/api';
|
||||
- import { storyApi } from '../../lib/api';
+ import { storyApi, getImageUrl } from '../../lib/api';
import Button from '../ui/Button';
|
||||
import TagDisplay from '../tags/TagDisplay';
|
||||
import Link from 'next/link';
|
||||
|
||||
interface CollectionReadingViewProps {
|
||||
@@ -19,6 +20,7 @@ export default function CollectionReadingView({
|
||||
}: CollectionReadingViewProps) {
|
||||
const { story, collection } = data;
|
||||
const [hasScrolledToPosition, setHasScrolledToPosition] = useState(false);
|
||||
const [readingPercentage, setReadingPercentage] = useState(0);
|
||||
const contentRef = useRef<HTMLDivElement>(null);
|
||||
const saveTimeoutRef = useRef<NodeJS.Timeout | null>(null);
|
||||
|
||||
@@ -38,15 +40,25 @@ export default function CollectionReadingView({
|
||||
));
|
||||
|
||||
// Convert to character position in the plain text content
|
||||
- const textLength = story.contentPlain?.length || story.contentHtml.length;
+ const textLength = story.contentPlain?.length || story.contentHtml?.length || 0;
return Math.floor(scrollRatio * textLength);
|
||||
}, [story]);
|
||||
|
||||
// Calculate reading percentage from character position
|
||||
const calculateReadingPercentage = useCallback((currentPosition: number): number => {
|
||||
if (!story) return 0;
|
||||
|
||||
const totalLength = story.contentPlain?.length || story.contentHtml?.length || 0;
|
||||
if (totalLength === 0) return 0;
|
||||
|
||||
return Math.round((currentPosition / totalLength) * 100);
|
||||
}, [story]);
|
||||
|
||||
// Convert character position back to scroll position for auto-scroll
|
||||
const scrollToCharacterPosition = useCallback((position: number) => {
|
||||
if (!contentRef.current || !story || hasScrolledToPosition) return;
|
||||
|
||||
- const textLength = story.contentPlain?.length || story.contentHtml.length;
+ const textLength = story.contentPlain?.length || story.contentHtml?.length || 0;
if (textLength === 0 || position === 0) return;
|
||||
|
||||
const ratio = position / textLength;
|
||||
@@ -101,23 +113,28 @@ export default function CollectionReadingView({
|
||||
console.log('Collection view - initializing reading position tracking, saved position:', story.readingPosition);
|
||||
if (story.readingPosition && story.readingPosition > 0) {
|
||||
console.log('Collection view - auto-scrolling to saved position:', story.readingPosition);
|
||||
const initialPercentage = calculateReadingPercentage(story.readingPosition);
|
||||
setReadingPercentage(initialPercentage);
|
||||
scrollToCharacterPosition(story.readingPosition);
|
||||
} else {
|
||||
console.log('Collection view - no saved position, starting fresh tracking');
|
||||
setReadingPercentage(0);
|
||||
setHasScrolledToPosition(true);
|
||||
}
|
||||
}, 500);
|
||||
|
||||
return () => clearTimeout(timeout);
|
||||
}
|
||||
- }, [story, scrollToCharacterPosition, hasScrolledToPosition]);
+ }, [story, scrollToCharacterPosition, calculateReadingPercentage, hasScrolledToPosition]);
|
||||
// Track reading progress and save position
|
||||
useEffect(() => {
|
||||
const handleScroll = () => {
|
||||
if (hasScrolledToPosition) {
|
||||
const characterPosition = getCharacterPositionFromScroll();
|
||||
- console.log('Collection view - scroll detected, character position:', characterPosition);
+ const percentage = calculateReadingPercentage(characterPosition);
+ console.log('Collection view - scroll detected, character position:', characterPosition, 'percentage:', percentage);
setReadingPercentage(percentage);
|
||||
debouncedSavePosition(characterPosition);
|
||||
} else {
|
||||
console.log('Collection view - scroll detected but not ready for tracking yet');
|
||||
@@ -131,7 +148,7 @@ export default function CollectionReadingView({
|
||||
clearTimeout(saveTimeoutRef.current);
|
||||
}
|
||||
};
|
||||
- }, [hasScrolledToPosition, getCharacterPositionFromScroll, debouncedSavePosition]);
+ }, [hasScrolledToPosition, getCharacterPositionFromScroll, calculateReadingPercentage, debouncedSavePosition]);
|
||||
const handlePrevious = () => {
|
||||
if (collection.previousStoryId) {
|
||||
@@ -189,6 +206,11 @@ export default function CollectionReadingView({
|
||||
|
||||
{/* Progress Bar */}
|
||||
<div className="flex items-center gap-4">
|
||||
{/* Reading percentage indicator */}
|
||||
<div className="text-sm text-blue-700 dark:text-blue-300 font-mono bg-blue-100 dark:bg-blue-900 px-2 py-1 rounded">
|
||||
{readingPercentage}%
|
||||
</div>
|
||||
|
||||
<div className="w-32 bg-blue-200 dark:bg-blue-800 rounded-full h-2">
|
||||
<div
|
||||
className="bg-blue-600 dark:bg-blue-400 h-2 rounded-full transition-all duration-300"
|
||||
@@ -211,7 +233,7 @@ export default function CollectionReadingView({
|
||||
{story.coverPath && (
|
||||
<div className="flex-shrink-0">
|
||||
<img
|
||||
- src={`/images/${story.coverPath}`}
+ src={getImageUrl(story.coverPath)}
alt={`${story.title} cover`}
|
||||
className="w-32 h-40 object-cover rounded-lg mx-auto md:mx-0"
|
||||
/>
|
||||
@@ -255,12 +277,12 @@ export default function CollectionReadingView({
|
||||
{story.tags && story.tags.length > 0 && (
|
||||
<div className="flex flex-wrap gap-2">
|
||||
{story.tags.map((tag) => (
|
||||
- <span
+ <TagDisplay
key={tag.id}
- className="inline-block px-2 py-1 text-xs rounded-full theme-accent-bg text-white"
- >
- {tag.name}
- </span>
+ tag={tag}
+ size="sm"
+ clickable={false}
+ />
))}
|
||||
</div>
|
||||
)}
|
||||
|
||||
@@ -17,12 +17,12 @@ export default function Header() {
|
||||
|
||||
const addStoryItems = [
|
||||
{
|
||||
- href: '/import',
+ href: '/add-story',
label: 'Manual Entry',
|
||||
description: 'Add a story by manually entering details'
|
||||
},
|
||||
{
|
||||
- href: '/import?mode=url',
+ href: '/import',
label: 'Import from URL',
|
||||
description: 'Import a single story from a website'
|
||||
},
|
||||
@@ -156,34 +156,16 @@ export default function Header() {
|
||||
<div className="px-2 py-1">
|
||||
<div className="font-medium theme-text mb-1">Add Story</div>
|
||||
<div className="pl-4 space-y-1">
|
||||
+ {addStoryItems.map((item) => (
<Link
- href="/import"
+ key={item.href}
+ href={item.href}
className="block theme-text hover:theme-accent transition-colors text-sm py-1"
onClick={() => setIsMenuOpen(false)}
>
- Manual Entry
- </Link>
- <Link
- href="/import?mode=url"
- className="block theme-text hover:theme-accent transition-colors text-sm py-1"
- onClick={() => setIsMenuOpen(false)}
- >
- Import from URL
- </Link>
- <Link
- href="/import/epub"
- className="block theme-text hover:theme-accent transition-colors text-sm py-1"
- onClick={() => setIsMenuOpen(false)}
- >
- Import EPUB
- </Link>
- <Link
- href="/import/bulk"
- className="block theme-text hover:theme-accent transition-colors text-sm py-1"
- onClick={() => setIsMenuOpen(false)}
- >
- Bulk Import
+ {item.label}
</Link>
+ ))}
</div>
|
||||
</div>
|
||||
<Link
|
||||
|
||||
@@ -22,13 +22,13 @@ const importTabs: ImportTab[] = [
|
||||
{
|
||||
id: 'manual',
|
||||
label: 'Manual Entry',
|
||||
- href: '/import',
+ href: '/add-story',
description: 'Add a story by manually entering details'
|
||||
},
|
||||
{
|
||||
id: 'url',
|
||||
label: 'Import from URL',
|
||||
- href: '/import?mode=url',
+ href: '/import',
description: 'Import a single story from a website'
|
||||
},
|
||||
{
|
||||
@@ -52,8 +52,10 @@ export default function ImportLayout({ children, title, description }: ImportLay
|
||||
|
||||
// Determine which tab is active
|
||||
const getActiveTab = () => {
|
||||
- if (pathname === '/import') {
-   return mode === 'url' ? 'url' : 'manual';
+ if (pathname === '/add-story') {
+   return 'manual';
+ } else if (pathname === '/import') {
+   return 'url';
} else if (pathname === '/import/epub') {
|
||||
return 'epub';
|
||||
} else if (pathname === '/import/bulk') {
|
||||
|
||||
frontend/src/components/library/AdvancedFilters.tsx (new file, 531 lines)
@@ -0,0 +1,531 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useEffect } from 'react';
|
||||
import type { AdvancedFilters, FilterPreset } from '../../types/api';
|
||||
import Button from '../ui/Button';
|
||||
import { Input } from '../ui/Input';
|
||||
|
||||
interface AdvancedFiltersProps {
|
||||
filters: AdvancedFilters;
|
||||
onChange: (filters: AdvancedFilters) => void;
|
||||
onReset: () => void;
|
||||
className?: string;
|
||||
}
|
||||
|
||||
// Predefined filter presets with both detailed controls and quick buttons
|
||||
const FILTER_PRESETS: FilterPreset[] = [
|
||||
// Length presets
|
||||
{
|
||||
id: 'short-stories',
|
||||
label: '< 5k words',
|
||||
description: 'Short stories under 5,000 words',
|
||||
filters: { maxWordCount: 5000 },
|
||||
category: 'length'
|
||||
},
|
||||
{
|
||||
id: 'medium-stories',
|
||||
label: '5k - 20k',
|
||||
description: 'Medium length stories (5k-20k words)',
|
||||
filters: { minWordCount: 5000, maxWordCount: 20000 },
|
||||
category: 'length'
|
||||
},
|
||||
{
|
||||
id: 'long-stories',
|
||||
label: '> 20k words',
|
||||
description: 'Long stories over 20,000 words',
|
||||
filters: { minWordCount: 20000 },
|
||||
category: 'length'
|
||||
},
|
||||
{
|
||||
id: 'very-long',
|
||||
label: '> 50k words',
|
||||
description: 'Very long stories over 50,000 words',
|
||||
filters: { minWordCount: 50000 },
|
||||
category: 'length'
|
||||
},
|
||||
|
||||
// Date presets
|
||||
{
|
||||
id: 'last-week',
|
||||
label: 'Last 7 days',
|
||||
description: 'Stories added in the last week',
|
||||
filters: { createdAfter: new Date(Date.now() - 7 * 24 * 60 * 60 * 1000).toISOString().split('T')[0] },
|
||||
category: 'date'
|
||||
},
|
||||
{
|
||||
id: 'last-month',
|
||||
label: 'Last 30 days',
|
||||
description: 'Stories added in the last month',
|
||||
filters: { createdAfter: new Date(Date.now() - 30 * 24 * 60 * 60 * 1000).toISOString().split('T')[0] },
|
||||
category: 'date'
|
||||
},
|
||||
{
|
||||
id: 'this-year',
|
||||
label: 'This year',
|
||||
description: 'Stories added this year',
|
||||
filters: { createdAfter: `${new Date().getFullYear()}-01-01` },
|
||||
category: 'date'
|
||||
},
|
||||
|
||||
// Reading status presets
|
||||
{
|
||||
id: 'unread',
|
||||
label: 'Unread',
|
||||
description: 'Stories you haven\'t read yet',
|
||||
filters: { readingStatus: 'unread' },
|
||||
category: 'reading'
|
||||
},
|
||||
{
|
||||
id: 'in-progress',
|
||||
label: 'Started',
|
||||
description: 'Stories you\'ve started reading',
|
||||
filters: { readingStatus: 'started' },
|
||||
category: 'reading'
|
||||
},
|
||||
{
|
||||
id: 'completed',
|
||||
label: 'Finished',
|
||||
description: 'Stories you\'ve completed',
|
||||
filters: { readingStatus: 'completed' },
|
||||
category: 'reading'
|
||||
},
|
||||
|
||||
// Rating presets
|
||||
{
|
||||
id: 'highly-rated',
|
||||
label: '4+ stars',
|
||||
description: 'Highly rated stories (4 stars or more)',
|
||||
filters: { minRating: 4 },
|
||||
category: 'rating'
|
||||
},
|
||||
{
|
||||
id: 'unrated',
|
||||
label: 'Unrated',
|
||||
description: 'Stories without ratings',
|
||||
filters: { unratedOnly: true },
|
||||
category: 'rating'
|
||||
},
|
||||
|
||||
// Content presets
|
||||
{
|
||||
id: 'with-covers',
|
||||
label: 'Has Cover',
|
||||
description: 'Stories with cover images',
|
||||
filters: { hasCoverImage: true },
|
||||
category: 'content'
|
||||
},
|
||||
{
|
||||
id: 'standalone',
|
||||
label: 'Standalone',
|
||||
description: 'Stories not part of a series',
|
||||
filters: { seriesFilter: 'standalone' },
|
||||
category: 'content'
|
||||
},
|
||||
{
|
||||
id: 'series-only',
|
||||
label: 'Series',
|
||||
description: 'Stories that are part of a series',
|
||||
filters: { seriesFilter: 'series' },
|
||||
category: 'content'
|
||||
}
|
||||
];
|
||||
|
||||
export default function AdvancedFilters({
|
||||
filters,
|
||||
onChange,
|
||||
onReset,
|
||||
className = ''
|
||||
}: AdvancedFiltersProps) {
|
||||
|
||||
// Prevent event bubbling when interacting with the component
|
||||
const handleContainerClick = (e: React.MouseEvent) => {
|
||||
e.stopPropagation();
|
||||
};
|
||||
|
||||
const handleKeyDown = (e: React.KeyboardEvent) => {
|
||||
// Prevent escape key from bubbling up (let parent handle it)
|
||||
e.stopPropagation();
|
||||
};
|
||||
const [expandedSections, setExpandedSections] = useState<Record<string, boolean>>({
|
||||
length: false,
|
||||
date: false,
|
||||
rating: false,
|
||||
reading: false,
|
||||
content: false
|
||||
});
|
||||
|
||||
// Helper functions
|
||||
const updateFilter = <K extends keyof AdvancedFilters>(
|
||||
key: K,
|
||||
value: AdvancedFilters[K]
|
||||
) => {
|
||||
onChange({ ...filters, [key]: value });
|
||||
};
|
||||
|
||||
const applyPreset = (preset: FilterPreset) => {
|
||||
onChange({ ...filters, ...preset.filters });
|
||||
};
|
||||
|
||||
const isPresetActive = (preset: FilterPreset) => {
|
||||
return Object.entries(preset.filters).every(([key, value]) =>
|
||||
filters[key as keyof AdvancedFilters] === value
|
||||
);
|
||||
};
|
||||
|
||||
const toggleSection = (section: string) => {
|
||||
setExpandedSections(prev => ({ ...prev, [section]: !prev[section] }));
|
||||
};
|
||||
|
||||
const hasActiveFilters = Object.values(filters).some(value =>
|
||||
value !== undefined && value !== '' && value !== 'all'
|
||||
);
|
||||
|
||||
// Group presets by category
|
||||
const presetsByCategory = FILTER_PRESETS.reduce((acc, preset) => {
|
||||
if (!acc[preset.category]) acc[preset.category] = [];
|
||||
acc[preset.category].push(preset);
|
||||
return acc;
|
||||
}, {} as Record<string, FilterPreset[]>);
|
||||
|
||||
return (
|
||||
<div
|
||||
className={`space-y-4 ${className}`}
|
||||
onClick={handleContainerClick}
|
||||
onKeyDown={handleKeyDown}
|
||||
>
|
||||
{/* Quick Filter Buttons */}
|
||||
<div className="space-y-3">
|
||||
<div className="flex items-center justify-between">
|
||||
<h4 className="font-medium theme-header text-sm">Quick Filters</h4>
|
||||
{hasActiveFilters && (
|
||||
<Button variant="ghost" size="sm" onClick={onReset}>
|
||||
Clear All
|
||||
</Button>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{Object.entries(presetsByCategory).map(([category, presets]) => (
|
||||
<div key={category} className="space-y-1">
|
||||
<div className="text-xs font-medium theme-text opacity-75 uppercase tracking-wide">
|
||||
{category.charAt(0).toUpperCase() + category.slice(1)}
|
||||
</div>
|
||||
<div className="flex flex-wrap gap-1">
|
||||
{presets.map(preset => (
|
||||
<button
|
||||
key={preset.id}
|
||||
onClick={() => applyPreset(preset)}
|
||||
className={`px-2 py-1 rounded text-xs font-medium transition-all hover:scale-105 ${
|
||||
isPresetActive(preset)
|
||||
? 'bg-blue-500 text-white'
|
||||
: 'bg-gray-100 dark:bg-gray-700 theme-text hover:bg-blue-100 dark:hover:bg-blue-900'
|
||||
}`}
|
||||
title={preset.description}
|
||||
>
|
||||
{preset.label}
|
||||
</button>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
|
||||
<div className="border-t theme-border pt-4">
|
||||
<h4 className="font-medium theme-header text-sm mb-3">Detailed Controls</h4>
|
||||
|
||||
{/* Word Count Section */}
|
||||
<div className="space-y-2 mb-4">
|
||||
<button
|
||||
onClick={() => toggleSection('length')}
|
||||
className="flex items-center gap-2 text-sm font-medium theme-text hover:theme-accent transition-colors"
|
||||
>
|
||||
<span className={`transform transition-transform ${expandedSections.length ? 'rotate-90' : ''}`}>
|
||||
▶
|
||||
</span>
|
||||
📏 Story Length
|
||||
{(filters.minWordCount || filters.maxWordCount) && (
|
||||
<span className="text-xs bg-blue-500 text-white px-1 rounded">●</span>
|
||||
)}
|
||||
</button>
|
||||
|
||||
{expandedSections.length && (
|
||||
<div className="pl-6 space-y-3 bg-gray-50 dark:bg-gray-800 p-3 rounded">
|
||||
<div className="space-y-3">
|
||||
<div>
|
||||
<label className="block text-xs theme-text mb-1">Min Words</label>
|
||||
<Input
|
||||
type="number"
|
||||
value={filters.minWordCount || ''}
|
||||
onChange={(e) => updateFilter('minWordCount', e.target.value ? parseInt(e.target.value) : undefined)}
|
||||
placeholder="0"
|
||||
className="text-xs w-full"
|
||||
/>
|
||||
</div>
|
||||
<div>
|
||||
<label className="block text-xs theme-text mb-1">Max Words</label>
|
||||
<Input
|
||||
type="number"
|
||||
value={filters.maxWordCount || ''}
|
||||
onChange={(e) => updateFilter('maxWordCount', e.target.value ? parseInt(e.target.value) : undefined)}
|
||||
placeholder="∞"
|
||||
className="text-xs w-full"
|
||||
/>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Word count range display */}
|
||||
{(filters.minWordCount || filters.maxWordCount) && (
|
||||
<div className="text-xs theme-text bg-white dark:bg-gray-700 p-2 rounded">
|
||||
Range: {filters.minWordCount || 0} - {filters.maxWordCount || '∞'} words
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Date Section */}
|
||||
<div className="space-y-2 mb-4">
|
||||
<button
|
||||
onClick={() => toggleSection('date')}
|
||||
className="flex items-center gap-2 text-sm font-medium theme-text hover:theme-accent transition-colors"
|
||||
>
|
||||
<span className={`transform transition-transform ${expandedSections.date ? 'rotate-90' : ''}`}>
|
||||
▶
|
||||
</span>
|
||||
📅 Date Added
|
||||
{(filters.createdAfter || filters.createdBefore) && (
|
||||
<span className="text-xs bg-blue-500 text-white px-1 rounded">●</span>
|
||||
)}
|
||||
</button>
|
||||
|
||||
{expandedSections.date && (
|
||||
<div className="pl-6 space-y-3 bg-gray-50 dark:bg-gray-800 p-3 rounded">
|
||||
<div className="space-y-3">
|
||||
<div>
|
||||
<label className="block text-xs theme-text mb-1">After Date</label>
|
||||
<Input
|
||||
type="date"
|
||||
value={filters.createdAfter || ''}
|
||||
onChange={(e) => updateFilter('createdAfter', e.target.value || undefined)}
|
||||
className="text-xs w-full"
|
||||
/>
|
||||
</div>
|
||||
<div>
|
||||
<label className="block text-xs theme-text mb-1">Before Date</label>
|
||||
<Input
|
||||
type="date"
|
||||
value={filters.createdBefore || ''}
|
||||
onChange={(e) => updateFilter('createdBefore', e.target.value || undefined)}
|
||||
className="text-xs w-full"
|
||||
/>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Rating Section */}
|
||||
<div className="space-y-2 mb-4">
|
||||
<button
|
||||
onClick={() => toggleSection('rating')}
|
||||
className="flex items-center gap-2 text-sm font-medium theme-text hover:theme-accent transition-colors"
|
||||
>
|
||||
<span className={`transform transition-transform ${expandedSections.rating ? 'rotate-90' : ''}`}>
|
||||
▶
|
||||
</span>
|
||||
⭐ Rating
|
||||
{(filters.minRating || filters.maxRating || filters.unratedOnly) && (
|
||||
<span className="text-xs bg-blue-500 text-white px-1 rounded">●</span>
|
||||
)}
|
||||
</button>
|
||||
|
||||
{expandedSections.rating && (
|
||||
<div className="pl-6 space-y-3 bg-gray-50 dark:bg-gray-800 p-3 rounded">
|
||||
<div className="space-y-2">
|
||||
<label className="flex items-center gap-2">
|
||||
<input
|
||||
type="checkbox"
|
||||
checked={filters.unratedOnly || false}
|
||||
onChange={(e) => updateFilter('unratedOnly', e.target.checked || undefined)}
|
||||
/>
|
||||
<span className="text-xs theme-text">Unrated stories only</span>
|
||||
</label>
|
||||
</div>
|
||||
|
||||
{!filters.unratedOnly && (
|
||||
<div className="space-y-3">
|
||||
<div>
|
||||
<label className="block text-xs theme-text mb-1">Min Rating</label>
|
||||
<select
|
||||
value={filters.minRating || ''}
|
||||
onChange={(e) => updateFilter('minRating', e.target.value ? parseInt(e.target.value) : undefined)}
|
||||
className="w-full px-2 py-1 text-xs border rounded theme-card border-gray-300 dark:border-gray-600"
|
||||
>
|
||||
<option value="">No minimum</option>
|
||||
<option value="1">1 star</option>
|
||||
<option value="2">2 stars</option>
|
||||
<option value="3">3 stars</option>
|
||||
<option value="4">4 stars</option>
|
||||
<option value="5">5 stars</option>
|
||||
</select>
|
||||
</div>
|
||||
<div>
|
||||
<label className="block text-xs theme-text mb-1">Max Rating</label>
|
||||
<select
|
||||
value={filters.maxRating || ''}
|
||||
onChange={(e) => updateFilter('maxRating', e.target.value ? parseInt(e.target.value) : undefined)}
|
||||
className="w-full px-2 py-1 text-xs border rounded theme-card border-gray-300 dark:border-gray-600"
|
||||
>
|
||||
<option value="">No maximum</option>
|
||||
<option value="1">1 star</option>
|
||||
<option value="2">2 stars</option>
|
||||
<option value="3">3 stars</option>
|
||||
<option value="4">4 stars</option>
|
||||
<option value="5">5 stars</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Reading Status Section */}
|
||||
<div className="space-y-2 mb-4">
|
||||
<button
|
||||
onClick={() => toggleSection('reading')}
|
||||
className="flex items-center gap-2 text-sm font-medium theme-text hover:theme-accent transition-colors"
|
||||
>
|
||||
<span className={`transform transition-transform ${expandedSections.reading ? 'rotate-90' : ''}`}>
|
||||
▶
|
||||
</span>
|
||||
👁️ Reading Status
|
||||
{(filters.readingStatus && filters.readingStatus !== 'all') && (
|
||||
<span className="text-xs bg-blue-500 text-white px-1 rounded">●</span>
|
||||
)}
|
||||
</button>
|
||||
|
||||
{expandedSections.reading && (
|
||||
<div className="pl-6 space-y-2 bg-gray-50 dark:bg-gray-800 p-3 rounded">
|
||||
<div className="space-y-1">
|
||||
{[
|
||||
{ value: 'all', label: 'All stories' },
|
||||
{ value: 'unread', label: 'Unread' },
|
||||
{ value: 'started', label: 'Started reading' },
|
||||
{ value: 'completed', label: 'Completed' }
|
||||
].map(option => (
|
||||
<label key={option.value} className="flex items-center gap-2">
|
||||
<input
|
||||
type="radio"
|
||||
name="readingStatus"
|
||||
value={option.value}
|
||||
checked={(filters.readingStatus || 'all') === option.value}
|
||||
onChange={(e) => updateFilter('readingStatus', e.target.value as any)}
|
||||
/>
|
||||
<span className="text-xs theme-text">{option.label}</span>
|
||||
</label>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Content Section */}
|
||||
<div className="space-y-2 mb-4">
|
||||
<button
|
||||
onClick={() => toggleSection('content')}
|
||||
className="flex items-center gap-2 text-sm font-medium theme-text hover:theme-accent transition-colors"
|
||||
>
|
||||
<span className={`transform transition-transform ${expandedSections.content ? 'rotate-90' : ''}`}>
|
||||
▶
|
||||
</span>
|
||||
📚 Content
|
||||
{(filters.hasCoverImage || filters.seriesFilter !== 'all' || filters.sourceDomain) && (
|
||||
<span className="text-xs bg-blue-500 text-white px-1 rounded">●</span>
|
||||
)}
|
||||
</button>
|
||||
|
||||
{expandedSections.content && (
|
||||
<div className="pl-6 space-y-3 bg-gray-50 dark:bg-gray-800 p-3 rounded">
|
||||
<div className="space-y-2">
|
||||
<label className="flex items-center gap-2">
|
||||
<input
|
||||
type="checkbox"
|
||||
checked={filters.hasCoverImage || false}
|
||||
onChange={(e) => updateFilter('hasCoverImage', e.target.checked || undefined)}
|
||||
/>
|
||||
<span className="text-xs theme-text">Has cover image</span>
|
||||
</label>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<label className="block text-xs theme-text mb-1">Series Filter</label>
|
||||
<select
|
||||
value={filters.seriesFilter || 'all'}
|
||||
onChange={(e) => updateFilter('seriesFilter', e.target.value as any)}
|
||||
className="w-full px-2 py-1 text-xs border rounded theme-card border-gray-300 dark:border-gray-600"
|
||||
>
|
||||
<option value="all">All stories</option>
|
||||
<option value="standalone">Standalone only</option>
|
||||
<option value="series">Series only</option>
|
||||
<option value="firstInSeries">First in series</option>
|
||||
<option value="lastInSeries">Last in series</option>
|
||||
</select>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<label className="block text-xs theme-text mb-1">Source Domain</label>
|
||||
<Input
|
||||
type="text"
|
||||
value={filters.sourceDomain || ''}
|
||||
onChange={(e) => updateFilter('sourceDomain', e.target.value || undefined)}
|
||||
placeholder="e.g., archiveofourown.org"
|
||||
className="text-xs"
|
||||
/>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Advanced Options */}
|
||||
<div className="space-y-2">
|
||||
<div className="text-xs font-medium theme-text opacity-75 uppercase tracking-wide">
|
||||
Advanced
|
||||
</div>
|
||||
<div className="space-y-2 bg-gray-50 dark:bg-gray-800 p-3 rounded">
|
||||
<div>
|
||||
<label className="block text-xs theme-text mb-1">Minimum Tag Count</label>
|
||||
<Input
|
||||
type="number"
|
||||
value={filters.minTagCount || ''}
|
||||
onChange={(e) => updateFilter('minTagCount', e.target.value ? parseInt(e.target.value) : undefined)}
|
||||
placeholder="0"
|
||||
className="text-xs"
|
||||
min="0"
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div className="space-y-1">
|
||||
<label className="flex items-center gap-2">
|
||||
<input
|
||||
type="checkbox"
|
||||
checked={filters.popularOnly || false}
|
||||
onChange={(e) => updateFilter('popularOnly', e.target.checked || undefined)}
|
||||
/>
|
||||
<span className="text-xs theme-text">Popular stories only (above average rating)</span>
|
||||
</label>
|
||||
|
||||
<label className="flex items-center gap-2">
|
||||
<input
|
||||
type="checkbox"
|
||||
checked={filters.hiddenGemsOnly || false}
|
||||
onChange={(e) => updateFilter('hiddenGemsOnly', e.target.checked || undefined)}
|
||||
/>
|
||||
<span className="text-xs theme-text">Hidden gems (underrated/unrated)</span>
|
||||
</label>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
frontend/src/components/library/LibrarySettings.tsx (new file, 607 lines)
@@ -0,0 +1,607 @@
|
||||
'use client';
|
||||
|
||||
import React, { useState, useEffect } from 'react';
|
||||
import { useRouter } from 'next/navigation';
|
||||
import Button from '../ui/Button';
|
||||
import { Input } from '../ui/Input';
|
||||
import LibrarySwitchLoader from '../ui/LibrarySwitchLoader';
|
||||
import { useLibrarySwitch } from '../../hooks/useLibrarySwitch';
|
||||
import { setCurrentLibraryId, clearLibraryCache } from '../../lib/api';
|
||||
|
||||
interface Library {
|
||||
id: string;
|
||||
name: string;
|
||||
description: string;
|
||||
isActive: boolean;
|
||||
isInitialized: boolean;
|
||||
}
|
||||
|
||||
export default function LibrarySettings() {
|
||||
const router = useRouter();
|
||||
const { state: switchState, switchLibrary, clearError, reset } = useLibrarySwitch();
|
||||
|
||||
const [libraries, setLibraries] = useState<Library[]>([]);
|
||||
const [currentLibrary, setCurrentLibrary] = useState<Library | null>(null);
|
||||
const [loading, setLoading] = useState(true);
|
||||
const [switchPassword, setSwitchPassword] = useState('');
|
||||
const [showSwitchForm, setShowSwitchForm] = useState(false);
|
||||
const [passwordChangeForm, setPasswordChangeForm] = useState({
|
||||
currentPassword: '',
|
||||
newPassword: '',
|
||||
confirmPassword: ''
|
||||
});
|
||||
const [showPasswordChangeForm, setShowPasswordChangeForm] = useState(false);
|
||||
const [passwordChangeLoading, setPasswordChangeLoading] = useState(false);
|
||||
const [passwordChangeMessage, setPasswordChangeMessage] = useState<{type: 'success' | 'error', text: string} | null>(null);
|
||||
const [createLibraryForm, setCreateLibraryForm] = useState({
|
||||
name: '',
|
||||
description: '',
|
||||
password: '',
|
||||
confirmPassword: ''
|
||||
});
|
||||
const [showCreateLibraryForm, setShowCreateLibraryForm] = useState(false);
|
||||
const [createLibraryLoading, setCreateLibraryLoading] = useState(false);
|
||||
const [createLibraryMessage, setCreateLibraryMessage] = useState<{type: 'success' | 'error', text: string} | null>(null);
|
||||
|
||||
useEffect(() => {
|
||||
loadLibraries();
|
||||
loadCurrentLibrary();
|
||||
}, []);
|
||||
|
||||
const loadLibraries = async () => {
|
||||
try {
|
||||
const response = await fetch('/api/libraries');
|
||||
if (response.ok) {
|
||||
const data = await response.json();
|
||||
setLibraries(data);
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Failed to load libraries:', error);
|
||||
}
|
||||
};
|
||||
|
||||
const loadCurrentLibrary = async () => {
|
||||
try {
|
||||
const response = await fetch('/api/libraries/current');
|
||||
if (response.ok) {
|
||||
const data = await response.json();
|
||||
setCurrentLibrary(data);
|
||||
// Set the library ID for image URL generation
|
||||
setCurrentLibraryId(data.id);
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Failed to load current library:', error);
|
||||
} finally {
|
||||
setLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
const handleSwitchLibrary = async (e: React.FormEvent) => {
|
||||
e.preventDefault();
|
||||
|
||||
if (!switchPassword.trim()) {
|
||||
return;
|
||||
}
|
||||
|
||||
const success = await switchLibrary(switchPassword);
|
||||
if (success) {
|
||||
// The LibrarySwitchLoader will handle the rest
|
||||
}
|
||||
};
|
||||
|
||||
const handleSwitchComplete = () => {
|
||||
// Clear the library cache so images use the new library
|
||||
clearLibraryCache();
|
||||
// Refresh the page to reload with new library context
|
||||
router.refresh();
|
||||
window.location.reload();
|
||||
};
|
||||
|
||||
const handleSwitchError = (error: string) => {
|
||||
console.error('Library switch error:', error);
|
||||
reset();
|
||||
};
|
||||
|
||||
const handlePasswordChange = async (e: React.FormEvent) => {
|
||||
e.preventDefault();
|
||||
|
||||
if (passwordChangeForm.newPassword !== passwordChangeForm.confirmPassword) {
|
||||
setPasswordChangeMessage({type: 'error', text: 'New passwords do not match'});
|
||||
return;
|
||||
}
|
||||
|
||||
if (passwordChangeForm.newPassword.length < 8) {
|
||||
setPasswordChangeMessage({type: 'error', text: 'Password must be at least 8 characters long'});
|
||||
return;
|
||||
}
|
||||
|
||||
setPasswordChangeLoading(true);
|
||||
setPasswordChangeMessage(null);
|
||||
|
||||
try {
|
||||
const response = await fetch('/api/libraries/password', {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
body: JSON.stringify({
|
||||
currentPassword: passwordChangeForm.currentPassword,
|
||||
newPassword: passwordChangeForm.newPassword,
|
||||
}),
|
||||
});
|
||||
|
||||
const data = await response.json();
|
||||
|
||||
if (response.ok && data.success) {
|
||||
setPasswordChangeMessage({type: 'success', text: 'Password changed successfully'});
|
||||
setPasswordChangeForm({
|
||||
currentPassword: '',
|
||||
newPassword: '',
|
||||
confirmPassword: ''
|
||||
});
|
||||
setShowPasswordChangeForm(false);
|
||||
} else {
|
||||
setPasswordChangeMessage({type: 'error', text: data.error || 'Failed to change password'});
|
||||
}
|
||||
} catch (error) {
|
||||
setPasswordChangeMessage({type: 'error', text: 'Network error occurred'});
|
||||
} finally {
|
||||
setPasswordChangeLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
const handleCreateLibrary = async (e: React.FormEvent) => {
|
||||
e.preventDefault();
|
||||
|
||||
if (createLibraryForm.password !== createLibraryForm.confirmPassword) {
|
||||
setCreateLibraryMessage({type: 'error', text: 'Passwords do not match'});
|
||||
return;
|
||||
}
|
||||
|
||||
if (createLibraryForm.password.length < 8) {
|
||||
setCreateLibraryMessage({type: 'error', text: 'Password must be at least 8 characters long'});
|
||||
return;
|
||||
}
|
||||
|
||||
if (createLibraryForm.name.trim().length < 2) {
|
||||
setCreateLibraryMessage({type: 'error', text: 'Library name must be at least 2 characters long'});
|
||||
return;
|
||||
}
|
||||
|
||||
setCreateLibraryLoading(true);
|
||||
setCreateLibraryMessage(null);
|
||||
|
||||
try {
|
||||
const response = await fetch('/api/libraries/create', {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
body: JSON.stringify({
|
||||
name: createLibraryForm.name.trim(),
|
||||
description: createLibraryForm.description.trim(),
|
||||
password: createLibraryForm.password,
|
||||
}),
|
||||
});
|
||||
|
||||
const data = await response.json();
|
||||
|
||||
if (response.ok && data.success) {
|
||||
setCreateLibraryMessage({
|
||||
type: 'success',
|
||||
text: `Library "${data.library.name}" created successfully! You can now log out and log in with the new password to access it.`
|
||||
});
|
||||
setCreateLibraryForm({
|
||||
name: '',
|
||||
description: '',
|
||||
password: '',
|
||||
confirmPassword: ''
|
||||
});
|
||||
setShowCreateLibraryForm(false);
|
||||
loadLibraries(); // Refresh the library list
|
||||
} else {
|
||||
setCreateLibraryMessage({type: 'error', text: data.error || 'Failed to create library'});
|
||||
}
|
||||
} catch (error) {
|
||||
setCreateLibraryMessage({type: 'error', text: 'Network error occurred'});
|
||||
} finally {
|
||||
setCreateLibraryLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
if (loading) {
|
||||
return (
|
||||
<div className="bg-white dark:bg-gray-800 rounded-lg shadow p-6">
|
||||
<h2 className="text-xl font-semibold mb-4 text-gray-900 dark:text-white">
|
||||
Library Settings
|
||||
</h2>
|
||||
<div className="animate-pulse">
|
||||
<div className="h-4 bg-gray-300 dark:bg-gray-600 rounded w-1/4 mb-2"></div>
|
||||
<div className="h-4 bg-gray-300 dark:bg-gray-600 rounded w-1/2"></div>
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
return (
|
||||
<>
|
||||
<div className="bg-white dark:bg-gray-800 rounded-lg shadow p-6">
|
||||
<h2 className="text-xl font-semibold mb-4 text-gray-900 dark:text-white">
|
||||
Library Settings
|
||||
</h2>
|
||||
|
||||
{/* Current Library Info */}
|
||||
{currentLibrary && (
|
||||
<div className="mb-6 p-4 bg-blue-50 dark:bg-blue-900/20 rounded-lg">
|
||||
<h3 className="font-medium text-blue-900 dark:text-blue-100 mb-1">
|
||||
Active Library
|
||||
</h3>
|
||||
<p className="text-blue-700 dark:text-blue-300 text-sm">
|
||||
<strong>{currentLibrary.name}</strong>
|
||||
</p>
|
||||
<p className="text-blue-600 dark:text-blue-400 text-xs mt-1">
|
||||
{currentLibrary.description}
|
||||
</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Change Password Section */}
|
||||
<div className="mb-6 border-t pt-4">
|
||||
<h3 className="font-medium text-gray-900 dark:text-white mb-3">
|
||||
Change Library Password
|
||||
</h3>
|
||||
|
||||
{passwordChangeMessage && (
|
||||
<div className={`p-3 rounded-lg mb-4 ${
|
||||
passwordChangeMessage.type === 'success'
|
||||
? 'bg-green-50 dark:bg-green-900/20 border border-green-200 dark:border-green-800'
|
||||
: 'bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800'
|
||||
}`}>
|
||||
<p className={`text-sm ${
|
||||
passwordChangeMessage.type === 'success'
|
||||
? 'text-green-700 dark:text-green-300'
|
||||
: 'text-red-700 dark:text-red-300'
|
||||
}`}>
|
||||
{passwordChangeMessage.text}
|
||||
</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{!showPasswordChangeForm ? (
|
||||
<div>
|
||||
<p className="text-sm text-gray-600 dark:text-gray-300 mb-3">
|
||||
Change the password for the current library ({currentLibrary?.name}).
|
||||
</p>
|
||||
<Button
|
||||
onClick={() => setShowPasswordChangeForm(true)}
|
||||
variant="secondary"
|
||||
>
|
||||
Change Password
|
||||
</Button>
|
||||
</div>
|
||||
) : (
|
||||
<form onSubmit={handlePasswordChange} className="space-y-4">
|
||||
<div>
|
||||
<label className="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
|
||||
Current Password
|
||||
</label>
|
||||
<Input
|
||||
type="password"
|
||||
value={passwordChangeForm.currentPassword}
|
||||
onChange={(e: React.ChangeEvent<HTMLInputElement>) =>
|
||||
setPasswordChangeForm(prev => ({ ...prev, currentPassword: e.target.value }))
|
||||
}
|
||||
placeholder="Enter current password"
|
||||
required
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<label className="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
|
||||
New Password
|
||||
</label>
|
||||
<Input
|
||||
type="password"
|
||||
value={passwordChangeForm.newPassword}
|
||||
onChange={(e: React.ChangeEvent<HTMLInputElement>) =>
|
||||
setPasswordChangeForm(prev => ({ ...prev, newPassword: e.target.value }))
|
||||
}
|
||||
placeholder="Enter new password (min 8 characters)"
|
||||
required
|
||||
minLength={8}
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<label className="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
|
||||
Confirm New Password
|
||||
</label>
|
||||
<Input
|
||||
type="password"
|
||||
value={passwordChangeForm.confirmPassword}
|
||||
onChange={(e: React.ChangeEvent<HTMLInputElement>) =>
|
||||
setPasswordChangeForm(prev => ({ ...prev, confirmPassword: e.target.value }))
|
||||
}
|
||||
placeholder="Confirm new password"
|
||||
required
|
||||
minLength={8}
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div className="flex space-x-3">
|
||||
<Button
|
||||
type="submit"
|
||||
disabled={passwordChangeLoading}
|
||||
loading={passwordChangeLoading}
|
||||
>
|
||||
{passwordChangeLoading ? 'Changing...' : 'Change Password'}
|
||||
</Button>
|
||||
<Button
|
||||
type="button"
|
||||
variant="secondary"
|
||||
onClick={() => {
|
||||
setShowPasswordChangeForm(false);
|
||||
setPasswordChangeForm({
|
||||
currentPassword: '',
|
||||
newPassword: '',
|
||||
confirmPassword: ''
|
||||
});
|
||||
setPasswordChangeMessage(null);
|
||||
}}
|
||||
>
|
||||
Cancel
|
||||
</Button>
|
||||
</div>
|
||||
</form>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Available Libraries */}
|
||||
<div className="mb-6">
|
||||
<h3 className="font-medium text-gray-900 dark:text-white mb-3">
|
||||
Available Libraries
|
||||
</h3>
|
||||
<div className="space-y-2">
|
||||
{libraries.map((library) => (
|
||||
<div
|
||||
key={library.id}
|
||||
className={`p-3 rounded-lg border ${
|
||||
library.isActive
|
||||
? 'border-blue-200 bg-blue-50 dark:border-blue-800 dark:bg-blue-900/20'
|
||||
: 'border-gray-200 bg-gray-50 dark:border-gray-700 dark:bg-gray-900/50'
|
||||
}`}
|
||||
>
|
||||
<div className="flex items-center justify-between">
|
||||
<div>
|
||||
<p className="font-medium text-gray-900 dark:text-white">
|
||||
{library.name}
|
||||
{library.isActive && (
|
||||
<span className="ml-2 text-xs px-2 py-1 bg-blue-100 dark:bg-blue-800 text-blue-800 dark:text-blue-200 rounded-full">
|
||||
Active
|
||||
</span>
|
||||
)}
|
||||
</p>
|
||||
<p className="text-sm text-gray-600 dark:text-gray-300">
|
||||
{library.description}
|
||||
</p>
|
||||
</div>
|
||||
|
||||
{!library.isActive && (
|
||||
<div className="text-xs text-gray-500 dark:text-gray-400">
|
||||
ID: {library.id}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Switch Library Section */}
|
||||
<div className="border-t pt-4">
|
||||
<h3 className="font-medium text-gray-900 dark:text-white mb-3">
|
||||
Switch Library
|
||||
</h3>
|
||||
|
||||
{!showSwitchForm ? (
|
||||
<div>
|
||||
<p className="text-sm text-gray-600 dark:text-gray-300 mb-3">
|
||||
Enter the password for a different library to switch to it.
|
||||
</p>
|
||||
<Button
|
||||
onClick={() => setShowSwitchForm(true)}
|
||||
variant="secondary"
|
||||
>
|
||||
Switch to Different Library
|
||||
</Button>
|
||||
</div>
|
||||
) : (
|
||||
<form onSubmit={handleSwitchLibrary} className="space-y-4">
|
||||
<div>
|
||||
<label className="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
|
||||
Library Password
|
||||
</label>
|
||||
<Input
|
||||
type="password"
|
||||
value={switchPassword}
|
||||
onChange={(e: React.ChangeEvent<HTMLInputElement>) => setSwitchPassword(e.target.value)}
|
||||
placeholder="Enter password for the library you want to access"
|
||||
required
|
||||
/>
|
||||
</div>
|
||||
|
||||
{switchState.error && (
|
||||
<div className="p-3 bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-lg">
|
||||
<p className="text-sm text-red-700 dark:text-red-300">
|
||||
{switchState.error}
|
||||
</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
<div className="flex space-x-3">
|
||||
<Button type="submit" disabled={switchState.isLoading}>
|
||||
{switchState.isLoading ? 'Switching...' : 'Switch Library'}
|
||||
</Button>
|
||||
<Button
|
||||
type="button"
|
||||
variant="secondary"
|
||||
onClick={() => {
|
||||
setShowSwitchForm(false);
|
||||
setSwitchPassword('');
|
||||
clearError();
|
||||
}}
|
||||
>
|
||||
Cancel
|
||||
</Button>
|
||||
</div>
|
||||
</form>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Create New Library Section */}
|
||||
<div className="border-t pt-4 mb-6">
|
||||
<h3 className="font-medium text-gray-900 dark:text-white mb-3">
|
||||
Create New Library
|
||||
</h3>
|
||||
|
||||
{createLibraryMessage && (
|
||||
<div className={`p-3 rounded-lg mb-4 ${
|
||||
createLibraryMessage.type === 'success'
|
||||
? 'bg-green-50 dark:bg-green-900/20 border border-green-200 dark:border-green-800'
|
||||
: 'bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800'
|
||||
}`}>
|
||||
<p className={`text-sm ${
|
||||
createLibraryMessage.type === 'success'
|
||||
? 'text-green-700 dark:text-green-300'
|
||||
: 'text-red-700 dark:text-red-300'
|
||||
}`}>
|
||||
{createLibraryMessage.text}
|
||||
</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{!showCreateLibraryForm ? (
|
||||
<div>
|
||||
<p className="text-sm text-gray-600 dark:text-gray-300 mb-3">
|
||||
Create a completely separate library with its own stories, authors, and password.
|
||||
</p>
|
||||
<Button
|
||||
onClick={() => setShowCreateLibraryForm(true)}
|
||||
variant="secondary"
|
||||
>
|
||||
Create New Library
|
||||
</Button>
|
||||
</div>
|
||||
) : (
|
||||
<form onSubmit={handleCreateLibrary} className="space-y-4">
|
||||
<div>
|
||||
<label className="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
|
||||
Library Name *
|
||||
</label>
|
||||
<Input
|
||||
type="text"
|
||||
value={createLibraryForm.name}
|
||||
onChange={(e: React.ChangeEvent<HTMLInputElement>) =>
|
||||
setCreateLibraryForm(prev => ({ ...prev, name: e.target.value }))
|
||||
}
|
||||
placeholder="e.g., Private Stories, Work Collection"
|
||||
required
|
||||
minLength={2}
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<label className="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
|
||||
Description
|
||||
</label>
|
||||
<Input
|
||||
type="text"
|
||||
value={createLibraryForm.description}
|
||||
onChange={(e: React.ChangeEvent<HTMLInputElement>) =>
|
||||
setCreateLibraryForm(prev => ({ ...prev, description: e.target.value }))
|
||||
}
|
||||
placeholder="Optional description for this library"
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<label className="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
|
||||
Password *
|
||||
</label>
|
||||
<Input
|
||||
type="password"
|
||||
value={createLibraryForm.password}
|
||||
onChange={(e: React.ChangeEvent<HTMLInputElement>) =>
|
||||
setCreateLibraryForm(prev => ({ ...prev, password: e.target.value }))
|
||||
}
|
||||
placeholder="Enter password (min 8 characters)"
|
||||
required
|
||||
minLength={8}
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<label className="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">
|
||||
Confirm Password *
|
||||
</label>
|
||||
<Input
|
||||
type="password"
|
||||
value={createLibraryForm.confirmPassword}
|
||||
onChange={(e: React.ChangeEvent<HTMLInputElement>) =>
|
||||
setCreateLibraryForm(prev => ({ ...prev, confirmPassword: e.target.value }))
|
||||
}
|
||||
placeholder="Confirm password"
|
||||
required
|
||||
minLength={8}
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div className="flex space-x-3">
|
||||
<Button
|
||||
type="submit"
|
||||
disabled={createLibraryLoading}
|
||||
loading={createLibraryLoading}
|
||||
>
|
||||
{createLibraryLoading ? 'Creating...' : 'Create Library'}
|
||||
</Button>
|
||||
<Button
|
||||
type="button"
|
||||
variant="secondary"
|
||||
onClick={() => {
|
||||
setShowCreateLibraryForm(false);
|
||||
setCreateLibraryForm({
|
||||
name: '',
|
||||
description: '',
|
||||
password: '',
|
||||
confirmPassword: ''
|
||||
});
|
||||
setCreateLibraryMessage(null);
|
||||
}}
|
||||
>
|
||||
Cancel
|
||||
</Button>
|
||||
</div>
|
||||
</form>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Info Box */}
|
||||
<div className="mt-6 p-4 bg-yellow-50 dark:bg-yellow-900/20 rounded-lg">
|
||||
<p className="text-sm text-yellow-800 dark:text-yellow-200">
|
||||
<strong>Note:</strong> Libraries are completely separate datasets. Switching libraries
|
||||
will reload the application with a different set of stories, authors, and settings.
|
||||
Each library has its own password for security.
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Library Switch Loader */}
|
||||
<LibrarySwitchLoader
|
||||
isVisible={switchState.isLoading}
|
||||
targetLibraryName={switchState.targetLibraryName || undefined}
|
||||
onComplete={handleSwitchComplete}
|
||||
onError={handleSwitchError}
|
||||
/>
|
||||
</>
|
||||
);
|
||||
}
|
||||
Some files were not shown because too many files have changed in this diff.