Compare commits
51 Commits
feature/co...feature/em

| Author | SHA1 | Date |
|---|---|---|
|  | c7b516be31 |  |
|  | c92308c24a |  |
|  | f92dcc5314 |  |
|  | 702fcb33c1 |  |
|  | 11b2a8b071 |  |
|  | d1289bd616 |  |
|  | 15708b5ab2 |  |
|  | a660056003 |  |
|  | 35a5825e76 |  |
|  | 87a4999ffe |  |
|  | 4ee5fa2330 |  |
|  | 6128d61349 |  |
|  | 5e347f2e2e |  |
|  | 8eb126a304 |  |
|  | 3dc02420fe |  |
|  | 241a15a174 |  |
|  | 6b97c0a70f |  |
|  | e952241e3c |  |
|  | 65f1c6edc7 |  |
|  | 40fe3fdb80 |  |
|  | 95ce5fb532 |  |
|  | 1a99d9830d |  |
|  | 6b83783381 |  |
|  | 460ec358ca |  |
|  | 1d14d3d7aa |  |
|  | 4357351ec8 |  |
|  | 4ab03953ae |  |
|  | 142d8328c2 |  |
|  | c46108c317 |  |
|  | 75c207970d |  |
|  | 3b22d155db |  |
|  | 51e3d20c24 |  |
|  | 5d195b63ef |  |
|  | 5b3a9d183e |  |
|  | 379c8c170f |  |
|  | 090b858a54 |  |
|  | b0c14d4b37 |  |
|  | 7227061d25 |  |
|  | 415eab07de |  |
|  | e89331e059 |  |
|  | 370bef2f07 |  |
|  | 9e788c2018 |  |
|  | 590e2590d6 |  |
|  | 57859d7a84 |  |
|  | 5746001c4a |  |
|  | c08082c0d6 |  |
|  | 860bf02d56 |  |
|  | a501b27169 |  |
|  | fcad028959 |  |
|  | f95d7aa8bb |  |
|  | 5e8164c6a4 |  |
1 .gitignore (vendored)
@@ -47,3 +47,4 @@ Thumbs.db
 # Application data
 images/
 data/
+backend/cookies.txt

Binary file not shown. (Before: 37 KiB)
466 EPUB_IMPORT_EXPORT_SPECIFICATION.md (new file)
@@ -0,0 +1,466 @@
# EPUB Import/Export Specification

## 🎉 Phase 1 & 2 Implementation Complete

**Status**: Both Phase 1 and Phase 2 fully implemented and operational as of August 2025

**Phase 1 Achievements**:
- ✅ Complete EPUB import functionality with validation and error handling
- ✅ Single story EPUB export with XML validation fixes
- ✅ Reading position preservation using EPUB CFI standards
- ✅ Full frontend UI integration with navigation and authentication
- ✅ Moved export button to Story Detail View for better UX
- ✅ Added EPUB import to main Add Story menu dropdown

**Phase 2 Enhancements**:
- ✅ **Enhanced Cover Processing**: Automatic extraction and optimization of cover images during EPUB import
- ✅ **Advanced Metadata Extraction**: Comprehensive extraction of subjects/tags, keywords, publisher, language, publication dates, and identifiers
- ✅ **Collection EPUB Export**: Full collection export with table of contents, proper chapter structure, and metadata aggregation
- ✅ **Image Validation**: Robust cover image processing with format detection, resizing, and storage management
- ✅ **API Endpoints**: Complete REST API for both individual story and collection EPUB operations

## Overview

This specification defines the requirements and implementation details for importing and exporting EPUB files in StoryCove. The feature enables users to import stories from EPUB files and export their stories/collections as EPUB files with preserved reading positions.

## Scope

### In Scope
- **EPUB Import**: Parse DRM-free EPUB files and import as stories
- **EPUB Export**: Export individual stories and collections as EPUB files
- **Reading Position Preservation**: Store and restore reading positions using EPUB standards
- **Metadata Handling**: Extract and preserve story metadata (title, author, cover, etc.)
- **Content Processing**: HTML content sanitization and formatting

### Out of Scope (Phase 1)
- DRM-protected EPUB files (future consideration)
- Real-time reading position sync between devices
- Advanced EPUB features (audio, video, interactive content)
- EPUB validation beyond basic structure

## Technical Architecture

### Backend Implementation
- **Language**: Java (Spring Boot)
- **Primary Library**: EPUBLib (nl.siegmann.epublib:epublib-core:3.1)
- **Processing**: Server-side generation and parsing
- **File Handling**: Multipart file upload for import, streaming download for export

### Dependencies
```xml
<dependency>
    <groupId>com.positiondev.epublib</groupId>
    <artifactId>epublib-core</artifactId>
    <version>3.1</version>
</dependency>
```

### Phase 1 Implementation Notes
- **EPUBImportService**: Implemented with full validation, metadata extraction, and reading position handling
- **EPUBExportService**: Implemented with XML validation fixes for EPUB reader compatibility
- **ReadingPosition Entity**: Created with EPUB CFI support and database indexing
- **Authentication**: All endpoints secured with JWT authentication and proper frontend integration
- **UI Integration**: Export moved to Story Detail View, Import added to main navigation menu
- **XML Compliance**: Fixed XHTML validation issues by properly formatting self-closing tags (`<br>` → `<br />`)
## EPUB Import Specification

### Supported Formats
- **EPUB 2.0** and **EPUB 3.x** formats
- **DRM-free** files only
- **Maximum file size**: 50MB
- **Supported content**: Text-based stories with HTML content

### Import Process Flow
1. **File Upload**: User uploads EPUB file via web interface
2. **Validation**: Check file format, size, and basic EPUB structure (see the sketch after this list)
3. **Parsing**: Extract metadata, content, and resources using EPUBLib
4. **Content Processing**: Sanitize HTML content using existing Jsoup pipeline
5. **Story Creation**: Create Story entity with extracted data
6. **Preview**: Show extracted story details for user confirmation
7. **Finalization**: Save story to database with imported metadata
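
A minimal sketch of the validation step, assuming the 50MB limit above and epublib's `EpubReader`; the error messages mirror the Error Handling section, but the exact exception types are illustrative:

```java
import nl.siegmann.epublib.epub.EpubReader;
import org.springframework.web.multipart.MultipartFile;

import java.io.IOException;
import java.io.InputStream;

private static final long MAX_EPUB_SIZE = 50L * 1024 * 1024; // 50MB limit from this spec

private void validateEPUBFile(MultipartFile file) throws IOException {
    if (file.isEmpty()) {
        throw new IllegalArgumentException("EPUB contains no readable content");
    }
    if (file.getSize() > MAX_EPUB_SIZE) {
        throw new IllegalArgumentException("File size exceeds 50MB limit");
    }
    String name = file.getOriginalFilename();
    if (name == null || !name.toLowerCase().endsWith(".epub")) {
        throw new IllegalArgumentException("Invalid EPUB file format");
    }
    // Basic structure check: epublib fails to read anything that is not a valid EPUB container.
    try (InputStream in = file.getInputStream()) {
        new EpubReader().readEpub(in);
    } catch (Exception e) {
        throw new IllegalArgumentException("EPUB file appears to be corrupted", e);
    }
}
```
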
### Metadata Mapping
```java
// EPUB Metadata → StoryCove Story Entity
epub.getMetadata().getFirstTitle()          → story.title
epub.getMetadata().getAuthors().get(0)      → story.authorName
epub.getMetadata().getDescriptions().get(0) → story.summary
epub.getCoverImage()                        → story.coverPath
epub.getMetadata().getSubjects()            → story.tags
```
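
A hedged sketch of the mapping above with basic null/empty guards; the `Story` setters are assumed to mirror the entity fields named in this spec:

```java
import nl.siegmann.epublib.domain.Author;
import nl.siegmann.epublib.domain.Book;
import nl.siegmann.epublib.domain.Metadata;

private Story extractStoryData(Book epub) {
    Metadata meta = epub.getMetadata();
    Story story = new Story();
    story.setTitle(meta.getFirstTitle());
    if (!meta.getAuthors().isEmpty()) {
        Author author = meta.getAuthors().get(0);
        story.setAuthorName((author.getFirstname() + " " + author.getLastname()).trim());
    }
    if (!meta.getDescriptions().isEmpty()) {
        story.setSummary(meta.getDescriptions().get(0));
    }
    // Subjects map onto tags; cover handling is done separately during import.
    return story;
}
```
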
### Content Extraction
- **Multi-chapter EPUBs**: Combine all content files into a single HTML document (see the sketch after this list)
- **Chapter separation**: Insert `<hr>` or `<h2>` tags between chapters
- **HTML sanitization**: Apply existing sanitization rules
- **Image handling**: Extract and store cover images; inline images optional
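
A sketch of the multi-chapter combination, assuming epublib's `Book#getContents()` returns the spine resources in reading order and that the existing Jsoup pipeline performs the final sanitization:

```java
import nl.siegmann.epublib.domain.Book;
import nl.siegmann.epublib.domain.Resource;
import org.jsoup.Jsoup;

import java.io.IOException;
import java.nio.charset.StandardCharsets;

private String combineChapterContent(Book epub) throws IOException {
    StringBuilder combined = new StringBuilder();
    for (Resource chapter : epub.getContents()) {
        String xhtml = new String(chapter.getData(), StandardCharsets.UTF_8);
        String body = Jsoup.parse(xhtml).body().html(); // keep only the body markup of each file
        if (combined.length() > 0) {
            combined.append("<hr />"); // chapter separator, as listed above
        }
        combined.append(body);
    }
    return combined.toString();
}
```
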
### API Endpoints

#### POST /api/stories/import-epub
```java
@PostMapping("/import-epub")
public ResponseEntity<?> importEPUB(@RequestParam("file") MultipartFile file) {
    // Implementation in EPUBImportService
}
```

**Request**: Multipart file upload

**Response**:
```json
{
  "message": "EPUB imported successfully",
  "storyId": "uuid",
  "extractedData": {
    "title": "Story Title",
    "author": "Author Name",
    "summary": "Story description",
    "chapterCount": 12,
    "wordCount": 45000,
    "hasCovers": true
  }
}
```

## EPUB Export Specification

### Export Types
1. **Single Story Export**: Convert one story to EPUB
2. **Collection Export**: Multiple stories as single EPUB with chapters

### EPUB Structure Generation
```
story.epub
├── mimetype
├── META-INF/
│   └── container.xml
└── OEBPS/
    ├── content.opf       # Package metadata
    ├── toc.ncx           # Navigation
    ├── stylesheet.css    # Styling
    ├── cover.html        # Cover page
    ├── chapter001.xhtml  # Story content
    ├── images/
    │   └── cover.jpg     # Cover image
    └── fonts/ (optional)
```

### Reading Position Implementation

#### EPUB 3 CFI (Canonical Fragment Identifier)
```xml
<!-- In content.opf metadata -->
<meta property="epub-cfi" content="/6/4[chap01]!/4[body01]/10[para05]/3:142"/>
<meta property="reading-percentage" content="0.65"/>
<meta property="last-read-timestamp" content="2023-12-07T10:30:00Z"/>
```

#### StoryCove Custom Metadata (Fallback)
```xml
<meta name="storycove:reading-chapter" content="3"/>
<meta name="storycove:reading-paragraph" content="15"/>
<meta name="storycove:reading-offset" content="142"/>
<meta name="storycove:reading-percentage" content="0.65"/>
```

#### CFI Generation Logic
```java
public String generateCFI(ReadingPosition position) {
    return String.format("/6/%d[chap%02d]!/4[body01]/%d[para%02d]/3:%d",
        (position.getChapterIndex() * 2) + 4,
        position.getChapterIndex(),
        (position.getParagraphIndex() * 2) + 4,
        position.getParagraphIndex(),
        position.getCharacterOffset());
}
```
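
The counterpart to `generateCFI` — a sketch that recovers the stored indices from a CFI written in exactly that format. It is not a general EPUB CFI parser; CFIs produced by other readers would fall back to the custom metadata above:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

private static final Pattern STORYCOVE_CFI =
        Pattern.compile("/6/\\d+\\[chap(\\d+)\\]!/4\\[body01\\]/\\d+\\[para(\\d+)\\]/3:(\\d+)");

public ReadingPosition parseCFI(String cfi) {
    Matcher m = STORYCOVE_CFI.matcher(cfi);
    if (!m.matches()) {
        throw new IllegalArgumentException("Unrecognized CFI: " + cfi);
    }
    ReadingPosition position = new ReadingPosition();
    position.setChapterIndex(Integer.parseInt(m.group(1)));
    position.setParagraphIndex(Integer.parseInt(m.group(2)));
    position.setCharacterOffset(Integer.parseInt(m.group(3)));
    return position;
}
```
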
### API Endpoints

#### GET /api/stories/{id}/export-epub
```java
@GetMapping("/{id}/export-epub")
public ResponseEntity<StreamingResponseBody> exportStory(@PathVariable UUID id) {
    // Implementation in EPUBExportService
}
```

**Response**: EPUB file download with headers:
```
Content-Type: application/epub+zip
Content-Disposition: attachment; filename="story-title.epub"
```
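
One way the controller could attach the headers above while streaming the generated file — a sketch only; the injected `epubExportService` field and the default `EPUBExportOptions` are assumptions, not the final implementation:

```java
import org.springframework.http.HttpHeaders;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.web.servlet.mvc.method.annotation.StreamingResponseBody;

@GetMapping("/{id}/export-epub")
public ResponseEntity<StreamingResponseBody> exportStory(@PathVariable UUID id) {
    byte[] epub = epubExportService.exportSingleStory(id, new EPUBExportOptions());
    StreamingResponseBody body = outputStream -> outputStream.write(epub);
    return ResponseEntity.ok()
            .contentType(MediaType.parseMediaType("application/epub+zip"))
            .header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"story-title.epub\"")
            .body(body);
}
```
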
#### GET /api/collections/{id}/export-epub
```java
@GetMapping("/{id}/export-epub")
public ResponseEntity<StreamingResponseBody> exportCollection(@PathVariable UUID id) {
    // Implementation in EPUBExportService
}
```

**Response**: Multi-story EPUB with table of contents
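
A sketch of the table-of-contents assembly with epublib's `Book#addSection`; the `collection.getStories()` accessor and the `xhtmlFor(...)` helper are illustrative, not the actual service API:

```java
import nl.siegmann.epublib.domain.Book;
import nl.siegmann.epublib.domain.Resource;

import java.nio.charset.StandardCharsets;

private Book buildCollectionBook(Collection collection) {
    Book book = new Book();
    book.getMetadata().addTitle(collection.getName());
    int index = 1;
    for (Story story : collection.getStories()) {
        String href = String.format("chapter%03d.xhtml", index++);
        Resource chapter = new Resource(xhtmlFor(story).getBytes(StandardCharsets.UTF_8), href);
        // addSection registers the resource in the spine and adds a TOC entry titled after the story
        book.addSection(story.getTitle(), chapter);
    }
    return book;
}
```
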
## Data Models

### ReadingPosition Entity
```java
@Entity
@Table(name = "reading_positions")
public class ReadingPosition {
    @Id
    private UUID id;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "story_id")
    private Story story;

    @Column(name = "chapter_index")
    private Integer chapterIndex = 0;

    @Column(name = "paragraph_index")
    private Integer paragraphIndex = 0;

    @Column(name = "character_offset")
    private Integer characterOffset = 0;

    @Column(name = "progress_percentage")
    private Double progressPercentage = 0.0;

    @Column(name = "epub_cfi")
    private String canonicalFragmentIdentifier;

    @Column(name = "last_read_at")
    private LocalDateTime lastReadAt;

    @Column(name = "device_identifier")
    private String deviceIdentifier;

    // Constructors, getters, setters
}
```

### EPUB Import Request DTO
```java
public class EPUBImportRequest {
    private String filename;
    private Long fileSize;
    private Boolean preserveChapterStructure = true;
    private Boolean extractCover = true;
    private String targetCollectionId; // Optional: add to specific collection
}
```

### EPUB Export Options DTO
```java
public class EPUBExportOptions {
    private Boolean includeReadingPosition = true;
    private Boolean includeCoverImage = true;
    private Boolean includeMetadata = true;
    private String cssStylesheet; // Optional custom CSS
    private EPUBVersion version = EPUBVersion.EPUB3;
}
```

## Service Layer Architecture

### EPUBImportService
```java
@Service
public class EPUBImportService {

    // Core import method
    public Story importEPUBFile(MultipartFile file, EPUBImportRequest request);

    // Helper methods
    private void validateEPUBFile(MultipartFile file);
    private Book parseEPUBStructure(InputStream inputStream);
    private Story extractStoryData(Book epub);
    private String combineChapterContent(Book epub);
    private void extractAndSaveCover(Book epub, Story story);
    private List<String> extractTags(Book epub);
    private ReadingPosition extractReadingPosition(Book epub);
}
```

### EPUBExportService
```java
@Service
public class EPUBExportService {

    // Core export methods
    public byte[] exportSingleStory(UUID storyId, EPUBExportOptions options);
    public byte[] exportCollection(UUID collectionId, EPUBExportOptions options);

    // Helper methods
    private Book createEPUBStructure(Story story, ReadingPosition position);
    private Book createCollectionEPUB(Collection collection, List<ReadingPosition> positions);
    private void addReadingPositionMetadata(Book book, ReadingPosition position);
    private String generateCFI(ReadingPosition position);
    private Resource createChapterResource(Story story);
    private Resource createStylesheetResource();
    private void addCoverImage(Book book, Story story);
}
```

## Frontend Integration

### Import UI Flow
1. **Upload Interface**: File input with EPUB validation
2. **Progress Indicator**: Show parsing progress
3. **Preview Screen**: Display extracted metadata for confirmation
4. **Confirmation**: Allow editing of title, author, summary before saving
5. **Success**: Redirect to created story

### Export UI Flow
1. **Export Button**: Available on story detail and collection pages
2. **Options Modal**: Allow selection of export options
3. **Progress Indicator**: Show EPUB generation progress
4. **Download**: Automatic file download on completion
### Frontend API Calls
```typescript
// Import EPUB
const importEPUB = async (file: File) => {
  const formData = new FormData();
  formData.append('file', file);

  const response = await fetch('/api/stories/import-epub', {
    method: 'POST',
    body: formData,
  });

  return await response.json();
};

// Export Story
const exportStoryEPUB = async (storyId: string, storyTitle: string) => {
  const response = await fetch(`/api/stories/${storyId}/export-epub`, {
    method: 'GET',
  });

  const blob = await response.blob();
  const url = window.URL.createObjectURL(blob);
  const a = document.createElement('a');
  a.href = url;
  a.download = `${storyTitle}.epub`;
  a.click();
  window.URL.revokeObjectURL(url);
};
```
## Error Handling

### Import Errors
- **Invalid EPUB format**: "Invalid EPUB file format"
- **File too large**: "File size exceeds 50MB limit"
- **DRM protected**: "DRM-protected EPUBs not supported"
- **Corrupted file**: "EPUB file appears to be corrupted"
- **No content**: "EPUB contains no readable content"

### Export Errors
- **Story not found**: "Story not found or access denied"
- **Missing content**: "Story has no content to export"
- **Generation failure**: "Failed to generate EPUB file"

## Security Considerations

### File Upload Security
- **File type validation**: Verify EPUB MIME type and structure
- **Size limits**: Enforce maximum file size limits
- **Content sanitization**: Apply existing HTML sanitization
- **Virus scanning**: Consider integration with antivirus scanning

### Content Security
- **HTML sanitization**: Apply existing Jsoup rules to imported content (see the sketch after this list)
- **Image validation**: Validate extracted cover images
- **Metadata escaping**: Escape special characters in metadata
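
For illustration, the Jsoup step could look like the sketch below; the project's real allow-list lives in the existing sanitization pipeline, so `Safelist.relaxed()` (plus `hr` for chapter separators) is only an assumption:

```java
import org.jsoup.Jsoup;
import org.jsoup.safety.Safelist;

String sanitizeImportedHtml(String rawHtml) {
    // Allow-list based cleaning strips scripts, event handlers, and unknown tags.
    return Jsoup.clean(rawHtml, Safelist.relaxed().addTags("hr"));
}
```
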
## Testing Strategy

### Unit Tests
- EPUB parsing and validation logic
- CFI generation and parsing
- Metadata extraction accuracy
- Content sanitization

### Integration Tests
- End-to-end import/export workflow
- Reading position preservation
- Multi-story collection export
- Error handling scenarios

### Test Data
- Sample EPUB files for various scenarios
- EPUBs with and without reading positions
- Multi-chapter EPUBs
- EPUBs with covers and metadata

## Performance Considerations

### Import Performance
- **Streaming processing**: Process large EPUBs without loading entirely into memory
- **Async processing**: Consider async import for large files
- **Progress tracking**: Provide progress feedback for large imports

### Export Performance
- **Caching**: Cache generated EPUBs for repeated exports
- **Streaming**: Stream EPUB generation for large collections
- **Resource optimization**: Optimize image and content sizes

## Future Enhancements (Out of Scope)

### Phase 2 Considerations
- **DRM support**: Research legal and technical feasibility
- **Reading position sync**: Real-time sync across devices
- **Advanced EPUB features**: Enhanced typography, annotations
- **Bulk operations**: Import/export multiple EPUBs
- **EPUB validation**: Full EPUB compliance checking

### Integration Possibilities
- **Cloud storage**: Export directly to Google Drive, Dropbox
- **E-reader sync**: Direct sync with Kindle, Kobo devices
- **Reading analytics**: Track reading patterns and statistics

## Implementation Phases

### Phase 1: Core Functionality ✅ **COMPLETED**
- [x] Basic EPUB import (DRM-free)
- [x] Single story export
- [x] Reading position storage and retrieval
- [x] Frontend UI integration

### Phase 2: Enhanced Features ✅ **COMPLETED**
- [x] Collection export with table of contents
- [x] Advanced metadata handling (subjects, keywords, publisher, language, etc.)
- [x] Enhanced cover image processing for import/export
- [x] Comprehensive error handling

### Phase 3: Advanced Features
- [ ] DRM exploration (legal research required)
- [ ] Reading position sync
- [ ] Advanced EPUB features
- [ ] Analytics and reporting

## Acceptance Criteria

### Import Success Criteria ✅ **COMPLETED**
- [x] Successfully parse EPUB 2.0 and 3.x files
- [x] Extract title, author, summary, and content accurately
- [x] Preserve formatting and basic HTML structure
- [x] Handle cover images correctly
- [x] Import reading positions when present
- [x] Provide clear error messages for invalid files

### Export Success Criteria ✅ **FULLY COMPLETED**
- [x] Generate valid EPUB files compatible with major readers
- [x] Include accurate metadata and content
- [x] Embed reading positions using CFI standard
- [x] Support single story export
- [x] Support collection export with proper structure
- [x] Generate proper table of contents for collections
- [x] Include cover images when available

---

*This specification serves as the implementation guide for the EPUB import/export feature. All implementation decisions should reference this document for consistency and completeness.*
131 README.md
@@ -131,9 +131,12 @@ cd backend
 ### 🎨 **User Experience**
 - **Dark/Light Mode**: Automatic theme switching with system preference detection
 - **Responsive Design**: Optimized for desktop, tablet, and mobile
-- **Reading Mode**: Distraction-free reading interface
+- **Reading Mode**: Distraction-free reading interface with real-time progress tracking
+- **Reading Position Memory**: Character-based position tracking with smooth auto-scroll restoration
+- **Smart Tag Filtering**: Dynamic tag filters with live story counts in library view
 - **Keyboard Navigation**: Full keyboard accessibility
 - **Rich Text Editor**: Visual and source editing modes for story content
+- **Progress Indicators**: Visual reading progress bars and completion tracking

 ### 🔒 **Security & Administration**
 - **JWT Authentication**: Secure token-based authentication
@@ -158,43 +161,75 @@ cd backend

 ## 📖 Documentation

-- **[API Documentation](docs/API.md)**: Complete REST API reference with examples
-- **[Data Model](docs/DATA_MODEL.md)**: Detailed database schema and relationships
-- **[Technical Specification](storycove-spec.md)**: Comprehensive technical specification
+- **[Technical Specification](storycove-spec.md)**: Complete technical specification with API documentation, data models, and all feature specifications
+- **[Web Scraper Specification](storycove-scraper-spec.md)**: URL content grabbing functionality
 - **Environment Configuration**: Multi-environment deployment setup (see above)
 - **Development Setup**: Local development environment setup (see below)

+> **Note**: All feature specifications (Collections, Tag Enhancements, EPUB Import/Export) have been consolidated into the main technical specification for easier maintenance and reference.
+
 ## 🗄️ Data Model

 StoryCove uses a PostgreSQL database with the following core entities:

 ### **Stories**
 - **Primary Key**: UUID
-- **Fields**: title, summary, description, content_html, content_plain, source_url, word_count, rating, volume, cover_path
-- **Relationships**: Many-to-One with Author, Many-to-One with Series, Many-to-Many with Tags
-- **Features**: Automatic word count calculation, HTML sanitization, plain text extraction
+- **Fields**: title, summary, description, content_html, content_plain, source_url, word_count, rating, volume, cover_path, is_read, reading_position, last_read_at, created_at, updated_at
+- **Relationships**: Many-to-One with Author, Many-to-One with Series, Many-to-Many with Tags, One-to-Many with ReadingPositions
+- **Features**: Automatic word count calculation, HTML sanitization, plain text extraction, reading progress tracking, duplicate detection

 ### **Authors**
 - **Primary Key**: UUID
-- **Fields**: name, notes, author_rating, avatar_image_path
-- **Relationships**: One-to-Many with Stories, One-to-Many with Author URLs
-- **Features**: URL collection storage, rating system, statistics calculation
+- **Fields**: name, notes, author_rating, avatar_image_path, created_at, updated_at
+- **Relationships**: One-to-Many with Stories, One-to-Many with Author URLs (via @ElementCollection)
+- **Features**: URL collection storage, rating system, statistics calculation, average story rating calculation

+### **Collections**
+- **Primary Key**: UUID
+- **Fields**: name, description, rating, cover_image_path, is_archived, created_at, updated_at
+- **Relationships**: Many-to-Many with Tags, One-to-Many with CollectionStories
+- **Features**: Story ordering with gap-based positioning, statistics calculation, EPUB export, Typesense search
+
+### **CollectionStories** (Junction Table)
+- **Composite Key**: collection_id, story_id
+- **Fields**: position, added_at
+- **Relationships**: Links Collections to Stories with ordering
+- **Features**: Gap-based positioning for efficient reordering
+
 ### **Series**
 - **Primary Key**: UUID
-- **Fields**: name, description
+- **Fields**: name, description, created_at
 - **Relationships**: One-to-Many with Stories (ordered by volume)
-- **Features**: Volume-based story ordering, navigation methods
+- **Features**: Volume-based story ordering, navigation methods (next/previous story)

 ### **Tags**
 - **Primary Key**: UUID
-- **Fields**: name (unique)
-- **Relationships**: Many-to-Many with Stories
-- **Features**: Autocomplete support, usage statistics
+- **Fields**: name (unique), color (hex), description, created_at
+- **Relationships**: Many-to-Many with Stories, Many-to-Many with Collections, One-to-Many with TagAliases
+- **Features**: Color coding, alias system, autocomplete support, usage statistics, AI-powered suggestions

-### **Join Tables**
-- **story_tags**: Links stories to tags
-- **author_urls**: Stores multiple URLs per author
+### **TagAliases**
+- **Primary Key**: UUID
+- **Fields**: alias_name (unique), canonical_tag_id, created_from_merge, created_at
+- **Relationships**: Many-to-One with Tag (canonical)
+- **Features**: Transparent alias resolution, merge tracking, autocomplete integration
+
+### **ReadingPositions**
+- **Primary Key**: UUID
+- **Fields**: story_id, chapter_index, chapter_title, word_position, character_position, percentage_complete, epub_cfi, context_before, context_after, created_at, updated_at
+- **Relationships**: Many-to-One with Story
+- **Features**: Advanced reading position tracking, EPUB CFI support, context preservation, percentage calculation
+
+### **Libraries**
+- **Primary Key**: UUID
+- **Fields**: name, description, is_default, created_at, updated_at
+- **Features**: Multi-library support, library switching functionality
+
+### **Core Join Tables**
+- **story_tags**: Links stories to tags (Many-to-Many)
+- **collection_tags**: Links collections to tags (Many-to-Many)
+- **collection_stories**: Links collections to stories with ordering
+- **author_urls**: Stores multiple URLs per author (@ElementCollection)

 ## 🔌 REST API Reference

@@ -206,6 +241,7 @@ StoryCove uses a PostgreSQL database with the following core entities:
 ### **Stories** (`/api/stories`)
 - `GET /` - List stories (paginated)
 - `GET /{id}` - Get specific story
+- `GET /{id}/read` - Get story for reading interface
 - `POST /` - Create new story
 - `PUT /{id}` - Update story
 - `DELETE /{id}` - Delete story
@@ -214,13 +250,28 @@ StoryCove uses a PostgreSQL database with the following core entities:
 - `POST /{id}/rating` - Set story rating
 - `POST /{id}/tags/{tagId}` - Add tag to story
 - `DELETE /{id}/tags/{tagId}` - Remove tag from story
-- `GET /search` - Search stories (Typesense)
+- `POST /{id}/reading-progress` - Update reading position
+- `POST /{id}/reading-status` - Mark story as read/unread
+- `GET /{id}/collections` - Get collections containing story
+- `GET /random` - Get random story with optional filters
+- `GET /check-duplicate` - Check for duplicate stories
+- `GET /search` - Search stories (Typesense with faceting)
 - `GET /search/suggestions` - Get search suggestions
 - `GET /author/{authorId}` - Stories by author
 - `GET /series/{seriesId}` - Stories in series
 - `GET /tags/{tagName}` - Stories with tag
 - `GET /recent` - Recent stories
 - `GET /top-rated` - Top-rated stories
+- `POST /batch/add-to-collection` - Add multiple stories to collection
+- `POST /reindex` - Manual Typesense reindex
+- `POST /reindex-typesense` - Reindex stories in Typesense
+- `POST /recreate-typesense-collection` - Recreate Typesense collection
+
+#### **EPUB Import/Export** (`/api/stories/epub`)
+- `POST /import` - Import story from EPUB file
+- `POST /export` - Export story as EPUB with options
+- `GET /{id}/epub` - Export story as EPUB (simple)
+- `POST /validate` - Validate EPUB file structure

 ### **Authors** (`/api/authors`)
 - `GET /` - List authors (paginated)
@@ -240,14 +291,49 @@ StoryCove uses a PostgreSQL database with the following core entities:
 ### **Tags** (`/api/tags`)
 - `GET /` - List tags (paginated)
 - `GET /{id}` - Get specific tag
-- `POST /` - Create new tag
-- `PUT /{id}` - Update tag
+- `POST /` - Create new tag (with color and description)
+- `PUT /{id}` - Update tag (name, color, description)
 - `DELETE /{id}` - Delete tag
 - `GET /search` - Search tags
-- `GET /autocomplete` - Tag autocomplete
+- `GET /autocomplete` - Tag autocomplete with alias resolution
 - `GET /popular` - Most used tags
 - `GET /unused` - Unused tags
 - `GET /stats` - Tag statistics
+- `GET /collections` - Tags used by collections
+- `GET /resolve/{name}` - Resolve tag name (handles aliases)
+
+#### **Tag Aliases** (`/api/tags/{tagId}/aliases`)
+- `POST /` - Add alias to tag
+- `DELETE /{aliasId}` - Remove alias from tag
+
+#### **Tag Management**
+- `POST /merge` - Merge multiple tags into one
+- `POST /merge/preview` - Preview tag merge operation
+- `POST /suggest` - AI-powered tag suggestions for content
+
+### **Collections** (`/api/collections`)
+- `GET /` - Search and list collections (Typesense)
+- `GET /{id}` - Get collection details
+- `POST /` - Create new collection (JSON or multipart)
+- `PUT /{id}` - Update collection metadata
+- `DELETE /{id}` - Delete collection
+- `PUT /{id}/archive` - Archive/unarchive collection
+- `POST /{id}/cover` - Upload collection cover image
+- `DELETE /{id}/cover` - Remove collection cover image
+- `GET /{id}/stats` - Get collection statistics
+
+#### **Collection Story Management**
+- `POST /{id}/stories` - Add stories to collection
+- `DELETE /{id}/stories/{storyId}` - Remove story from collection
+- `PUT /{id}/stories/order` - Reorder stories in collection
+- `GET /{id}/read/{storyId}` - Get story with collection context
+
+#### **Collection EPUB Export**
+- `GET /{id}/epub` - Export collection as EPUB
+- `POST /{id}/epub` - Export collection as EPUB with options
+
+#### **Collection Management**
+- `POST /reindex-typesense` - Reindex collections in Typesense

 ### **Series** (`/api/series`)
 - `GET /` - List series (paginated)
@@ -295,6 +381,7 @@ All API endpoints use JSON format with proper HTTP status codes:
 - **Backend**: Spring Boot 3, Java 21, PostgreSQL, Typesense
 - **Infrastructure**: Docker, Docker Compose, Nginx
 - **Security**: JWT authentication, HTML sanitization, CORS
+- **Search**: Typesense with faceting and full-text search capabilities

 ### **Local Development Setup**

305 TAG_ENHANCEMENT_SPECIFICATION.md (new file)
@@ -0,0 +1,305 @@
# Tag Enhancement Specification

> **✅ Implementation Status: COMPLETED**
> This feature has been fully implemented and is available in the system.
> All tag enhancements including colors, aliases, merging, and AI suggestions are working.
> Last updated: January 2025

## Overview

This document outlines the comprehensive enhancement of the tagging functionality in StoryCove, including color tags, tag deletion, merging, and aliases. These features are accessible through a new "Tag Maintenance" page linked from the Settings page.

## Features

### 1. Color Tags

**Purpose**: Assign optional colors to tags for visual distinction and better organization.

**Implementation Details**:
- **Color Selection**: Predefined color palette that complements the app's theme
- **Custom Colors**: Fallback option with full color picker for advanced users
- **Default Behavior**: Tags without colors use consistent default styling
- **Accessibility**: All colors ensure sufficient contrast ratios

**UI Design**:
```
Color Selection Interface:
[Theme Blue] [Theme Green] [Theme Purple] [Theme Orange] ... [Custom ▼]
```

**Database Changes**:
```sql
ALTER TABLE tags ADD COLUMN color VARCHAR(7); -- hex colors like #3B82F6
ALTER TABLE tags ADD COLUMN description TEXT;
```

### 2. Tag Deletion

**Purpose**: Remove unused or unwanted tags from the system.

**Safety Features**:
- Show impact: "This tag is used by X stories"
- Confirmation dialog with story count
- Option to reassign stories to a different tag before deletion
- Simple workflow appropriate for a single-user application

**Behavior**:
- Display number of affected stories
- Require confirmation for deletion
- Optionally allow reassignment to another tag

### 3. Tag Merging

**Purpose**: Combine similar tags into a single canonical tag to reduce duplication.

**Workflow**:
1. User selects multiple tags to merge
2. User chooses which tag name becomes canonical
3. System shows merge preview with story counts
4. All story associations transfer to the canonical tag
5. **Automatic Aliasing**: Merged tags automatically become aliases

**Example**:
```
Merge Preview:
• "magictf" (5 stories) → "magic tf" (12 stories)
• Result: "magic tf" (17 stories)
• "magictf" will become an alias for "magic tf"
```
**Technical Implementation**:
```sql
-- Merge operation (atomic transaction)
-- Assumes story_tags has a unique (story_id, tag_id) constraint
BEGIN TRANSACTION;
-- Drop source associations that would collide with an existing target association
DELETE FROM story_tags st
 WHERE st.tag_id = source_tag_id
   AND EXISTS (SELECT 1 FROM story_tags existing
                WHERE existing.story_id = st.story_id
                  AND existing.tag_id = target_tag_id);
UPDATE story_tags SET tag_id = target_tag_id WHERE tag_id = source_tag_id;
INSERT INTO tag_aliases (alias_name, canonical_tag_id, created_from_merge)
VALUES (source_tag_name, target_tag_id, TRUE);
DELETE FROM tags WHERE id = source_tag_id;
COMMIT;
```
### 4. Tag Aliases

**Purpose**: Prevent tag duplication by allowing alternative names that resolve to canonical tags.

**Key Features**:
- **Transparent Resolution**: Users type "magictf" and automatically get "magic tf"
- **Hover Display**: Show aliases when hovering over tags
- **Import Integration**: Automatic alias resolution during story imports
- **Auto-Generation**: Created automatically during tag merges

**Database Schema**:
```sql
CREATE TABLE tag_aliases (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    alias_name VARCHAR(255) UNIQUE NOT NULL,
    canonical_tag_id UUID NOT NULL REFERENCES tags(id) ON DELETE CASCADE,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    created_from_merge BOOLEAN DEFAULT FALSE
);

CREATE INDEX idx_tag_aliases_name ON tag_aliases(alias_name);
```

**UI Behavior**:
- Tags with aliases show a subtle indicator (e.g., small "+" icon)
- Hover tooltip displays:
  ```
  magic tf
  ────────────
  Aliases: magictf, magic_tf, magic-transformation
  ```

## Tag Maintenance Page

### Access
- Reachable only through the Settings page
- Button: "Tag Maintenance" or "Manage Tags"

### Main Interface

**Tag Management Table**:
```
┌─ Search: [____________] [Color Filter ▼] [Sort: Usage ▼]
├─
├─ ☐ magic tf        🔵 (17 stories) [+2 aliases] [Edit] [Delete]
├─ ☐ transformation  🟢 (34 stories) [+1 alias]  [Edit] [Delete]
├─ ☐ sci-fi          🟣 (45 stories)             [Edit] [Delete]
└─
[Merge Selected] [Bulk Delete] [Export/Import Tags]
```

**Features**:
- Searchable and filterable tag list
- Sortable by name, usage count, creation date
- Bulk selection for merge/delete operations
- Visual indicators for color and alias count

### Tag Edit Modal

```
Edit Tag: "magic tf"
┌─ Name: [magic tf            ]
├─ Color: [🔵] [Theme Colors...] [Custom...]
├─ Description: [Optional description]
├─
├─ Aliases (2):
│  • magictf   [Remove]
│  • magic_tf  [Remove]
│  [Add Alias: ____________] [Add]
├─
├─ Used by 17 stories [View Stories]
└─ [Save] [Cancel]
```

**Functionality**:
- Edit tag name, color, and description
- Manage aliases (add/remove)
- View associated stories
- Prevent circular alias references

### Merge Interface

**Selection Process**:
1. Select multiple tags from the main table
2. Click "Merge Selected"
3. Choose canonical tag name
4. Preview merge results
5. Confirm operation

**Preview Display**:
- Show before/after story counts
- List all aliases that will be created
- Highlight any conflicts or issues

## Integration Points

### 1. Import/Scraping Enhancement

```javascript
// Tag resolution during imports
const resolveTagName = async (inputTag) => {
  const alias = await tagApi.findAlias(inputTag);
  return alias ? alias.canonicalTag : inputTag;
};
```

### 2. Tag Input Components

**Enhanced Autocomplete**:
- Include both canonical names and aliases in suggestions
- Show resolution: "magictf → magic tf" in dropdown
- Always save the canonical name to the database

### 3. Search Functionality

**Transparent Alias Search**:
- Searching for "magictf" includes stories tagged with "magic tf"
- The user doesn't need to know about the canonical/alias distinction
- Expand search queries to include all aliases (see the sketch after this list)
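
A sketch of the expansion step before the search query is built; the repository lookup method is an assumed Spring Data-style name, not an existing API:

```java
import java.util.ArrayList;
import java.util.List;

public List<String> expandSearchTerm(String term) {
    List<String> expanded = new ArrayList<>();
    expanded.add(term);
    // If the term is an alias, also search under its canonical tag name.
    tagAliasRepository.findByAliasNameIgnoreCase(term)
            .map(alias -> alias.getCanonicalTag().getName())
            .ifPresent(expanded::add);
    return expanded;
}
```
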
### 4. Display Components

**Tag Rendering**:
- Apply colors consistently across all tag displays
- Show alias indicator where appropriate
- Implement hover tooltips for alias information

## Implementation Phases

### Phase 1: Core Infrastructure
- [ ] Database schema updates (tags.color, tag_aliases table)
- [ ] Basic tag editing functionality (name, color, description)
- [ ] Color palette component with theme colors
- [ ] Tag edit modal interface

### Phase 2: Merging & Aliasing
- [ ] Tag merge functionality with automatic alias creation
- [ ] Alias resolution in import/scraping logic
- [ ] Tag input component enhancements
- [ ] Search integration with alias expansion

### Phase 3: UI Polish & Advanced Features
- [ ] Hover tooltips for alias display
- [ ] Bulk operations (merge multiple, bulk delete)
- [ ] Advanced filtering and sorting options
- [ ] Tag maintenance page integration with Settings

### Phase 4: Smart Features (Optional)
- [ ] Auto-merge suggestions for similar tag names
- [ ] Color auto-assignment based on usage patterns
- [ ] Import intelligence and learning from user decisions

## Technical Considerations

### Performance
- Index alias names for fast lookup during imports
- Optimize tag queries with proper database indexing
- Consider caching for frequently accessed tag/alias mappings

### Data Integrity
- Prevent circular alias references (see the guard sketch after this list)
- Atomic transactions for merge operations
- Cascade deletion handling for tag relationships
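
A sketch of the circular-reference guard: an alias may never shadow an existing canonical tag or another alias. The repository methods and the `TagAlias` constructor shown are assumed names for illustration only:

```java
import java.util.UUID;

public void addAlias(UUID canonicalTagId, String aliasName) {
    String normalized = aliasName.trim().toLowerCase();
    if (tagRepository.existsByNameIgnoreCase(normalized)) {
        throw new IllegalArgumentException("Alias collides with an existing canonical tag: " + aliasName);
    }
    if (tagAliasRepository.existsByAliasNameIgnoreCase(normalized)) {
        throw new IllegalArgumentException("Alias is already in use: " + aliasName);
    }
    // Aliases always point at a canonical tag, never at another alias, so no chains can form.
    tagAliasRepository.save(new TagAlias(normalized, canonicalTagId, false));
}
```
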
### User Experience
- Clear visual feedback for all operations
- Comprehensive preview before destructive actions
- Consistent color and styling across the application

### Accessibility
- Sufficient color contrast for all tag colors
- Keyboard navigation support
- Screen reader compatibility
- Don't rely solely on color for information

## API Endpoints

### New Endpoints Needed
- `GET /api/tags/{id}/aliases` - Get aliases for a tag
- `POST /api/tags/merge` - Merge multiple tags
- `POST /api/tags/{id}/aliases` - Add alias to tag
- `DELETE /api/tags/{id}/aliases/{aliasId}` - Remove alias
- `PUT /api/tags/{id}/color` - Update tag color
- `GET /api/tags/resolve/{name}` - Resolve tag name (check aliases)

### Enhanced Endpoints
- `GET /api/tags` - Include color and alias count in response
- `PUT /api/tags/{id}` - Support color and description updates
- `DELETE /api/tags/{id}` - Enhanced with story impact information

## Configuration

### Theme Color Palette
Define a curated set of colors that work well with both light and dark themes:
- Primary blues: #3B82F6, #1D4ED8, #60A5FA
- Greens: #10B981, #059669, #34D399
- Purples: #8B5CF6, #7C3AED, #A78BFA
- Warm tones: #F59E0B, #D97706, #F97316
- Neutrals: #6B7280, #4B5563, #9CA3AF

### Settings Integration
- Add "Tag Maintenance" button to Settings page
- Consider adding tag-related preferences (default colors, etc.)

## Success Criteria

1. **Color Tags**: Tags can be assigned colors that display consistently throughout the application
2. **Tag Deletion**: Users can safely delete tags with appropriate warnings and reassignment options
3. **Tag Merging**: Similar tags can be merged with automatic alias creation
4. **Alias Resolution**: Imports automatically resolve aliases to canonical tags
5. **User Experience**: All operations are intuitive with clear feedback and preview options
6. **Performance**: Tag operations remain fast even with large numbers of tags and aliases
7. **Data Integrity**: No orphaned references or circular alias chains

## Future Enhancements

- **Tag Statistics**: Usage analytics and trends
- **Tag Recommendations**: AI-powered tag suggestions during story import
- **Tag Templates**: Predefined tag sets for common story types
- **Export/Import**: Backup and restore tag configurations
- **Tag Validation**: Rules for tag naming conventions

---

*This specification serves as the definitive guide for implementing the tag enhancement features in StoryCove. All implementation should refer back to this document to ensure consistency and completeness.*
@@ -2,15 +2,15 @@ FROM openjdk:17-jdk-slim

 WORKDIR /app

-COPY pom.xml .
-COPY src ./src
-
-RUN apt-get update && apt-get install -y maven && \
-    mvn clean package -DskipTests && \
-    apt-get remove -y maven && \
-    apt-get autoremove -y && \
-    rm -rf /var/lib/apt/lists/*
+# Install Maven
+RUN apt-get update && apt-get install -y maven && rm -rf /var/lib/apt/lists/*
+
+# Copy source code
+COPY . .
+
+# Build the application
+RUN mvn clean package -DskipTests

 EXPOSE 8080

-CMD ["java", "-jar", "target/storycove-backend-0.0.1-SNAPSHOT.jar"]
+ENTRYPOINT ["java", "-jar", "target/storycove-backend-0.0.1-SNAPSHOT.jar"]
1 backend/backend.log (new file)
@@ -0,0 +1 @@
(eval):1: no such file or directory: ./mvnw
4 backend/cookies_new.txt (new file)
@@ -0,0 +1,4 @@
# Netscape HTTP Cookie File
# https://curl.se/docs/http-cookies.html
# This file was generated by libcurl! Edit at your own risk.
@@ -5,7 +5,7 @@
 <parent>
     <groupId>org.springframework.boot</groupId>
     <artifactId>spring-boot-starter-parent</artifactId>
-    <version>3.2.0</version>
+    <version>3.5.5</version>
     <relativePath/>
 </parent>

@@ -17,7 +17,7 @@

 <properties>
     <java.version>17</java.version>
-    <testcontainers.version>1.19.3</testcontainers.version>
+    <testcontainers.version>1.21.3</testcontainers.version>
 </properties>

 <dependencyManagement>
@@ -56,18 +56,18 @@
 <dependency>
     <groupId>io.jsonwebtoken</groupId>
     <artifactId>jjwt-api</artifactId>
-    <version>0.12.3</version>
+    <version>0.13.0</version>
 </dependency>
 <dependency>
     <groupId>io.jsonwebtoken</groupId>
     <artifactId>jjwt-impl</artifactId>
-    <version>0.12.3</version>
+    <version>0.13.0</version>
     <scope>runtime</scope>
 </dependency>
 <dependency>
     <groupId>io.jsonwebtoken</groupId>
     <artifactId>jjwt-jackson</artifactId>
-    <version>0.12.3</version>
+    <version>0.13.0</version>
     <scope>runtime</scope>
 </dependency>
 <dependency>
@@ -84,6 +84,11 @@
     <artifactId>typesense-java</artifactId>
     <version>1.3.0</version>
 </dependency>
+<dependency>
+    <groupId>com.positiondev.epublib</groupId>
+    <artifactId>epublib-core</artifactId>
+    <version>3.1</version>
+</dependency>

 <!-- Test dependencies -->
 <dependency>
@@ -0,0 +1,64 @@
package com.storycove.config;

import com.storycove.service.LibraryService;
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.DependsOn;
import org.springframework.context.annotation.Primary;

import javax.sql.DataSource;

/**
 * Database configuration that sets up library-aware datasource routing.
 *
 * This configuration replaces the default Spring Boot datasource with a routing
 * datasource that automatically directs all database operations to the appropriate
 * library-specific database based on the current active library.
 */
@Configuration
public class DatabaseConfig {

    @Value("${spring.datasource.url}")
    private String baseDbUrl;

    @Value("${spring.datasource.username}")
    private String dbUsername;

    @Value("${spring.datasource.password}")
    private String dbPassword;

    /**
     * Create a fallback datasource for when no library is active.
     * This connects to the main database specified in application.yml.
     */
    @Bean(name = "fallbackDataSource")
    public DataSource fallbackDataSource() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl(baseDbUrl);
        config.setUsername(dbUsername);
        config.setPassword(dbPassword);
        config.setDriverClassName("org.postgresql.Driver");
        config.setMaximumPoolSize(10);
        config.setConnectionTimeout(30000);

        return new HikariDataSource(config);
    }

    /**
     * Primary datasource bean - uses smart routing that excludes authentication operations
     */
    @Bean(name = "dataSource")
    @Primary
    @DependsOn("libraryService")
    public DataSource primaryDataSource(LibraryService libraryService) {
        SmartRoutingDataSource routingDataSource = new SmartRoutingDataSource(
                libraryService, baseDbUrl, dbUsername, dbPassword);
        routingDataSource.setDefaultTargetDataSource(fallbackDataSource());
        routingDataSource.setTargetDataSources(new java.util.HashMap<>());
        return routingDataSource;
    }

}
@@ -0,0 +1,65 @@
package com.storycove.config;

import com.storycove.service.LibraryService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;

/**
 * Custom DataSource router that dynamically routes database calls to the appropriate
 * library-specific datasource based on the current active library.
 *
 * This makes ALL Spring Data JPA repositories automatically library-aware without
 * requiring changes to existing repository or service code.
 */
public class LibraryAwareDataSource extends AbstractRoutingDataSource {

    private static final Logger logger = LoggerFactory.getLogger(LibraryAwareDataSource.class);

    private final LibraryService libraryService;

    public LibraryAwareDataSource(LibraryService libraryService) {
        this.libraryService = libraryService;
        // Set empty target datasources to satisfy AbstractRoutingDataSource requirements
        // We override determineTargetDataSource() so this won't be used
        setTargetDataSources(new java.util.HashMap<>());
    }

    @Override
    protected Object determineCurrentLookupKey() {
        String currentLibraryId = libraryService.getCurrentLibraryId();
        logger.debug("Routing database call to library: {}", currentLibraryId);
        return currentLibraryId;
    }

    @Override
    protected javax.sql.DataSource determineTargetDataSource() {
        try {
            // Check if LibraryService is properly initialized
            if (libraryService == null) {
                logger.debug("LibraryService not available, using default datasource");
                return getResolvedDefaultDataSource();
            }

            // Check if any library is currently active
            String currentLibraryId = libraryService.getCurrentLibraryId();
            if (currentLibraryId == null) {
                logger.debug("No active library, using default datasource");
                return getResolvedDefaultDataSource();
            }

            // Try to get the current library datasource
            javax.sql.DataSource libraryDataSource = libraryService.getCurrentDataSource();
            logger.debug("Successfully routing database call to library: {}", currentLibraryId);
            return libraryDataSource;

        } catch (IllegalStateException e) {
            // This is expected during authentication, startup, or when no library is active
            logger.debug("No active library (IllegalStateException) - using default datasource: {}", e.getMessage());
            return getResolvedDefaultDataSource();
        } catch (Exception e) {
            logger.warn("Unexpected error determining target datasource, falling back to default: {}", e.getMessage(), e);
            return getResolvedDefaultDataSource();
        }
    }
}
@@ -56,7 +56,10 @@ public class SecurityConfig {
    @Bean
    public CorsConfigurationSource corsConfigurationSource() {
        CorsConfiguration configuration = new CorsConfiguration();
-        configuration.setAllowedOriginPatterns(Arrays.asList(allowedOrigins.split(",")));
+        List<String> origins = Arrays.stream(allowedOrigins.split(","))
+                .map(String::trim)
+                .toList();
+        configuration.setAllowedOriginPatterns(origins);
        configuration.setAllowedMethods(Arrays.asList("GET", "POST", "PUT", "PATCH", "DELETE", "OPTIONS"));
        configuration.setAllowedHeaders(List.of("*"));
        configuration.setAllowCredentials(true);
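
A quick illustration of why the trimming step above matters (the origin values are placeholders): splitting a comma-and-space separated property without trimming leaves a leading space in every entry after the first, so those patterns never match a real Origin header.

```java
import java.util.Arrays;
import java.util.List;

public class OriginSplitDemo {
    public static void main(String[] args) {
        String allowedOrigins = "http://localhost:3000, https://example.com"; // placeholder value

        // Without trimming: the second entry is " https://example.com" with a leading space.
        List<String> raw = Arrays.asList(allowedOrigins.split(","));

        // With trimming, as in the new code: clean entries that match incoming Origin headers.
        List<String> trimmed = Arrays.stream(allowedOrigins.split(",")).map(String::trim).toList();

        System.out.println(raw);
        System.out.println(trimmed);
    }
}
```
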
@@ -0,0 +1,158 @@
+package com.storycove.config;
+
+import com.storycove.service.LibraryService;
+import com.zaxxer.hikari.HikariConfig;
+import com.zaxxer.hikari.HikariDataSource;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import org.springframework.beans.factory.annotation.Value;
+import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;
+import org.springframework.web.context.request.RequestContextHolder;
+import org.springframework.web.context.request.ServletRequestAttributes;
+
+import javax.sql.DataSource;
+import java.util.Map;
+import java.util.concurrent.ConcurrentHashMap;
+
+/**
+ * Smart routing datasource that:
+ * 1. Routes to library-specific databases when a library is active
+ * 2. Excludes authentication operations (keeps them on default database)
+ * 3. Uses request context to determine when routing is appropriate
+ */
+public class SmartRoutingDataSource extends AbstractRoutingDataSource {
+
+    private static final Logger logger = LoggerFactory.getLogger(SmartRoutingDataSource.class);
+
+    private final LibraryService libraryService;
+    private final Map<String, DataSource> libraryDataSources = new ConcurrentHashMap<>();
+
+    // Database connection details - will be injected via constructor
+    private final String baseDbUrl;
+    private final String dbUsername;
+    private final String dbPassword;
+
+    public SmartRoutingDataSource(LibraryService libraryService, String baseDbUrl, String dbUsername, String dbPassword) {
+        this.libraryService = libraryService;
+        this.baseDbUrl = baseDbUrl;
+        this.dbUsername = dbUsername;
+        this.dbPassword = dbPassword;
+
+        logger.info("SmartRoutingDataSource initialized with database: {}", baseDbUrl);
+    }
+
+    @Override
+    protected Object determineCurrentLookupKey() {
+        try {
+            // Check if this is an authentication request - if so, use default database
+            if (isAuthenticationRequest()) {
+                logger.debug("Authentication request detected, using default database");
+                return null; // null means use default datasource
+            }
+
+            // Check if we have an active library
+            if (libraryService != null) {
+                String currentLibraryId = libraryService.getCurrentLibraryId();
+                if (currentLibraryId != null && !currentLibraryId.trim().isEmpty()) {
+                    logger.info("ROUTING: Directing to library-specific database: {}", currentLibraryId);
+                    return currentLibraryId;
+                } else {
+                    logger.info("ROUTING: No active library, using default database");
+                }
+            } else {
+                logger.info("ROUTING: LibraryService is null, using default database");
+            }
+
+        } catch (Exception e) {
+            logger.debug("Error determining lookup key, falling back to default database", e);
+        }
+
+        return null; // Use default datasource
+    }
+
+    /**
+     * Check if the current request is an authentication request that should use the default database
+     */
+    private boolean isAuthenticationRequest() {
+        try {
+            ServletRequestAttributes attributes = (ServletRequestAttributes) RequestContextHolder.getRequestAttributes();
+            if (attributes != null) {
+                String requestURI = attributes.getRequest().getRequestURI();
+                String method = attributes.getRequest().getMethod();
+
+                // Authentication endpoints that should use default database
+                if (requestURI.contains("/auth/") ||
+                    requestURI.contains("/login") ||
+                    requestURI.contains("/api/libraries/switch") ||
+                    (requestURI.contains("/api/libraries") && "POST".equals(method))) {
+                    return true;
+                }
+            }
+        } catch (Exception e) {
+            logger.debug("Could not determine request context", e);
+        }
+
+        return false;
+    }
+
+    @Override
+    protected DataSource determineTargetDataSource() {
+        Object lookupKey = determineCurrentLookupKey();
+
+        if (lookupKey != null) {
+            String libraryId = (String) lookupKey;
+            return getLibraryDataSource(libraryId);
+        }
+
+        return getDefaultDataSource();
+    }
+
+    /**
+     * Get or create a datasource for the specified library
+     */
+    private DataSource getLibraryDataSource(String libraryId) {
+        return libraryDataSources.computeIfAbsent(libraryId, id -> {
+            try {
+                HikariConfig config = new HikariConfig();
+
+                // Replace database name in URL with library-specific name
+                String libraryUrl = baseDbUrl.replaceAll("/[^/]*$", "/" + "storycove_" + id);
+
+                config.setJdbcUrl(libraryUrl);
+                config.setUsername(dbUsername);
+                config.setPassword(dbPassword);
+                config.setDriverClassName("org.postgresql.Driver");
+                config.setMaximumPoolSize(5); // Smaller pool for library-specific databases
+                config.setConnectionTimeout(10000);
+                config.setMaxLifetime(600000); // 10 minutes
+
+                logger.info("Created new datasource for library: {} -> {}", id, libraryUrl);
+                return new HikariDataSource(config);
+
+            } catch (Exception e) {
+                logger.error("Failed to create datasource for library: {}", id, e);
+                return getDefaultDataSource();
+            }
+        });
+    }
+
+    private DataSource getDefaultDataSource() {
+        // Use the default target datasource that was set in the configuration
+        try {
+            return (DataSource) super.determineTargetDataSource();
+        } catch (Exception e) {
+            logger.debug("Could not get default datasource via super method", e);
+        }
+
+        // Fallback: create a basic datasource
+        logger.warn("No default datasource available, creating fallback");
+        HikariConfig config = new HikariConfig();
+        config.setJdbcUrl(baseDbUrl);
+        config.setUsername(dbUsername);
+        config.setPassword(dbPassword);
+        config.setDriverClassName("org.postgresql.Driver");
+        config.setMaximumPoolSize(10);
+        config.setConnectionTimeout(30000);
+        return new HikariDataSource(config);
+    }
+}
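
For context on the URL rewriting in `getLibraryDataSource`: the regex replaces everything after the last `/` of the base JDBC URL with a `storycove_<libraryId>` database name. A small check with placeholder values (the base URL and library id below are examples, not taken from the repository's configuration):

```java
public class LibraryUrlDemo {
    public static void main(String[] args) {
        String baseDbUrl = "jdbc:postgresql://db:5432/storycove"; // placeholder base URL
        String id = "fiction";                                    // placeholder library id

        // Same expression as in SmartRoutingDataSource.getLibraryDataSource(...)
        String libraryUrl = baseDbUrl.replaceAll("/[^/]*$", "/" + "storycove_" + id);

        System.out.println(libraryUrl); // jdbc:postgresql://db:5432/storycove_fiction
    }
}
```
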
@@ -1,5 +1,6 @@
package com.storycove.controller;

+import com.storycove.service.LibraryService;
import com.storycove.service.PasswordAuthenticationService;
import com.storycove.util.JwtUtil;
import jakarta.servlet.http.HttpServletResponse;
@@ -18,18 +19,21 @@ import java.time.Duration;
public class AuthController {

    private final PasswordAuthenticationService passwordService;
+    private final LibraryService libraryService;
    private final JwtUtil jwtUtil;

-    public AuthController(PasswordAuthenticationService passwordService, JwtUtil jwtUtil) {
+    public AuthController(PasswordAuthenticationService passwordService, LibraryService libraryService, JwtUtil jwtUtil) {
        this.passwordService = passwordService;
+        this.libraryService = libraryService;
        this.jwtUtil = jwtUtil;
    }

    @PostMapping("/login")
    public ResponseEntity<?> login(@Valid @RequestBody LoginRequest request, HttpServletResponse response) {
-        if (passwordService.authenticate(request.getPassword())) {
-            String token = jwtUtil.generateToken();
+        // Use new library-aware authentication
+        String token = passwordService.authenticateAndSwitchLibrary(request.getPassword());

+        if (token != null) {
            // Set httpOnly cookie
            ResponseCookie cookie = ResponseCookie.from("token", token)
                .httpOnly(true)
@@ -40,7 +44,8 @@ public class AuthController {

            response.addHeader(HttpHeaders.SET_COOKIE, cookie.toString());

-            return ResponseEntity.ok(new LoginResponse("Authentication successful", token));
+            String libraryInfo = passwordService.getCurrentLibraryInfo();
+            return ResponseEntity.ok(new LoginResponse("Authentication successful - " + libraryInfo, token));
        } else {
            return ResponseEntity.status(401).body(new ErrorResponse("Invalid password"));
        }
@@ -48,6 +53,9 @@ public class AuthController {

    @PostMapping("/logout")
    public ResponseEntity<?> logout(HttpServletResponse response) {
+        // Clear authentication state
+        libraryService.clearAuthentication();
+
        // Clear the cookie
        ResponseCookie cookie = ResponseCookie.from("token", "")
            .httpOnly(true)
@@ -65,10 +65,12 @@ public class AuthorController {

    @PostMapping
    public ResponseEntity<AuthorDto> createAuthor(@Valid @RequestBody CreateAuthorRequest request) {
+        logger.info("Creating new author: {}", request.getName());
        Author author = new Author();
        updateAuthorFromRequest(author, request);

        Author savedAuthor = authorService.create(author);
+        logger.info("Successfully created author: {} (ID: {})", savedAuthor.getName(), savedAuthor.getId());
        return ResponseEntity.status(HttpStatus.CREATED).body(convertToDto(savedAuthor));
    }

@@ -81,13 +83,7 @@ public class AuthorController {
            @RequestParam(required = false, name = "authorRating") Integer rating,
            @RequestParam(required = false, name = "avatar") MultipartFile avatarFile) {

-        System.out.println("DEBUG: MULTIPART PUT called with:");
-        System.out.println(" - name: " + name);
-        System.out.println(" - notes: " + notes);
-        System.out.println(" - urls: " + urls);
-        System.out.println(" - rating: " + rating);
-        System.out.println(" - avatar: " + (avatarFile != null ? avatarFile.getOriginalFilename() : "null"));
-
+        logger.info("Updating author with multipart data (ID: {})", id);
        try {
            Author existingAuthor = authorService.findById(id);

@@ -104,7 +100,6 @@ public class AuthorController {

            // Handle rating update
            if (rating != null) {
-                System.out.println("DEBUG: Setting author rating via PUT: " + rating);
                existingAuthor.setAuthorRating(rating);
            }

@@ -115,6 +110,7 @@ public class AuthorController {
            }

            Author updatedAuthor = authorService.update(id, existingAuthor);
+            logger.info("Successfully updated author: {} via multipart", updatedAuthor.getName());
            return ResponseEntity.ok(convertToDto(updatedAuthor));

        } catch (Exception e) {
@@ -125,31 +121,27 @@ public class AuthorController {
    @PutMapping(value = "/{id}", consumes = "application/json")
    public ResponseEntity<AuthorDto> updateAuthorJson(@PathVariable UUID id,
                                                      @Valid @RequestBody UpdateAuthorRequest request) {
-        System.out.println("DEBUG: JSON PUT called with:");
-        System.out.println(" - name: " + request.getName());
-        System.out.println(" - notes: " + request.getNotes());
-        System.out.println(" - urls: " + request.getUrls());
-        System.out.println(" - rating: " + request.getRating());
+        logger.info("Updating author with JSON data: {} (ID: {})", request.getName(), id);

        Author existingAuthor = authorService.findById(id);
        updateAuthorFromRequest(existingAuthor, request);

        Author updatedAuthor = authorService.update(id, existingAuthor);
+        logger.info("Successfully updated author: {} via JSON", updatedAuthor.getName());
        return ResponseEntity.ok(convertToDto(updatedAuthor));
    }

    @PutMapping("/{id}")
    public ResponseEntity<String> updateAuthorGeneric(@PathVariable UUID id, HttpServletRequest request) {
-        System.out.println("DEBUG: GENERIC PUT called!");
-        System.out.println(" - Content-Type: " + request.getContentType());
-        System.out.println(" - Method: " + request.getMethod());

        return ResponseEntity.status(415).body("Unsupported Media Type. Expected multipart/form-data or application/json");
    }

    @DeleteMapping("/{id}")
    public ResponseEntity<?> deleteAuthor(@PathVariable UUID id) {
+        logger.info("Deleting author with ID: {}", id);
        authorService.delete(id);
+        logger.info("Successfully deleted author with ID: {}", id);
        return ResponseEntity.ok(Map.of("message", "Author deleted successfully"));
    }

@@ -177,11 +169,8 @@ public class AuthorController {

    @PostMapping("/{id}/rating")
    public ResponseEntity<AuthorDto> rateAuthor(@PathVariable UUID id, @RequestBody RatingRequest request) {
-        System.out.println("DEBUG: Rating author " + id + " with rating " + request.getRating());
        Author author = authorService.setRating(id, request.getRating());
-        System.out.println("DEBUG: After setRating, author rating is: " + author.getAuthorRating());
        AuthorDto dto = convertToDto(author);
-        System.out.println("DEBUG: Final DTO rating is: " + dto.getAuthorRating());
        return ResponseEntity.ok(dto);
    }

@@ -211,9 +200,7 @@ public class AuthorController {
    @PostMapping("/{id}/test-rating/{rating}")
    public ResponseEntity<Map<String, Object>> testSetRating(@PathVariable UUID id, @PathVariable Integer rating) {
        try {
-            System.out.println("DEBUG: Test setting rating " + rating + " for author " + id);
            Author author = authorService.setRating(id, rating);
-            System.out.println("DEBUG: After test setRating, got: " + author.getAuthorRating());

            return ResponseEntity.ok(Map.of(
                "success", true,
@@ -231,13 +218,11 @@ public class AuthorController {
    @PostMapping("/{id}/test-put-rating")
    public ResponseEntity<Map<String, Object>> testPutWithRating(@PathVariable UUID id, @RequestParam Integer rating) {
        try {
-            System.out.println("DEBUG: Test PUT with rating " + rating + " for author " + id);

            Author existingAuthor = authorService.findById(id);
            existingAuthor.setAuthorRating(rating);
            Author updatedAuthor = authorService.update(id, existingAuthor);

-            System.out.println("DEBUG: After PUT update, rating is: " + updatedAuthor.getAuthorRating());

            return ResponseEntity.ok(Map.of(
                "success", true,
@@ -350,6 +335,44 @@ public class AuthorController {
        }
    }

+    @PostMapping("/clean-author-names")
+    public ResponseEntity<Map<String, Object>> cleanAuthorNames() {
+        try {
+            List<Author> allAuthors = authorService.findAllWithStories();
+            int cleanedCount = 0;
+
+            for (Author author : allAuthors) {
+                String originalName = author.getName();
+                String cleanedName = originalName != null ? originalName.trim() : "";
+
+                if (!cleanedName.equals(originalName)) {
+                    logger.info("Cleaning author name: '{}' -> '{}'", originalName, cleanedName);
+                    author.setName(cleanedName);
+                    authorService.update(author.getId(), author);
+                    cleanedCount++;
+                }
+            }
+
+            // Reindex all authors after cleaning
+            if (cleanedCount > 0) {
+                typesenseService.reindexAllAuthors(allAuthors);
+            }
+
+            return ResponseEntity.ok(Map.of(
+                "success", true,
+                "message", "Cleaned " + cleanedCount + " author names and reindexed",
+                "cleanedCount", cleanedCount,
+                "totalAuthors", allAuthors.size()
+            ));
+        } catch (Exception e) {
+            logger.error("Failed to clean author names", e);
+            return ResponseEntity.ok(Map.of(
+                "success", false,
+                "error", e.getMessage()
+            ));
+        }
+    }
+
    @GetMapping("/top-rated")
    public ResponseEntity<List<AuthorSummaryDto>> getTopRatedAuthors(@RequestParam(defaultValue = "10") int limit) {
        Pageable pageable = PageRequest.of(0, limit);
@@ -389,7 +412,6 @@ public class AuthorController {
            author.setUrls(updateReq.getUrls());
        }
        if (updateReq.getRating() != null) {
-            System.out.println("DEBUG: Setting author rating via JSON: " + updateReq.getRating());
            author.setAuthorRating(updateReq.getRating());
        }
    }
@@ -402,9 +424,6 @@ public class AuthorController {
        dto.setNotes(author.getNotes());
        dto.setAvatarImagePath(author.getAvatarImagePath());

-        // Debug logging for author rating
-        System.out.println("DEBUG: Converting author " + author.getName() +
-                          " with rating: " + author.getAuthorRating());

        dto.setAuthorRating(author.getAuthorRating());
        dto.setUrls(author.getUrls());
@@ -415,7 +434,6 @@ public class AuthorController {
        // Calculate and set average story rating
        dto.setAverageStoryRating(authorService.calculateAverageStoryRating(author.getId()));

-        System.out.println("DEBUG: DTO authorRating set to: " + dto.getAuthorRating());

        return dto;
    }
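
The hunks above replace `System.out.println` debugging with SLF4J parameterized logging through a `logger` field that is declared elsewhere in AuthorController (outside these hunks). A minimal sketch of the assumed pattern, using a placeholder class name rather than the controller itself:

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Minimal sketch of the assumed logger wiring used throughout the controllers in this change set.
public class AuthorControllerLoggingSketch {

    private static final Logger logger = LoggerFactory.getLogger(AuthorControllerLoggingSketch.class);

    public static void main(String[] args) {
        // Parameterized logging defers string concatenation until the log level is enabled.
        logger.info("Updating author {} with rating {}", "Jane Doe", 5);
    }
}
```
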
@@ -6,7 +6,10 @@ import com.storycove.entity.CollectionStory;
import com.storycove.entity.Story;
import com.storycove.entity.Tag;
import com.storycove.service.CollectionService;
+import com.storycove.service.EPUBExportService;
import com.storycove.service.ImageService;
+import com.storycove.service.ReadingTimeService;
+import com.storycove.service.TypesenseService;
import jakarta.validation.Valid;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -28,12 +31,21 @@ public class CollectionController {

    private final CollectionService collectionService;
    private final ImageService imageService;
+    private final TypesenseService typesenseService;
+    private final ReadingTimeService readingTimeService;
+    private final EPUBExportService epubExportService;

    @Autowired
    public CollectionController(CollectionService collectionService,
-                                ImageService imageService) {
+                                ImageService imageService,
+                                @Autowired(required = false) TypesenseService typesenseService,
+                                ReadingTimeService readingTimeService,
+                                EPUBExportService epubExportService) {
        this.collectionService = collectionService;
        this.imageService = imageService;
+        this.typesenseService = typesenseService;
+        this.readingTimeService = readingTimeService;
+        this.epubExportService = epubExportService;
    }

    /**
@@ -48,8 +60,6 @@ public class CollectionController {
            @RequestParam(required = false) List<String> tags,
            @RequestParam(defaultValue = "false") boolean archived) {

-        logger.info("COLLECTIONS: Search request - search='{}', tags={}, archived={}, page={}, limit={}",
-                   search, tags, archived, page, limit);

        // MANDATORY: Use Typesense for all search/filter operations
        SearchResultDto<Collection> results = collectionService.searchCollections(search, tags, archived, page, limit);
@@ -86,13 +96,14 @@ public class CollectionController {
     */
    @PostMapping
    public ResponseEntity<Collection> createCollection(@Valid @RequestBody CreateCollectionRequest request) {
+        logger.info("Creating new collection: {}", request.getName());
        Collection collection = collectionService.createCollection(
            request.getName(),
            request.getDescription(),
            request.getTagNames(),
            request.getStoryIds()
        );
+        logger.info("Successfully created collection: {} (ID: {})", collection.getName(), collection.getId());
        return ResponseEntity.status(HttpStatus.CREATED).body(collection);
    }

@@ -107,6 +118,7 @@ public class CollectionController {
            @RequestParam(required = false) List<UUID> storyIds,
            @RequestParam(required = false, name = "coverImage") MultipartFile coverImage) {

+        logger.info("Creating new collection with image: {}", name);
        try {
            // Create collection first
            Collection collection = collectionService.createCollection(name, description, tags, storyIds);
@@ -120,6 +132,7 @@ public class CollectionController {
                );
            }

+            logger.info("Successfully created collection with image: {} (ID: {})", collection.getName(), collection.getId());
            return ResponseEntity.status(HttpStatus.CREATED).body(collection);

        } catch (Exception e) {
@@ -152,7 +165,9 @@ public class CollectionController {
     */
    @DeleteMapping("/{id}")
    public ResponseEntity<Map<String, String>> deleteCollection(@PathVariable UUID id) {
+        logger.info("Deleting collection with ID: {}", id);
        collectionService.deleteCollection(id);
+        logger.info("Successfully deleted collection with ID: {}", id);
        return ResponseEntity.ok(Map.of("message", "Collection deleted successfully"));
    }

@@ -270,6 +285,114 @@ public class CollectionController {
        return ResponseEntity.ok(Map.of("message", "Cover removed successfully"));
    }

+    /**
+     * POST /api/collections/reindex-typesense - Reindex all collections in Typesense
+     */
+    @PostMapping("/reindex-typesense")
+    public ResponseEntity<Map<String, Object>> reindexCollectionsTypesense() {
+        try {
+            List<Collection> allCollections = collectionService.findAllWithTags();
+            if (typesenseService != null) {
+                typesenseService.reindexAllCollections(allCollections);
+                return ResponseEntity.ok(Map.of(
+                    "success", true,
+                    "message", "Successfully reindexed all collections",
+                    "count", allCollections.size()
+                ));
+            } else {
+                return ResponseEntity.ok(Map.of(
+                    "success", false,
+                    "message", "Typesense service not available"
+                ));
+            }
+        } catch (Exception e) {
+            logger.error("Failed to reindex collections", e);
+            return ResponseEntity.badRequest().body(Map.of(
+                "success", false,
+                "error", e.getMessage()
+            ));
+        }
+    }
+
+    /**
+     * GET /api/collections/{id}/epub - Export collection as EPUB
+     */
+    @GetMapping("/{id}/epub")
+    public ResponseEntity<org.springframework.core.io.Resource> exportCollectionAsEPUB(@PathVariable UUID id) {
+        logger.info("Exporting collection {} to EPUB", id);
+
+        try {
+            Collection collection = collectionService.findById(id);
+            List<Story> stories = collection.getCollectionStories().stream()
+                .sorted((cs1, cs2) -> Integer.compare(cs1.getPosition(), cs2.getPosition()))
+                .map(cs -> cs.getStory())
+                .collect(java.util.stream.Collectors.toList());
+
+            if (stories.isEmpty()) {
+                logger.warn("Collection {} contains no stories for export", id);
+                return ResponseEntity.badRequest()
+                    .body(null);
+            }
+
+            EPUBExportRequest request = new EPUBExportRequest();
+            request.setIncludeCoverImage(true);
+            request.setIncludeMetadata(true);
+            request.setIncludeReadingPosition(false); // Collections don't have reading positions
+
+            org.springframework.core.io.Resource resource = epubExportService.exportCollectionAsEPUB(id, request);
+            String filename = epubExportService.getCollectionEPUBFilename(collection);
+
+            logger.info("Successfully exported collection EPUB: {}", filename);
+
+            return ResponseEntity.ok()
+                .header("Content-Disposition", "attachment; filename=\"" + filename + "\"")
+                .header("Content-Type", "application/epub+zip")
+                .body(resource);
+
+        } catch (Exception e) {
+            logger.error("Error exporting collection EPUB: {}", e.getMessage(), e);
+            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).build();
+        }
+    }
+
+    /**
+     * POST /api/collections/{id}/epub - Export collection as EPUB with custom options
+     */
+    @PostMapping("/{id}/epub")
+    public ResponseEntity<org.springframework.core.io.Resource> exportCollectionAsEPUBWithOptions(
+            @PathVariable UUID id,
+            @Valid @RequestBody EPUBExportRequest request) {
+        logger.info("Exporting collection {} to EPUB with custom options", id);
+
+        try {
+            Collection collection = collectionService.findById(id);
+            List<Story> stories = collection.getCollectionStories().stream()
+                .sorted((cs1, cs2) -> Integer.compare(cs1.getPosition(), cs2.getPosition()))
+                .map(cs -> cs.getStory())
+                .collect(java.util.stream.Collectors.toList());
+
+            if (stories.isEmpty()) {
+                logger.warn("Collection {} contains no stories for export", id);
+                return ResponseEntity.badRequest()
+                    .body(null);
+            }
+
+            org.springframework.core.io.Resource resource = epubExportService.exportCollectionAsEPUB(id, request);
+            String filename = epubExportService.getCollectionEPUBFilename(collection);
+
+            logger.info("Successfully exported collection EPUB with options: {}", filename);
+
+            return ResponseEntity.ok()
+                .header("Content-Disposition", "attachment; filename=\"" + filename + "\"")
+                .header("Content-Type", "application/epub+zip")
+                .body(resource);
+
+        } catch (Exception e) {
+            logger.error("Error exporting collection EPUB: {}", e.getMessage(), e);
+            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).build();
+        }
+    }
+
    // Mapper methods

    private CollectionDto mapToCollectionDto(Collection collection) {
@@ -290,6 +413,11 @@ public class CollectionController {
                .toList());
        }

+        // Map tag names for search results
+        if (collection.getTagNames() != null) {
+            dto.setTagNames(collection.getTagNames());
+        }
+
        // Map collection stories (lightweight)
        if (collection.getCollectionStories() != null) {
            dto.setCollectionStories(collection.getCollectionStories().stream()
@@ -300,7 +428,7 @@ public class CollectionController {
        // Set calculated properties
        dto.setStoryCount(collection.getStoryCount());
        dto.setTotalWordCount(collection.getTotalWordCount());
-        dto.setEstimatedReadingTime(collection.getEstimatedReadingTime());
+        dto.setEstimatedReadingTime(readingTimeService.calculateReadingTime(collection.getTotalWordCount()));
        dto.setAverageStoryRating(collection.getAverageStoryRating());

        return dto;
@@ -0,0 +1,54 @@
+package com.storycove.controller;
+
+import com.storycove.dto.HtmlSanitizationConfigDto;
+import com.storycove.service.HtmlSanitizationService;
+import org.springframework.beans.factory.annotation.Autowired;
+import org.springframework.beans.factory.annotation.Value;
+import org.springframework.http.ResponseEntity;
+import org.springframework.web.bind.annotation.*;
+
+import java.util.Map;
+
+@RestController
+@RequestMapping("/api/config")
+public class ConfigController {
+
+    private final HtmlSanitizationService htmlSanitizationService;
+
+    @Value("${app.reading.speed.default:200}")
+    private int defaultReadingSpeed;
+
+    @Autowired
+    public ConfigController(HtmlSanitizationService htmlSanitizationService) {
+        this.htmlSanitizationService = htmlSanitizationService;
+    }
+
+    /**
+     * Get the HTML sanitization configuration for frontend use
+     * This allows the frontend to use the same sanitization rules as the backend
+     */
+    @GetMapping("/html-sanitization")
+    public ResponseEntity<HtmlSanitizationConfigDto> getHtmlSanitizationConfig() {
+        HtmlSanitizationConfigDto config = htmlSanitizationService.getConfiguration();
+        return ResponseEntity.ok(config);
+    }
+
+    /**
+     * Get application settings configuration
+     */
+    @GetMapping("/settings")
+    public ResponseEntity<Map<String, Object>> getSettings() {
+        Map<String, Object> settings = Map.of(
+            "defaultReadingSpeed", defaultReadingSpeed
+        );
+        return ResponseEntity.ok(settings);
+    }
+
+    /**
+     * Get reading speed for calculation purposes
+     */
+    @GetMapping("/reading-speed")
+    public ResponseEntity<Map<String, Integer>> getReadingSpeed() {
+        return ResponseEntity.ok(Map.of("wordsPerMinute", defaultReadingSpeed));
+    }
+}
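
For reference, the new `/api/config/reading-speed` endpoint returns a small JSON map. A client-side sketch using the JDK HTTP client; the host, port, and default value are placeholders, and the authentication cookie required by the rest of the API is omitted for brevity:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ReadingSpeedClientSketch {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(
                URI.create("http://localhost:8080/api/config/reading-speed")) // placeholder base URL
            .GET()
            .build();

        // Expected shape, based on the controller above: {"wordsPerMinute":200}
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```
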
@@ -0,0 +1,154 @@
+package com.storycove.controller;
+
+import com.storycove.service.DatabaseManagementService;
+import org.springframework.beans.factory.annotation.Autowired;
+import org.springframework.core.io.Resource;
+import org.springframework.http.HttpHeaders;
+import org.springframework.http.MediaType;
+import org.springframework.http.ResponseEntity;
+import org.springframework.web.bind.annotation.*;
+import org.springframework.web.multipart.MultipartFile;
+
+import java.io.IOException;
+import java.time.LocalDateTime;
+import java.time.format.DateTimeFormatter;
+import java.util.Map;
+
+@RestController
+@RequestMapping("/api/database")
+public class DatabaseController {
+
+    @Autowired
+    private DatabaseManagementService databaseManagementService;
+
+    @PostMapping("/backup")
+    public ResponseEntity<Resource> backupDatabase() {
+        try {
+            Resource backup = databaseManagementService.createBackup();
+
+            String timestamp = LocalDateTime.now().format(DateTimeFormatter.ofPattern("yyyy-MM-dd_HH-mm-ss"));
+            String filename = "storycove_backup_" + timestamp + ".sql";
+
+            return ResponseEntity.ok()
+                .header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"" + filename + "\"")
+                .contentType(MediaType.APPLICATION_OCTET_STREAM)
+                .body(backup);
+        } catch (Exception e) {
+            throw new RuntimeException("Failed to create database backup: " + e.getMessage(), e);
+        }
+    }
+
+    @PostMapping("/restore")
+    public ResponseEntity<Map<String, Object>> restoreDatabase(@RequestParam("file") MultipartFile file) {
+        try {
+            if (file.isEmpty()) {
+                return ResponseEntity.badRequest()
+                    .body(Map.of("success", false, "message", "No file uploaded"));
+            }
+
+            if (!file.getOriginalFilename().endsWith(".sql")) {
+                return ResponseEntity.badRequest()
+                    .body(Map.of("success", false, "message", "Invalid file type. Please upload a .sql file"));
+            }
+
+            databaseManagementService.restoreFromBackup(file.getInputStream());
+
+            return ResponseEntity.ok(Map.of(
+                "success", true,
+                "message", "Database restored successfully from " + file.getOriginalFilename()
+            ));
+        } catch (IOException e) {
+            return ResponseEntity.internalServerError()
+                .body(Map.of("success", false, "message", "Failed to read backup file: " + e.getMessage()));
+        } catch (Exception e) {
+            return ResponseEntity.internalServerError()
+                .body(Map.of("success", false, "message", "Failed to restore database: " + e.getMessage()));
+        }
+    }
+
+    @PostMapping("/clear")
+    public ResponseEntity<Map<String, Object>> clearDatabase() {
+        try {
+            int deletedRecords = databaseManagementService.clearAllData();
+
+            return ResponseEntity.ok(Map.of(
+                "success", true,
+                "message", "Database cleared successfully",
+                "deletedRecords", deletedRecords
+            ));
+        } catch (Exception e) {
+            return ResponseEntity.internalServerError()
+                .body(Map.of("success", false, "message", "Failed to clear database: " + e.getMessage()));
+        }
+    }
+
+    @PostMapping("/backup-complete")
+    public ResponseEntity<Resource> backupComplete() {
+        try {
+            Resource backup = databaseManagementService.createCompleteBackup();
+
+            String timestamp = LocalDateTime.now().format(DateTimeFormatter.ofPattern("yyyy-MM-dd_HH-mm-ss"));
+            String filename = "storycove_complete_backup_" + timestamp + ".zip";
+
+            return ResponseEntity.ok()
+                .header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"" + filename + "\"")
+                .header(HttpHeaders.CONTENT_TYPE, "application/zip")
+                .body(backup);
+        } catch (Exception e) {
+            throw new RuntimeException("Failed to create complete backup: " + e.getMessage(), e);
+        }
+    }
+
+    @PostMapping("/restore-complete")
+    public ResponseEntity<Map<String, Object>> restoreComplete(@RequestParam("file") MultipartFile file) {
+        System.err.println("Complete restore endpoint called with file: " + (file != null ? file.getOriginalFilename() : "null"));
+        try {
+            if (file.isEmpty()) {
+                System.err.println("File is empty - returning bad request");
+                return ResponseEntity.badRequest()
+                    .body(Map.of("success", false, "message", "No file uploaded"));
+            }
+
+            if (!file.getOriginalFilename().endsWith(".zip")) {
+                System.err.println("Invalid file type: " + file.getOriginalFilename());
+                return ResponseEntity.badRequest()
+                    .body(Map.of("success", false, "message", "Invalid file type. Please upload a .zip file"));
+            }
+
+            System.err.println("File validation passed, calling restore service...");
+            databaseManagementService.restoreFromCompleteBackup(file.getInputStream());
+            System.err.println("Restore service completed successfully");
+
+            return ResponseEntity.ok(Map.of(
+                "success", true,
+                "message", "Complete backup restored successfully from " + file.getOriginalFilename()
+            ));
+        } catch (IOException e) {
+            System.err.println("IOException during restore: " + e.getMessage());
+            e.printStackTrace();
+            return ResponseEntity.internalServerError()
+                .body(Map.of("success", false, "message", "Failed to read backup file: " + e.getMessage()));
+        } catch (Exception e) {
+            System.err.println("Exception during restore: " + e.getMessage());
+            e.printStackTrace();
+            return ResponseEntity.internalServerError()
+                .body(Map.of("success", false, "message", "Failed to restore complete backup: " + e.getMessage()));
+        }
+    }
+
+    @PostMapping("/clear-complete")
+    public ResponseEntity<Map<String, Object>> clearComplete() {
+        try {
+            int deletedRecords = databaseManagementService.clearAllDataAndFiles();
+
+            return ResponseEntity.ok(Map.of(
+                "success", true,
+                "message", "Database and files cleared successfully",
+                "deletedRecords", deletedRecords
+            ));
+        } catch (Exception e) {
+            return ResponseEntity.internalServerError()
+                .body(Map.of("success", false, "message", "Failed to clear database and files: " + e.getMessage()));
+        }
+    }
+}
@@ -1,6 +1,7 @@
package com.storycove.controller;

import com.storycove.service.ImageService;
+import com.storycove.service.LibraryService;
import org.springframework.core.io.FileSystemResource;
import org.springframework.core.io.Resource;
import org.springframework.http.HttpHeaders;
@@ -10,6 +11,7 @@ import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import org.springframework.web.multipart.MultipartFile;

+import jakarta.servlet.http.HttpServletRequest;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
@@ -21,9 +23,17 @@ import java.util.Map;
public class FileController {

    private final ImageService imageService;
+    private final LibraryService libraryService;

-    public FileController(ImageService imageService) {
+    public FileController(ImageService imageService, LibraryService libraryService) {
        this.imageService = imageService;
+        this.libraryService = libraryService;
+    }
+
+    private String getCurrentLibraryId() {
+        String libraryId = libraryService.getCurrentLibraryId();
+        System.out.println("FileController - Current Library ID: " + libraryId);
+        return libraryId != null ? libraryId : "default";
    }

    @PostMapping("/upload/cover")
@@ -34,7 +44,11 @@ public class FileController {
            Map<String, String> response = new HashMap<>();
            response.put("message", "Cover uploaded successfully");
            response.put("path", imagePath);
-            response.put("url", "/api/files/images/" + imagePath);
+            String currentLibraryId = getCurrentLibraryId();
+            String imageUrl = "/api/files/images/" + currentLibraryId + "/" + imagePath;
+            response.put("url", imageUrl);
+
+            System.out.println("Upload response - path: " + imagePath + ", url: " + imageUrl);

            return ResponseEntity.ok(response);
        } catch (IllegalArgumentException e) {
@@ -53,7 +67,8 @@ public class FileController {
            Map<String, String> response = new HashMap<>();
            response.put("message", "Avatar uploaded successfully");
            response.put("path", imagePath);
-            response.put("url", "/api/files/images/" + imagePath);
+            String currentLibraryId = getCurrentLibraryId();
+            response.put("url", "/api/files/images/" + currentLibraryId + "/" + imagePath);

            return ResponseEntity.ok(response);
        } catch (IllegalArgumentException e) {
@@ -64,17 +79,18 @@ public class FileController {
        }
    }

-    @GetMapping("/images/**")
-    public ResponseEntity<Resource> serveImage(@RequestParam String path) {
+    @GetMapping("/images/{libraryId}/**")
+    public ResponseEntity<Resource> serveImage(@PathVariable String libraryId, HttpServletRequest request) {
        try {
-            // Extract path from the URL
-            String imagePath = path.replace("/api/files/images/", "");
+            // Extract the full request path after /api/files/images/{libraryId}/
+            String requestURI = request.getRequestURI();
+            String imagePath = requestURI.replaceFirst(".*/api/files/images/" + libraryId + "/", "");

-            if (!imageService.imageExists(imagePath)) {
+            if (!imageService.imageExistsInLibrary(imagePath, libraryId)) {
                return ResponseEntity.notFound().build();
            }

-            Path fullPath = imageService.getImagePath(imagePath);
+            Path fullPath = imageService.getImagePathInLibrary(imagePath, libraryId);
            Resource resource = new FileSystemResource(fullPath);

            if (!resource.exists()) {
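
To make the new library-scoped image route concrete: the `replaceFirst` call above strips everything up to and including the library segment, leaving only the relative image path that the ImageService resolves. The values below are placeholders for illustration:

```java
public class ImagePathDemo {
    public static void main(String[] args) {
        String libraryId = "fiction";                                    // placeholder library id
        String requestURI = "/api/files/images/fiction/covers/1234.jpg"; // placeholder request URI

        // Same expression as in FileController.serveImage(...)
        String imagePath = requestURI.replaceFirst(".*/api/files/images/" + libraryId + "/", "");

        System.out.println(imagePath); // covers/1234.jpg
    }
}
```
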
@@ -1,31 +0,0 @@
-package com.storycove.controller;
-
-import com.storycove.dto.HtmlSanitizationConfigDto;
-import com.storycove.service.HtmlSanitizationService;
-import org.springframework.beans.factory.annotation.Autowired;
-import org.springframework.http.ResponseEntity;
-import org.springframework.web.bind.annotation.GetMapping;
-import org.springframework.web.bind.annotation.RequestMapping;
-import org.springframework.web.bind.annotation.RestController;
-
-@RestController
-@RequestMapping("/api/config")
-public class HtmlSanitizationController {
-
-    private final HtmlSanitizationService htmlSanitizationService;
-
-    @Autowired
-    public HtmlSanitizationController(HtmlSanitizationService htmlSanitizationService) {
-        this.htmlSanitizationService = htmlSanitizationService;
-    }
-
-    /**
-     * Get the HTML sanitization configuration for frontend use
-     * This allows the frontend to use the same sanitization rules as the backend
-     */
-    @GetMapping("/html-sanitization")
-    public ResponseEntity<HtmlSanitizationConfigDto> getHtmlSanitizationConfig() {
-        HtmlSanitizationConfigDto config = htmlSanitizationService.getConfiguration();
-        return ResponseEntity.ok(config);
-    }
-}
242 backend/src/main/java/com/storycove/controller/LibraryController.java Normal file
@@ -0,0 +1,242 @@
package com.storycove.controller;

import com.storycove.dto.LibraryDto;
import com.storycove.service.LibraryService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

import java.util.HashMap;
import java.util.List;
import java.util.Map;

@RestController
@RequestMapping("/api/libraries")
public class LibraryController {

    private static final Logger logger = LoggerFactory.getLogger(LibraryController.class);

    private final LibraryService libraryService;

    @Autowired
    public LibraryController(LibraryService libraryService) {
        this.libraryService = libraryService;
    }

    /**
     * Get all available libraries (for settings UI)
     */
    @GetMapping
    public ResponseEntity<List<LibraryDto>> getAllLibraries() {
        try {
            List<LibraryDto> libraries = libraryService.getAllLibraries();
            return ResponseEntity.ok(libraries);
        } catch (Exception e) {
            logger.error("Failed to get libraries", e);
            return ResponseEntity.internalServerError().build();
        }
    }

    /**
     * Get current active library info
     */
    @GetMapping("/current")
    public ResponseEntity<LibraryDto> getCurrentLibrary() {
        try {
            var library = libraryService.getCurrentLibrary();
            if (library == null) {
                return ResponseEntity.noContent().build();
            }

            LibraryDto dto = new LibraryDto(
                    library.getId(),
                    library.getName(),
                    library.getDescription(),
                    true, // always active since it's current
                    library.isInitialized()
            );

            return ResponseEntity.ok(dto);
        } catch (Exception e) {
            logger.error("Failed to get current library", e);
            return ResponseEntity.internalServerError().build();
        }
    }

    /**
     * Switch to a different library (requires re-authentication)
     * This endpoint returns a switching status that the frontend can poll
     */
    @PostMapping("/switch")
    public ResponseEntity<Map<String, Object>> initiateLibrarySwitch(@RequestBody Map<String, String> request) {
        try {
            String password = request.get("password");
            if (password == null || password.trim().isEmpty()) {
                return ResponseEntity.badRequest().body(Map.of("error", "Password required"));
            }

            String libraryId = libraryService.authenticateAndGetLibrary(password);
            if (libraryId == null) {
                return ResponseEntity.status(401).body(Map.of("error", "Invalid password"));
            }

            // Check if already on this library
            if (libraryId.equals(libraryService.getCurrentLibraryId())) {
                return ResponseEntity.ok(Map.of(
                        "status", "already_active",
                        "message", "Already using this library"
                ));
            }

            // Initiate switch in background thread
            new Thread(() -> {
                try {
                    libraryService.switchToLibrary(libraryId);
                    logger.info("Library switch completed: {}", libraryId);
                } catch (Exception e) {
                    logger.error("Library switch failed: {}", libraryId, e);
                }
            }).start();

            return ResponseEntity.ok(Map.of(
                    "status", "switching",
                    "targetLibrary", libraryId,
                    "message", "Switching to library, please wait..."
            ));

        } catch (Exception e) {
            logger.error("Failed to initiate library switch", e);
            return ResponseEntity.internalServerError().body(Map.of("error", "Server error"));
        }
    }

    /**
     * Check library switch status
     */
    @GetMapping("/switch/status")
    public ResponseEntity<Map<String, Object>> getLibrarySwitchStatus() {
        try {
            var currentLibrary = libraryService.getCurrentLibrary();
            boolean isReady = currentLibrary != null;

            Map<String, Object> response = new HashMap<>();
            response.put("ready", isReady);
            if (isReady) {
                response.put("currentLibrary", currentLibrary.getId());
                response.put("currentLibraryName", currentLibrary.getName());
            } else {
                response.put("currentLibrary", null);
                response.put("currentLibraryName", null);
            }

            return ResponseEntity.ok(response);
        } catch (Exception e) {
            logger.error("Failed to get switch status", e);
            return ResponseEntity.ok(Map.of("ready", false, "error", "Status check failed"));
        }
    }

    /**
     * Change password for current library
     */
    @PostMapping("/password")
    public ResponseEntity<Map<String, Object>> changePassword(@RequestBody Map<String, String> request) {
        try {
            String currentPassword = request.get("currentPassword");
            String newPassword = request.get("newPassword");

            if (currentPassword == null || newPassword == null) {
                return ResponseEntity.badRequest().body(Map.of("error", "Current and new passwords required"));
            }

            String currentLibraryId = libraryService.getCurrentLibraryId();
            if (currentLibraryId == null) {
                return ResponseEntity.badRequest().body(Map.of("error", "No active library"));
            }

            boolean success = libraryService.changeLibraryPassword(currentLibraryId, currentPassword, newPassword);
            if (success) {
                return ResponseEntity.ok(Map.of("success", true, "message", "Password changed successfully"));
            } else {
                return ResponseEntity.badRequest().body(Map.of("error", "Current password is incorrect"));
            }

        } catch (Exception e) {
            logger.error("Failed to change password", e);
            return ResponseEntity.internalServerError().body(Map.of("error", "Server error"));
        }
    }

    /**
     * Create a new library
     */
    @PostMapping("/create")
    public ResponseEntity<Map<String, Object>> createLibrary(@RequestBody Map<String, String> request) {
        try {
            String name = request.get("name");
            String description = request.get("description");
            String password = request.get("password");

            if (name == null || name.trim().isEmpty() || password == null || password.trim().isEmpty()) {
                return ResponseEntity.badRequest().body(Map.of("error", "Name and password are required"));
            }

            var newLibrary = libraryService.createNewLibrary(name.trim(), description, password);

            return ResponseEntity.ok(Map.of(
                    "success", true,
                    "library", Map.of(
                            "id", newLibrary.getId(),
                            "name", newLibrary.getName(),
                            "description", newLibrary.getDescription()
                    ),
                    "message", "Library created successfully. You can now log in with the new password to access it."
            ));

        } catch (Exception e) {
            logger.error("Failed to create library", e);
            return ResponseEntity.internalServerError().body(Map.of("error", "Server error"));
        }
    }

    /**
     * Update library metadata (name and description)
     */
    @PutMapping("/{libraryId}/metadata")
    public ResponseEntity<Map<String, Object>> updateLibraryMetadata(
            @PathVariable String libraryId,
            @RequestBody Map<String, String> updates) {

        try {
            String newName = updates.get("name");
            String newDescription = updates.get("description");

            if (newName == null || newName.trim().isEmpty()) {
                return ResponseEntity.badRequest().body(Map.of("error", "Library name is required"));
            }

            // Update the library
            libraryService.updateLibraryMetadata(libraryId, newName, newDescription);

            // Return updated library info
            LibraryDto updatedLibrary = libraryService.getLibraryById(libraryId);
            if (updatedLibrary != null) {
                Map<String, Object> response = new HashMap<>();
                response.put("success", true);
                response.put("message", "Library metadata updated successfully");
                response.put("library", updatedLibrary);
                return ResponseEntity.ok(response);
            } else {
                return ResponseEntity.notFound().build();
            }

        } catch (IllegalArgumentException e) {
            return ResponseEntity.badRequest().body(Map.of("error", e.getMessage()));
        } catch (Exception e) {
            logger.error("Failed to update library metadata for {}: {}", libraryId, e.getMessage(), e);
            return ResponseEntity.internalServerError().body(Map.of("error", "Failed to update library metadata"));
        }
    }
}
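Because `initiateLibrarySwitch` performs the actual switch on a background thread, a client is expected to call `/api/libraries/switch` once and then poll `/api/libraries/switch/status` until `ready` is true. A minimal client-side sketch of that flow, assuming the backend is reachable at `http://localhost:8080` and that authentication is handled elsewhere; the crude string check on the JSON body is only for illustration.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LibrarySwitchExample {
    private static final String BASE = "http://localhost:8080"; // assumed host and port

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // 1. Initiate the switch with the target library's password
        HttpRequest switchRequest = HttpRequest.newBuilder()
                .uri(URI.create(BASE + "/api/libraries/switch"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"password\":\"my-library-password\"}"))
                .build();
        HttpResponse<String> switchResponse = client.send(switchRequest, HttpResponse.BodyHandlers.ofString());
        System.out.println("Switch initiated: " + switchResponse.body());

        // 2. Poll the status endpoint until the backend reports ready
        HttpRequest statusRequest = HttpRequest.newBuilder()
                .uri(URI.create(BASE + "/api/libraries/switch/status"))
                .GET()
                .build();
        for (int attempt = 0; attempt < 30; attempt++) {
            HttpResponse<String> status = client.send(statusRequest, HttpResponse.BodyHandlers.ofString());
            if (status.body().contains("\"ready\":true")) { // naive check, fine for a sketch
                System.out.println("Switch complete: " + status.body());
                return;
            }
            Thread.sleep(1000);
        }
        System.out.println("Timed out waiting for the library switch to finish");
    }
}
```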
backend/src/main/java/com/storycove/controller/StoryController.java
@@ -14,6 +14,7 @@ import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.data.domain.Page;
+import org.springframework.data.domain.PageImpl;
 import org.springframework.data.domain.PageRequest;
 import org.springframework.data.domain.Pageable;
 import org.springframework.data.domain.Sort;
@@ -25,6 +26,7 @@ import org.springframework.web.multipart.MultipartFile;
 import java.util.ArrayList;
 import java.util.List;
 import java.util.Map;
+import java.util.Optional;
 import java.util.UUID;
 import java.util.stream.Collectors;
 
@@ -41,6 +43,9 @@ public class StoryController {
     private final ImageService imageService;
     private final TypesenseService typesenseService;
     private final CollectionService collectionService;
+    private final ReadingTimeService readingTimeService;
+    private final EPUBImportService epubImportService;
+    private final EPUBExportService epubExportService;
 
     public StoryController(StoryService storyService,
                            AuthorService authorService,
@@ -48,7 +53,10 @@ public class StoryController {
                            HtmlSanitizationService sanitizationService,
                            ImageService imageService,
                            CollectionService collectionService,
-                           @Autowired(required = false) TypesenseService typesenseService) {
+                           @Autowired(required = false) TypesenseService typesenseService,
+                           ReadingTimeService readingTimeService,
+                           EPUBImportService epubImportService,
+                           EPUBExportService epubExportService) {
         this.storyService = storyService;
         this.authorService = authorService;
         this.seriesService = seriesService;
@@ -56,6 +64,9 @@ public class StoryController {
         this.imageService = imageService;
         this.collectionService = collectionService;
         this.typesenseService = typesenseService;
+        this.readingTimeService = readingTimeService;
+        this.epubImportService = epubImportService;
+        this.epubExportService = epubExportService;
     }
 
     @GetMapping
@@ -75,31 +86,92 @@ public class StoryController {
         return ResponseEntity.ok(storyDtos);
     }
 
+    @GetMapping("/random")
+    public ResponseEntity<StorySummaryDto> getRandomStory(
+            @RequestParam(required = false) String searchQuery,
+            @RequestParam(required = false) List<String> tags,
+            @RequestParam(required = false) Long seed,
+            // Advanced filters
+            @RequestParam(required = false) Integer minWordCount,
+            @RequestParam(required = false) Integer maxWordCount,
+            @RequestParam(required = false) String createdAfter,
+            @RequestParam(required = false) String createdBefore,
+            @RequestParam(required = false) String lastReadAfter,
+            @RequestParam(required = false) String lastReadBefore,
+            @RequestParam(required = false) Integer minRating,
+            @RequestParam(required = false) Integer maxRating,
+            @RequestParam(required = false) Boolean unratedOnly,
+            @RequestParam(required = false) String readingStatus,
+            @RequestParam(required = false) Boolean hasReadingProgress,
+            @RequestParam(required = false) Boolean hasCoverImage,
+            @RequestParam(required = false) String sourceDomain,
+            @RequestParam(required = false) String seriesFilter,
+            @RequestParam(required = false) Integer minTagCount,
+            @RequestParam(required = false) Boolean popularOnly,
+            @RequestParam(required = false) Boolean hiddenGemsOnly) {
+
+        logger.info("Getting random story with filters - searchQuery: {}, tags: {}, seed: {}",
+                searchQuery, tags, seed);
+
+        Optional<Story> randomStory = storyService.findRandomStory(searchQuery, tags, seed,
+                minWordCount, maxWordCount, createdAfter, createdBefore, lastReadAfter, lastReadBefore,
+                minRating, maxRating, unratedOnly, readingStatus, hasReadingProgress, hasCoverImage,
+                sourceDomain, seriesFilter, minTagCount, popularOnly, hiddenGemsOnly);
+
+        if (randomStory.isPresent()) {
+            StorySummaryDto storyDto = convertToSummaryDto(randomStory.get());
+            return ResponseEntity.ok(storyDto);
+        } else {
+            return ResponseEntity.noContent().build(); // 204 No Content when no stories match filters
+        }
+    }
+
     @GetMapping("/{id}")
     public ResponseEntity<StoryDto> getStoryById(@PathVariable UUID id) {
         Story story = storyService.findById(id);
         return ResponseEntity.ok(convertToDto(story));
     }
 
+    @GetMapping("/{id}/read")
+    public ResponseEntity<StoryReadingDto> getStoryForReading(@PathVariable UUID id) {
+        logger.info("Getting story {} for reading", id);
+        Story story = storyService.findById(id);
+        return ResponseEntity.ok(convertToReadingDto(story));
+    }
+
     @PostMapping
     public ResponseEntity<StoryDto> createStory(@Valid @RequestBody CreateStoryRequest request) {
+        logger.info("Creating new story: {}", request.getTitle());
         Story story = new Story();
         updateStoryFromRequest(story, request);
 
         Story savedStory = storyService.createWithTagNames(story, request.getTagNames());
+        logger.info("Successfully created story: {} (ID: {})", savedStory.getTitle(), savedStory.getId());
         return ResponseEntity.status(HttpStatus.CREATED).body(convertToDto(savedStory));
     }
 
     @PutMapping("/{id}")
     public ResponseEntity<StoryDto> updateStory(@PathVariable UUID id,
                                                 @Valid @RequestBody UpdateStoryRequest request) {
+        logger.info("Updating story: {} (ID: {})", request.getTitle(), id);
+
+        // Handle author creation/lookup at controller level before calling service
+        if (request.getAuthorName() != null && !request.getAuthorName().trim().isEmpty() && request.getAuthorId() == null) {
+            Author author = findOrCreateAuthor(request.getAuthorName().trim());
+            request.setAuthorId(author.getId());
+            request.setAuthorName(null); // Clear author name since we now have the ID
+        }
+
         Story updatedStory = storyService.updateWithTagNames(id, request);
+        logger.info("Successfully updated story: {}", updatedStory.getTitle());
         return ResponseEntity.ok(convertToDto(updatedStory));
     }
 
     @DeleteMapping("/{id}")
     public ResponseEntity<?> deleteStory(@PathVariable UUID id) {
+        logger.info("Deleting story with ID: {}", id);
         storyService.delete(id);
+        logger.info("Successfully deleted story with ID: {}", id);
         return ResponseEntity.ok(Map.of("message", "Story deleted successfully"));
     }
 
@@ -143,6 +215,52 @@ public class StoryController {
         return ResponseEntity.ok(convertToDto(story));
     }
 
+    @PostMapping("/{id}/reading-progress")
+    public ResponseEntity<StoryDto> updateReadingProgress(@PathVariable UUID id, @RequestBody ReadingProgressRequest request) {
+        logger.info("Updating reading progress for story {} to position {}", id, request.getPosition());
+        Story story = storyService.updateReadingProgress(id, request.getPosition());
+        return ResponseEntity.ok(convertToDto(story));
+    }
+
+    @PostMapping("/{id}/reading-status")
+    public ResponseEntity<StoryDto> updateReadingStatus(@PathVariable UUID id, @RequestBody ReadingStatusRequest request) {
+        logger.info("Updating reading status for story {} to {}", id, request.getIsRead() ? "read" : "unread");
+        Story story = storyService.updateReadingStatus(id, request.getIsRead());
+        return ResponseEntity.ok(convertToDto(story));
+    }
+
+    @PostMapping("/{id}/process-content-images")
+    public ResponseEntity<Map<String, Object>> processContentImages(@PathVariable UUID id, @RequestBody ProcessContentImagesRequest request) {
+        logger.info("Processing content images for story {}", id);
+
+        try {
+            // Process the HTML content to download and replace image URLs
+            ImageService.ContentImageProcessingResult result = imageService.processContentImages(request.getHtmlContent(), id);
+
+            // If there are warnings, let the client decide whether to proceed
+            if (result.hasWarnings()) {
+                return ResponseEntity.ok(Map.of(
+                        "processedContent", result.getProcessedContent(),
+                        "warnings", result.getWarnings(),
+                        "downloadedImages", result.getDownloadedImages(),
+                        "hasWarnings", true
+                ));
+            }
+
+            // Success - no warnings
+            return ResponseEntity.ok(Map.of(
+                    "processedContent", result.getProcessedContent(),
+                    "downloadedImages", result.getDownloadedImages(),
+                    "hasWarnings", false
+            ));
+
+        } catch (Exception e) {
+            logger.error("Failed to process content images for story {}", id, e);
+            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
+                    .body(Map.of("error", "Failed to process content images: " + e.getMessage()));
+        }
+    }
+
     @PostMapping("/reindex")
     public ResponseEntity<String> manualReindex() {
         if (typesenseService == null) {
@@ -207,13 +325,32 @@ public class StoryController {
             @RequestParam(required = false) Integer minRating,
             @RequestParam(required = false) Integer maxRating,
             @RequestParam(required = false) String sortBy,
-            @RequestParam(required = false) String sortDir) {
+            @RequestParam(required = false) String sortDir,
+            @RequestParam(required = false) String facetBy,
+            // Advanced filters
+            @RequestParam(required = false) Integer minWordCount,
+            @RequestParam(required = false) Integer maxWordCount,
+            @RequestParam(required = false) String createdAfter,
+            @RequestParam(required = false) String createdBefore,
+            @RequestParam(required = false) String lastReadAfter,
+            @RequestParam(required = false) String lastReadBefore,
+            @RequestParam(required = false) Boolean unratedOnly,
+            @RequestParam(required = false) String readingStatus,
+            @RequestParam(required = false) Boolean hasReadingProgress,
+            @RequestParam(required = false) Boolean hasCoverImage,
+            @RequestParam(required = false) String sourceDomain,
+            @RequestParam(required = false) String seriesFilter,
+            @RequestParam(required = false) Integer minTagCount,
+            @RequestParam(required = false) Boolean popularOnly,
+            @RequestParam(required = false) Boolean hiddenGemsOnly) {
 
-        logger.info("CONTROLLER DEBUG: Search request - query='{}', tags={}, authors={}", query, tags, authors);
 
         if (typesenseService != null) {
             SearchResultDto<StorySearchDto> results = typesenseService.searchStories(
-                    query, page, size, authors, tags, minRating, maxRating, sortBy, sortDir);
+                    query, page, size, authors, tags, minRating, maxRating, sortBy, sortDir, facetBy,
+                    minWordCount, maxWordCount, createdAfter, createdBefore, lastReadAfter, lastReadBefore,
+                    unratedOnly, readingStatus, hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter,
+                    minTagCount, popularOnly, hiddenGemsOnly);
             return ResponseEntity.ok(results);
         } else {
             // Fallback to basic search if Typesense is not available
@@ -353,25 +490,55 @@ public class StoryController {
             story.setDescription(updateReq.getDescription());
         }
         if (updateReq.getContentHtml() != null) {
-            story.setContentHtml(sanitizationService.sanitize(updateReq.getContentHtml()));
+            logger.info("Content before sanitization (length: {}): {}",
+                    updateReq.getContentHtml().length(),
+                    updateReq.getContentHtml().substring(0, Math.min(500, updateReq.getContentHtml().length())));
+            String sanitizedContent = sanitizationService.sanitize(updateReq.getContentHtml());
+            logger.info("Content after sanitization (length: {}): {}",
+                    sanitizedContent.length(),
+                    sanitizedContent.substring(0, Math.min(500, sanitizedContent.length())));
+            story.setContentHtml(sanitizedContent);
         }
         if (updateReq.getSourceUrl() != null) {
             story.setSourceUrl(updateReq.getSourceUrl());
         }
-        if (updateReq.getVolume() != null) {
-            story.setVolume(updateReq.getVolume());
-        }
+        // Volume will be handled in series logic below
+        // Handle author - either by ID or by name
         if (updateReq.getAuthorId() != null) {
             Author author = authorService.findById(updateReq.getAuthorId());
             story.setAuthor(author);
+        } else if (updateReq.getAuthorName() != null && !updateReq.getAuthorName().trim().isEmpty()) {
+            Author author = findOrCreateAuthor(updateReq.getAuthorName().trim());
+            story.setAuthor(author);
         }
-        // Handle series - either by ID or by name
+        // Handle series - either by ID, by name, or remove from series
         if (updateReq.getSeriesId() != null) {
             Series series = seriesService.findById(updateReq.getSeriesId());
             story.setSeries(series);
-        } else if (updateReq.getSeriesName() != null && !updateReq.getSeriesName().trim().isEmpty()) {
+        } else if (updateReq.getSeriesName() != null) {
+            logger.info("Processing series update: seriesName='{}', isEmpty={}", updateReq.getSeriesName(), updateReq.getSeriesName().trim().isEmpty());
+            if (updateReq.getSeriesName().trim().isEmpty()) {
+                // Empty series name means remove from series
+                logger.info("Removing story from series");
+                if (story.getSeries() != null) {
+                    story.getSeries().removeStory(story);
+                    story.setSeries(null);
+                    story.setVolume(null);
+                    logger.info("Story removed from series");
+                }
+            } else {
+                // Non-empty series name means add to series
+                logger.info("Adding story to series: '{}', volume: {}", updateReq.getSeriesName().trim(), updateReq.getVolume());
                 Series series = seriesService.findOrCreate(updateReq.getSeriesName().trim());
                 story.setSeries(series);
+                // Set volume only if series is being set
+                if (updateReq.getVolume() != null) {
+                    story.setVolume(updateReq.getVolume());
+                    logger.info("Story added to series: {} with volume: {}", series.getName(), updateReq.getVolume());
+                } else {
+                    logger.info("Story added to series: {} with no volume", series.getName());
+                }
+            }
         }
 
         // Note: Tags are now handled in StoryService.updateWithTagNames()
@@ -385,7 +552,6 @@ public class StoryController {
         dto.setSummary(story.getSummary());
         dto.setDescription(story.getDescription());
         dto.setContentHtml(story.getContentHtml());
-        dto.setContentPlain(story.getContentPlain());
         dto.setSourceUrl(story.getSourceUrl());
         dto.setCoverPath(story.getCoverPath());
         dto.setWordCount(story.getWordCount());
@@ -394,6 +560,48 @@ public class StoryController {
         dto.setCreatedAt(story.getCreatedAt());
         dto.setUpdatedAt(story.getUpdatedAt());
 
+        // Reading progress fields
+        dto.setIsRead(story.getIsRead());
+        dto.setReadingPosition(story.getReadingPosition());
+        dto.setLastReadAt(story.getLastReadAt());
+
+        if (story.getAuthor() != null) {
+            dto.setAuthorId(story.getAuthor().getId());
+            dto.setAuthorName(story.getAuthor().getName());
+        }
+
+        if (story.getSeries() != null) {
+            dto.setSeriesId(story.getSeries().getId());
+            dto.setSeriesName(story.getSeries().getName());
+        }
+
+        dto.setTags(story.getTags().stream()
+                .map(this::convertTagToDto)
+                .collect(Collectors.toList()));
+
+        return dto;
+    }
+
+    private StoryReadingDto convertToReadingDto(Story story) {
+        StoryReadingDto dto = new StoryReadingDto();
+        dto.setId(story.getId());
+        dto.setTitle(story.getTitle());
+        dto.setSummary(story.getSummary());
+        dto.setDescription(story.getDescription());
+        dto.setContentHtml(story.getContentHtml());
+        dto.setSourceUrl(story.getSourceUrl());
+        dto.setCoverPath(story.getCoverPath());
+        dto.setWordCount(story.getWordCount());
+        dto.setRating(story.getRating());
+        dto.setVolume(story.getVolume());
+        dto.setCreatedAt(story.getCreatedAt());
+        dto.setUpdatedAt(story.getUpdatedAt());
+
+        // Reading progress fields
+        dto.setIsRead(story.getIsRead());
+        dto.setReadingPosition(story.getReadingPosition());
+        dto.setLastReadAt(story.getLastReadAt());
+
         if (story.getAuthor() != null) {
             dto.setAuthorId(story.getAuthor().getId());
             dto.setAuthorName(story.getAuthor().getName());
@@ -426,6 +634,11 @@ public class StoryController {
         dto.setUpdatedAt(story.getUpdatedAt());
         dto.setPartOfSeries(story.isPartOfSeries());
 
+        // Reading progress fields
+        dto.setIsRead(story.getIsRead());
+        dto.setReadingPosition(story.getReadingPosition());
+        dto.setLastReadAt(story.getLastReadAt());
+
         if (story.getAuthor() != null) {
             dto.setAuthorId(story.getAuthor().getId());
             dto.setAuthorName(story.getAuthor().getName());
@@ -447,8 +660,11 @@ public class StoryController {
         TagDto tagDto = new TagDto();
         tagDto.setId(tag.getId());
         tagDto.setName(tag.getName());
+        tagDto.setColor(tag.getColor());
+        tagDto.setDescription(tag.getDescription());
         tagDto.setCreatedAt(tag.getCreatedAt());
-        // storyCount can be set if needed, but it might be expensive to calculate for each tag
+        tagDto.setStoryCount(tag.getStories() != null ? tag.getStories().size() : 0);
+        tagDto.setAliasCount(tag.getAliases() != null ? tag.getAliases().size() : 0);
         return tagDto;
     }
 
@@ -467,12 +683,151 @@ public class StoryController {
         // to avoid circular references and keep it lightweight
         dto.setStoryCount(collection.getStoryCount());
         dto.setTotalWordCount(collection.getTotalWordCount());
-        dto.setEstimatedReadingTime(collection.getEstimatedReadingTime());
+        dto.setEstimatedReadingTime(readingTimeService.calculateReadingTime(collection.getTotalWordCount()));
         dto.setAverageStoryRating(collection.getAverageStoryRating());
 
         return dto;
     }
 
+    @GetMapping("/check-duplicate")
+    public ResponseEntity<Map<String, Object>> checkDuplicate(
+            @RequestParam String title,
+            @RequestParam String authorName) {
+        try {
+            List<Story> duplicates = storyService.findPotentialDuplicates(title, authorName);
+
+            Map<String, Object> response = Map.of(
+                    "hasDuplicates", !duplicates.isEmpty(),
+                    "count", duplicates.size(),
+                    "duplicates", duplicates.stream()
+                            .map(story -> Map.of(
+                                    "id", story.getId(),
+                                    "title", story.getTitle(),
+                                    "authorName", story.getAuthor() != null ? story.getAuthor().getName() : "",
+                                    "createdAt", story.getCreatedAt()
+                            ))
+                            .collect(Collectors.toList())
+            );
+
+            return ResponseEntity.ok(response);
+        } catch (Exception e) {
+            logger.error("Error checking for duplicates", e);
+            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
+                    .body(Map.of("error", "Failed to check for duplicates"));
+        }
+    }
+
+    // EPUB Import endpoint
+    @PostMapping("/epub/import")
+    public ResponseEntity<EPUBImportResponse> importEPUB(
+            @RequestParam("file") MultipartFile file,
+            @RequestParam(required = false) UUID authorId,
+            @RequestParam(required = false) String authorName,
+            @RequestParam(required = false) UUID seriesId,
+            @RequestParam(required = false) String seriesName,
+            @RequestParam(required = false) Integer seriesVolume,
+            @RequestParam(required = false) List<String> tags,
+            @RequestParam(defaultValue = "true") Boolean preserveReadingPosition,
+            @RequestParam(defaultValue = "false") Boolean overwriteExisting,
+            @RequestParam(defaultValue = "true") Boolean createMissingAuthor,
+            @RequestParam(defaultValue = "true") Boolean createMissingSeries) {
+
+        logger.info("Importing EPUB file: {}", file.getOriginalFilename());
+
+        EPUBImportRequest request = new EPUBImportRequest();
+        request.setEpubFile(file);
+        request.setAuthorId(authorId);
+        request.setAuthorName(authorName);
+        request.setSeriesId(seriesId);
+        request.setSeriesName(seriesName);
+        request.setSeriesVolume(seriesVolume);
+        request.setTags(tags);
+        request.setPreserveReadingPosition(preserveReadingPosition);
+        request.setOverwriteExisting(overwriteExisting);
+        request.setCreateMissingAuthor(createMissingAuthor);
+        request.setCreateMissingSeries(createMissingSeries);
+
+        try {
+            EPUBImportResponse response = epubImportService.importEPUB(request);
+
+            if (response.isSuccess()) {
+                logger.info("Successfully imported EPUB: {} (Story ID: {})",
+                        response.getStoryTitle(), response.getStoryId());
+                return ResponseEntity.ok(response);
+            } else {
+                logger.warn("EPUB import failed: {}", response.getMessage());
+                return ResponseEntity.badRequest().body(response);
+            }
+
+        } catch (Exception e) {
+            logger.error("Error importing EPUB: {}", e.getMessage(), e);
+            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
+                    .body(EPUBImportResponse.error("Internal server error: " + e.getMessage()));
+        }
+    }
+
+    // EPUB Export endpoint
+    @PostMapping("/epub/export")
+    public ResponseEntity<org.springframework.core.io.Resource> exportEPUB(
+            @Valid @RequestBody EPUBExportRequest request) {
+
+        logger.info("Exporting story {} to EPUB", request.getStoryId());
+
+        try {
+            if (!epubExportService.canExportStory(request.getStoryId())) {
+                return ResponseEntity.badRequest().build();
+            }
+
+            org.springframework.core.io.Resource resource = epubExportService.exportStoryAsEPUB(request);
+            Story story = storyService.findById(request.getStoryId());
+            String filename = epubExportService.getEPUBFilename(story);
+
+            logger.info("Successfully exported EPUB: {}", filename);
+
+            return ResponseEntity.ok()
+                    .header("Content-Disposition", "attachment; filename=\"" + filename + "\"")
+                    .header("Content-Type", "application/epub+zip")
+                    .body(resource);
+
+        } catch (Exception e) {
+            logger.error("Error exporting EPUB: {}", e.getMessage(), e);
+            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).build();
+        }
+    }
+
+    // EPUB Export by story ID (GET endpoint)
+    @GetMapping("/{id}/epub")
+    public ResponseEntity<org.springframework.core.io.Resource> exportStoryAsEPUB(@PathVariable UUID id) {
+        logger.info("Exporting story {} to EPUB via GET", id);
+
+        EPUBExportRequest request = new EPUBExportRequest(id);
+        return exportEPUB(request);
+    }
+
+    // Validate EPUB file
+    @PostMapping("/epub/validate")
+    public ResponseEntity<Map<String, Object>> validateEPUBFile(@RequestParam("file") MultipartFile file) {
+        logger.info("Validating EPUB file: {}", file.getOriginalFilename());
+
+        try {
+            List<String> errors = epubImportService.validateEPUBFile(file);
+
+            Map<String, Object> response = Map.of(
+                    "valid", errors.isEmpty(),
+                    "errors", errors,
+                    "filename", file.getOriginalFilename(),
+                    "size", file.getSize()
+            );
+
+            return ResponseEntity.ok(response);
+
+        } catch (Exception e) {
+            logger.error("Error validating EPUB file: {}", e.getMessage(), e);
+            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
+                    .body(Map.of("error", "Failed to validate EPUB file"));
+        }
+    }
+
     // Request DTOs
     public static class CreateStoryRequest {
         private String title;
@@ -520,6 +875,7 @@ public class StoryController {
         private String sourceUrl;
         private Integer volume;
         private UUID authorId;
+        private String authorName;
         private UUID seriesId;
         private String seriesName;
         private List<String> tagNames;
@@ -539,6 +895,8 @@ public class StoryController {
         public void setVolume(Integer volume) { this.volume = volume; }
         public UUID getAuthorId() { return authorId; }
         public void setAuthorId(UUID authorId) { this.authorId = authorId; }
+        public String getAuthorName() { return authorName; }
+        public void setAuthorName(String authorName) { this.authorName = authorName; }
         public UUID getSeriesId() { return seriesId; }
         public void setSeriesId(UUID seriesId) { this.seriesId = seriesId; }
         public String getSeriesName() { return seriesName; }
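For the EPUB endpoints added above, the simplest client call is the GET export, which streams `application/epub+zip` for a single story. A rough sketch follows; it assumes `StoryController` is mapped under `/api/stories` (the class-level mapping is not shown in this diff) and uses a placeholder story ID and base URL.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Path;

public class EpubExportExample {
    public static void main(String[] args) throws Exception {
        // Assumed base path and placeholder UUID; substitute a real story ID.
        String storyId = "00000000-0000-0000-0000-000000000000";
        URI uri = URI.create("http://localhost:8080/api/stories/" + storyId + "/epub");

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(uri).GET().build();

        // The endpoint sets a Content-Disposition filename; here the body is simply
        // written to a fixed local file.
        HttpResponse<Path> response = client.send(request,
                HttpResponse.BodyHandlers.ofFile(Path.of("story.epub")));
        System.out.println("HTTP " + response.statusCode() + " -> " + response.body());
    }
}
```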
backend/src/main/java/com/storycove/controller/TagController.java
@@ -1,9 +1,13 @@
 package com.storycove.controller;
 
 import com.storycove.dto.TagDto;
+import com.storycove.dto.TagAliasDto;
 import com.storycove.entity.Tag;
+import com.storycove.entity.TagAlias;
 import com.storycove.service.TagService;
 import jakarta.validation.Valid;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 import org.springframework.data.domain.Page;
 import org.springframework.data.domain.PageRequest;
 import org.springframework.data.domain.Pageable;
@@ -21,6 +25,7 @@ import java.util.stream.Collectors;
 @RequestMapping("/api/tags")
 public class TagController {
 
+    private static final Logger logger = LoggerFactory.getLogger(TagController.class);
     private final TagService tagService;
 
     public TagController(TagService tagService) {
@@ -54,6 +59,8 @@ public class TagController {
     public ResponseEntity<TagDto> createTag(@Valid @RequestBody CreateTagRequest request) {
         Tag tag = new Tag();
         tag.setName(request.getName());
+        tag.setColor(request.getColor());
+        tag.setDescription(request.getDescription());
 
         Tag savedTag = tagService.create(tag);
         return ResponseEntity.status(HttpStatus.CREATED).body(convertToDto(savedTag));
@@ -66,6 +73,12 @@ public class TagController {
         if (request.getName() != null) {
             existingTag.setName(request.getName());
         }
+        if (request.getColor() != null) {
+            existingTag.setColor(request.getColor());
+        }
+        if (request.getDescription() != null) {
+            existingTag.setDescription(request.getDescription());
+        }
 
         Tag updatedTag = tagService.update(id, existingTag);
         return ResponseEntity.ok(convertToDto(updatedTag));
@@ -95,7 +108,7 @@ public class TagController {
             @RequestParam String query,
             @RequestParam(defaultValue = "10") int limit) {
 
-        List<Tag> tags = tagService.findByNameStartingWith(query, limit);
+        List<Tag> tags = tagService.findByNameOrAliasStartingWith(query, limit);
         List<TagDto> tagDtos = tags.stream().map(this::convertToDto).collect(Collectors.toList());
 
         return ResponseEntity.ok(tagDtos);
@@ -132,29 +145,257 @@ public class TagController {
         return ResponseEntity.ok(stats);
     }
 
+    @GetMapping("/collections")
+    public ResponseEntity<List<TagDto>> getTagsUsedByCollections() {
+        List<Tag> tags = tagService.findTagsUsedByCollections();
+        List<TagDto> tagDtos = tags.stream()
+                .map(this::convertToDtoWithCollectionCount)
+                .collect(Collectors.toList());
+
+        return ResponseEntity.ok(tagDtos);
+    }
+
+    // Tag alias endpoints
+    @PostMapping("/{tagId}/aliases")
+    public ResponseEntity<TagAliasDto> addAlias(@PathVariable UUID tagId,
+                                                @RequestBody Map<String, String> request) {
+        String aliasName = request.get("aliasName");
+        if (aliasName == null || aliasName.trim().isEmpty()) {
+            return ResponseEntity.badRequest().build();
+        }
+
+        try {
+            TagAlias alias = tagService.addAlias(tagId, aliasName.trim());
+            TagAliasDto dto = new TagAliasDto();
+            dto.setId(alias.getId());
+            dto.setAliasName(alias.getAliasName());
+            dto.setCanonicalTagId(alias.getCanonicalTag().getId());
+            dto.setCanonicalTagName(alias.getCanonicalTag().getName());
+            dto.setCreatedFromMerge(alias.getCreatedFromMerge());
+            dto.setCreatedAt(alias.getCreatedAt());
+
+            return ResponseEntity.status(HttpStatus.CREATED).body(dto);
+        } catch (Exception e) {
+            return ResponseEntity.badRequest().build();
+        }
+    }
+
+    @DeleteMapping("/{tagId}/aliases/{aliasId}")
+    public ResponseEntity<?> removeAlias(@PathVariable UUID tagId, @PathVariable UUID aliasId) {
+        try {
+            tagService.removeAlias(tagId, aliasId);
+            return ResponseEntity.ok(Map.of("message", "Alias removed successfully"));
+        } catch (Exception e) {
+            return ResponseEntity.badRequest().body(Map.of("error", e.getMessage()));
+        }
+    }
+
+    @GetMapping("/resolve/{name}")
+    public ResponseEntity<TagDto> resolveTag(@PathVariable String name) {
+        try {
+            Tag resolvedTag = tagService.resolveTagByName(name);
+            if (resolvedTag != null) {
+                return ResponseEntity.ok(convertToDto(resolvedTag));
+            } else {
+                return ResponseEntity.notFound().build();
+            }
+        } catch (Exception e) {
+            return ResponseEntity.notFound().build();
+        }
+    }
+
+    @PostMapping("/merge")
+    public ResponseEntity<?> mergeTags(@Valid @RequestBody MergeTagsRequest request) {
+        try {
+            Tag resultTag = tagService.mergeTags(request.getSourceTagUUIDs(), request.getTargetTagUUID());
+            return ResponseEntity.ok(convertToDto(resultTag));
+        } catch (Exception e) {
+            logger.error("Failed to merge tags", e);
+            String errorMessage = e.getMessage() != null ? e.getMessage() : "Unknown error occurred";
+            return ResponseEntity.badRequest().body(Map.of("error", errorMessage));
+        }
+    }
+
+    @PostMapping("/merge/preview")
+    public ResponseEntity<?> previewMerge(@Valid @RequestBody MergeTagsRequest request) {
+        try {
+            MergePreviewResponse preview = tagService.previewMerge(request.getSourceTagUUIDs(), request.getTargetTagUUID());
+            return ResponseEntity.ok(preview);
+        } catch (Exception e) {
+            logger.error("Failed to preview merge", e);
+            String errorMessage = e.getMessage() != null ? e.getMessage() : "Unknown error occurred";
+            return ResponseEntity.badRequest().body(Map.of("error", errorMessage));
+        }
+    }
+
+    @PostMapping("/suggest")
+    public ResponseEntity<List<TagSuggestion>> suggestTags(@RequestBody TagSuggestionRequest request) {
+        try {
+            List<TagSuggestion> suggestions = tagService.suggestTags(
+                    request.getTitle(),
+                    request.getContent(),
+                    request.getSummary(),
+                    request.getLimit() != null ? request.getLimit() : 10
+            );
+            return ResponseEntity.ok(suggestions);
+        } catch (Exception e) {
+            logger.error("Failed to suggest tags", e);
+            return ResponseEntity.ok(List.of()); // Return empty list on error
+        }
+    }
+
     private TagDto convertToDto(Tag tag) {
         TagDto dto = new TagDto();
         dto.setId(tag.getId());
         dto.setName(tag.getName());
+        dto.setColor(tag.getColor());
+        dto.setDescription(tag.getDescription());
         dto.setStoryCount(tag.getStories() != null ? tag.getStories().size() : 0);
+        dto.setCollectionCount(tag.getCollections() != null ? tag.getCollections().size() : 0);
+        dto.setAliasCount(tag.getAliases() != null ? tag.getAliases().size() : 0);
         dto.setCreatedAt(tag.getCreatedAt());
         // updatedAt field not present in Tag entity per spec
 
+        // Convert aliases to DTOs for full context
+        if (tag.getAliases() != null && !tag.getAliases().isEmpty()) {
+            List<TagAliasDto> aliaseDtos = tag.getAliases().stream()
+                    .map(alias -> {
+                        TagAliasDto aliasDto = new TagAliasDto();
+                        aliasDto.setId(alias.getId());
+                        aliasDto.setAliasName(alias.getAliasName());
+                        aliasDto.setCanonicalTagId(alias.getCanonicalTag().getId());
+                        aliasDto.setCanonicalTagName(alias.getCanonicalTag().getName());
+                        aliasDto.setCreatedFromMerge(alias.getCreatedFromMerge());
+                        aliasDto.setCreatedAt(alias.getCreatedAt());
+                        return aliasDto;
+                    })
+                    .collect(Collectors.toList());
+            dto.setAliases(aliaseDtos);
+        }
+
+        return dto;
+    }
+
+    private TagDto convertToDtoWithCollectionCount(Tag tag) {
+        TagDto dto = new TagDto();
+        dto.setId(tag.getId());
+        dto.setName(tag.getName());
+        dto.setCollectionCount(tag.getCollections() != null ? tag.getCollections().size() : 0);
+        dto.setCreatedAt(tag.getCreatedAt());
+        // Don't set storyCount for collection-focused endpoint
+
         return dto;
     }
 
     // Request DTOs
     public static class CreateTagRequest {
         private String name;
+        private String color;
+        private String description;
 
         public String getName() { return name; }
         public void setName(String name) { this.name = name; }
+
+        public String getColor() { return color; }
+        public void setColor(String color) { this.color = color; }
+
+        public String getDescription() { return description; }
+        public void setDescription(String description) { this.description = description; }
     }
 
     public static class UpdateTagRequest {
         private String name;
+        private String color;
+        private String description;
 
         public String getName() { return name; }
         public void setName(String name) { this.name = name; }
+
+        public String getColor() { return color; }
+        public void setColor(String color) { this.color = color; }
+
+        public String getDescription() { return description; }
+        public void setDescription(String description) { this.description = description; }
+    }
+
+    public static class MergeTagsRequest {
+        private List<String> sourceTagIds;
+        private String targetTagId;
+
+        public List<String> getSourceTagIds() { return sourceTagIds; }
+        public void setSourceTagIds(List<String> sourceTagIds) { this.sourceTagIds = sourceTagIds; }
+
+        public String getTargetTagId() { return targetTagId; }
+        public void setTargetTagId(String targetTagId) { this.targetTagId = targetTagId; }
+
+        // Helper methods to convert to UUID
+        public List<UUID> getSourceTagUUIDs() {
+            return sourceTagIds != null ? sourceTagIds.stream().map(UUID::fromString).toList() : null;
+        }
+
+        public UUID getTargetTagUUID() {
+            return targetTagId != null ? UUID.fromString(targetTagId) : null;
+        }
+    }
+
+    public static class MergePreviewResponse {
+        private String targetTagName;
+        private int targetStoryCount;
+        private int totalResultStoryCount;
+        private List<String> aliasesToCreate;
+
+        public String getTargetTagName() { return targetTagName; }
+        public void setTargetTagName(String targetTagName) { this.targetTagName = targetTagName; }
+
+        public int getTargetStoryCount() { return targetStoryCount; }
+        public void setTargetStoryCount(int targetStoryCount) { this.targetStoryCount = targetStoryCount; }
+
+        public int getTotalResultStoryCount() { return totalResultStoryCount; }
+        public void setTotalResultStoryCount(int totalResultStoryCount) { this.totalResultStoryCount = totalResultStoryCount; }
+
+        public List<String> getAliasesToCreate() { return aliasesToCreate; }
+        public void setAliasesToCreate(List<String> aliasesToCreate) { this.aliasesToCreate = aliasesToCreate; }
+    }
+
+    public static class TagSuggestionRequest {
+        private String title;
+        private String content;
+        private String summary;
+        private Integer limit;
+
+        public String getTitle() { return title; }
+        public void setTitle(String title) { this.title = title; }
+
+        public String getContent() { return content; }
+        public void setContent(String content) { this.content = content; }
+
+        public String getSummary() { return summary; }
+        public void setSummary(String summary) { this.summary = summary; }
+
+        public Integer getLimit() { return limit; }
+        public void setLimit(Integer limit) { this.limit = limit; }
+    }
+
+    public static class TagSuggestion {
+        private String tagName;
+        private double confidence;
+        private String reason;
+
+        public TagSuggestion() {}
+
+        public TagSuggestion(String tagName, double confidence, String reason) {
+            this.tagName = tagName;
+            this.confidence = confidence;
+            this.reason = reason;
+        }
+
+        public String getTagName() { return tagName; }
+        public void setTagName(String tagName) { this.tagName = tagName; }
+
+        public double getConfidence() { return confidence; }
+        public void setConfidence(double confidence) { this.confidence = confidence; }
+
+        public String getReason() { return reason; }
+        public void setReason(String reason) { this.reason = reason; }
     }
 }
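The merge endpoints above take a `MergeTagsRequest` whose IDs travel as plain strings and are converted to UUIDs server-side. A hedged sketch of calling the preview endpoint first and then the merge itself, with placeholder IDs and an assumed base URL:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class TagMergeExample {
    public static void main(String[] args) throws Exception {
        // Placeholder UUIDs; MergeTagsRequest expects string IDs.
        String body = """
                {
                  "sourceTagIds": ["11111111-1111-1111-1111-111111111111"],
                  "targetTagId": "22222222-2222-2222-2222-222222222222"
                }""";

        HttpClient client = HttpClient.newHttpClient();

        // Dry run: /api/tags/merge/preview returns the MergePreviewResponse counts.
        HttpRequest preview = HttpRequest.newBuilder(URI.create("http://localhost:8080/api/tags/merge/preview"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        System.out.println(client.send(preview, HttpResponse.BodyHandlers.ofString()).body());

        // Then perform the actual merge with the same payload.
        HttpRequest merge = HttpRequest.newBuilder(URI.create("http://localhost:8080/api/tags/merge"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        System.out.println(client.send(merge, HttpResponse.BodyHandlers.ofString()).body());
    }
}
```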
backend/src/main/java/com/storycove/dto/CollectionDto.java
@@ -16,6 +16,7 @@ public class CollectionDto {
     private String coverImagePath;
     private Boolean isArchived;
     private List<TagDto> tags;
+    private List<String> tagNames; // For search results
     private List<CollectionStoryDto> collectionStories;
     private Integer storyCount;
     private Integer totalWordCount;
@@ -83,6 +84,14 @@ public class CollectionDto {
         this.tags = tags;
     }
 
+    public List<String> getTagNames() {
+        return tagNames;
+    }
+
+    public void setTagNames(List<String> tagNames) {
+        this.tagNames = tagNames;
+    }
+
     public List<CollectionStoryDto> getCollectionStories() {
         return collectionStories;
     }
backend/src/main/java/com/storycove/dto/EPUBExportRequest.java (new file, 115 lines)

package com.storycove.dto;

import jakarta.validation.constraints.NotNull;
import java.util.List;
import java.util.UUID;

public class EPUBExportRequest {

    @NotNull(message = "Story ID is required")
    private UUID storyId;

    private String customTitle;
    private String customAuthor;
    private Boolean includeReadingPosition = true;
    private Boolean includeCoverImage = true;
    private Boolean includeMetadata = true;
    private List<String> customMetadata;
    private String language = "en";
    private Boolean splitByChapters = false;
    private Integer maxWordsPerChapter;

    public EPUBExportRequest() {}

    public EPUBExportRequest(UUID storyId) {
        this.storyId = storyId;
    }

    public UUID getStoryId() { return storyId; }
    public void setStoryId(UUID storyId) { this.storyId = storyId; }

    public String getCustomTitle() { return customTitle; }
    public void setCustomTitle(String customTitle) { this.customTitle = customTitle; }

    public String getCustomAuthor() { return customAuthor; }
    public void setCustomAuthor(String customAuthor) { this.customAuthor = customAuthor; }

    public Boolean getIncludeReadingPosition() { return includeReadingPosition; }
    public void setIncludeReadingPosition(Boolean includeReadingPosition) { this.includeReadingPosition = includeReadingPosition; }

    public Boolean getIncludeCoverImage() { return includeCoverImage; }
    public void setIncludeCoverImage(Boolean includeCoverImage) { this.includeCoverImage = includeCoverImage; }

    public Boolean getIncludeMetadata() { return includeMetadata; }
    public void setIncludeMetadata(Boolean includeMetadata) { this.includeMetadata = includeMetadata; }

    public List<String> getCustomMetadata() { return customMetadata; }
    public void setCustomMetadata(List<String> customMetadata) { this.customMetadata = customMetadata; }

    public String getLanguage() { return language; }
    public void setLanguage(String language) { this.language = language; }

    public Boolean getSplitByChapters() { return splitByChapters; }
    public void setSplitByChapters(Boolean splitByChapters) { this.splitByChapters = splitByChapters; }

    public Integer getMaxWordsPerChapter() { return maxWordsPerChapter; }
    public void setMaxWordsPerChapter(Integer maxWordsPerChapter) { this.maxWordsPerChapter = maxWordsPerChapter; }
}
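For orientation, here is a minimal sketch of how a caller might populate this request before handing it to the export endpoint. The export endpoint itself and the service wiring are not part of this diff, so only the DTO usage below is grounded in the code above; the title and word-count values are illustrative.

```java
import com.storycove.dto.EPUBExportRequest;

import java.util.UUID;

public class EpubExportRequestExample {
    public static void main(String[] args) {
        // Build a request for a known story, overriding defaults where needed.
        EPUBExportRequest request = new EPUBExportRequest(UUID.randomUUID());
        request.setCustomTitle("My Story (annotated copy)");
        request.setIncludeReadingPosition(true);   // already the default
        request.setSplitByChapters(true);          // opt in to chapter splitting
        request.setMaxWordsPerChapter(5000);       // only meaningful when splitting

        System.out.println("Exporting story " + request.getStoryId()
                + " in language " + request.getLanguage());
    }
}
```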
backend/src/main/java/com/storycove/dto/EPUBImportRequest.java (new file, 133 lines)

package com.storycove.dto;

import jakarta.validation.constraints.NotNull;
import org.springframework.web.multipart.MultipartFile;

import java.util.List;
import java.util.UUID;

public class EPUBImportRequest {

    @NotNull(message = "EPUB file is required")
    private MultipartFile epubFile;

    private UUID authorId;
    private String authorName;
    private UUID seriesId;
    private String seriesName;
    private Integer seriesVolume;
    private List<String> tags;
    private Boolean preserveReadingPosition = true;
    private Boolean overwriteExisting = false;
    private Boolean createMissingAuthor = true;
    private Boolean createMissingSeries = true;
    private Boolean extractCover = true;

    public EPUBImportRequest() {}

    public MultipartFile getEpubFile() { return epubFile; }
    public void setEpubFile(MultipartFile epubFile) { this.epubFile = epubFile; }

    public UUID getAuthorId() { return authorId; }
    public void setAuthorId(UUID authorId) { this.authorId = authorId; }

    public String getAuthorName() { return authorName; }
    public void setAuthorName(String authorName) { this.authorName = authorName; }

    public UUID getSeriesId() { return seriesId; }
    public void setSeriesId(UUID seriesId) { this.seriesId = seriesId; }

    public String getSeriesName() { return seriesName; }
    public void setSeriesName(String seriesName) { this.seriesName = seriesName; }

    public Integer getSeriesVolume() { return seriesVolume; }
    public void setSeriesVolume(Integer seriesVolume) { this.seriesVolume = seriesVolume; }

    public List<String> getTags() { return tags; }
    public void setTags(List<String> tags) { this.tags = tags; }

    public Boolean getPreserveReadingPosition() { return preserveReadingPosition; }
    public void setPreserveReadingPosition(Boolean preserveReadingPosition) { this.preserveReadingPosition = preserveReadingPosition; }

    public Boolean getOverwriteExisting() { return overwriteExisting; }
    public void setOverwriteExisting(Boolean overwriteExisting) { this.overwriteExisting = overwriteExisting; }

    public Boolean getCreateMissingAuthor() { return createMissingAuthor; }
    public void setCreateMissingAuthor(Boolean createMissingAuthor) { this.createMissingAuthor = createMissingAuthor; }

    public Boolean getCreateMissingSeries() { return createMissingSeries; }
    public void setCreateMissingSeries(Boolean createMissingSeries) { this.createMissingSeries = createMissingSeries; }

    public Boolean getExtractCover() { return extractCover; }
    public void setExtractCover(Boolean extractCover) { this.extractCover = extractCover; }
}
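A rough sketch of filling this request in a test or script follows. MockMultipartFile comes from the spring-test module and stands in for a real browser upload; the sample path and tag values are assumptions, not part of this diff.

```java
import com.storycove.dto.EPUBImportRequest;
import org.springframework.mock.web.MockMultipartFile;

import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class EpubImportRequestExample {
    public static void main(String[] args) throws Exception {
        // Wrap an EPUB file from disk as a MultipartFile; in production this
        // would arrive as a real multipart upload from the frontend.
        byte[] bytes = Files.readAllBytes(Path.of("samples/example.epub"));
        MockMultipartFile epub = new MockMultipartFile(
                "epubFile", "example.epub", "application/epub+zip", bytes);

        EPUBImportRequest request = new EPUBImportRequest();
        request.setEpubFile(epub);
        request.setAuthorName("Unknown Author"); // resolved or created because createMissingAuthor defaults to true
        request.setTags(List.of("imported", "epub"));
        request.setExtractCover(true);           // already the default

        System.out.println("Importing " + request.getEpubFile().getOriginalFilename());
    }
}
```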
backend/src/main/java/com/storycove/dto/EPUBImportResponse.java (new file, 107 lines)

package com.storycove.dto;

import java.util.List;
import java.util.UUID;

public class EPUBImportResponse {

    private boolean success;
    private String message;
    private UUID storyId;
    private String storyTitle;
    private Integer totalChapters;
    private Integer wordCount;
    private ReadingPositionDto readingPosition;
    private List<String> warnings;
    private List<String> errors;

    public EPUBImportResponse() {}

    public EPUBImportResponse(boolean success, String message) {
        this.success = success;
        this.message = message;
    }

    public static EPUBImportResponse success(UUID storyId, String storyTitle) {
        EPUBImportResponse response = new EPUBImportResponse(true, "EPUB imported successfully");
        response.setStoryId(storyId);
        response.setStoryTitle(storyTitle);
        return response;
    }

    public static EPUBImportResponse error(String message) {
        return new EPUBImportResponse(false, message);
    }

    public boolean isSuccess() { return success; }
    public void setSuccess(boolean success) { this.success = success; }

    public String getMessage() { return message; }
    public void setMessage(String message) { this.message = message; }

    public UUID getStoryId() { return storyId; }
    public void setStoryId(UUID storyId) { this.storyId = storyId; }

    public String getStoryTitle() { return storyTitle; }
    public void setStoryTitle(String storyTitle) { this.storyTitle = storyTitle; }

    public Integer getTotalChapters() { return totalChapters; }
    public void setTotalChapters(Integer totalChapters) { this.totalChapters = totalChapters; }

    public Integer getWordCount() { return wordCount; }
    public void setWordCount(Integer wordCount) { this.wordCount = wordCount; }

    public ReadingPositionDto getReadingPosition() { return readingPosition; }
    public void setReadingPosition(ReadingPositionDto readingPosition) { this.readingPosition = readingPosition; }

    public List<String> getWarnings() { return warnings; }
    public void setWarnings(List<String> warnings) { this.warnings = warnings; }

    public List<String> getErrors() { return errors; }
    public void setErrors(List<String> errors) { this.errors = errors; }
}
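The two static factory methods are the intended way to report import outcomes. A minimal sketch, with a hypothetical helper and illustrative chapter and word counts:

```java
import com.storycove.dto.EPUBImportResponse;

import java.util.List;
import java.util.UUID;

public class EpubImportResponseExample {
    // Hypothetical helper showing how a service might report the outcome of an import.
    static EPUBImportResponse report(boolean parsed, UUID storyId, String title) {
        if (!parsed) {
            return EPUBImportResponse.error("EPUB could not be parsed");
        }
        EPUBImportResponse response = EPUBImportResponse.success(storyId, title);
        response.setTotalChapters(12);
        response.setWordCount(48_500);
        response.setWarnings(List.of("Embedded fonts were dropped"));
        return response;
    }

    public static void main(String[] args) {
        EPUBImportResponse ok = report(true, UUID.randomUUID(), "Example Story");
        System.out.println(ok.isSuccess() + ": " + ok.getMessage());
    }
}
```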
backend/src/main/java/com/storycove/dto/FacetCountDto.java (new file, 31 lines)

package com.storycove.dto;

public class FacetCountDto {

    private String value;
    private int count;

    public FacetCountDto() {}

    public FacetCountDto(String value, int count) {
        this.value = value;
        this.count = count;
    }

    // Getters and Setters
    public String getValue() { return value; }
    public void setValue(String value) { this.value = value; }

    public int getCount() { return count; }
    public void setCount(int count) { this.count = count; }
}
@@ -8,6 +8,7 @@ public class HtmlSanitizationConfigDto {
     private Map<String, List<String>> allowedAttributes;
     private List<String> allowedCssProperties;
     private Map<String, List<String>> removedAttributes;
+    private Map<String, Map<String, List<String>>> allowedProtocols;
     private String description;

     public HtmlSanitizationConfigDto() {}
@@ -44,6 +45,14 @@ public class HtmlSanitizationConfigDto {
         this.removedAttributes = removedAttributes;
     }

+    public Map<String, Map<String, List<String>>> getAllowedProtocols() {
+        return allowedProtocols;
+    }
+
+    public void setAllowedProtocols(Map<String, Map<String, List<String>>> allowedProtocols) {
+        this.allowedProtocols = allowedProtocols;
+    }
+
     public String getDescription() {
         return description;
     }
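The nested `Map<String, Map<String, List<String>>>` type is not self-explanatory. The diff only shows the field, so the shape below (element, then attribute, then allowed URL schemes) is an assumption about how the sanitizer config is likely keyed, shown purely as a sketch:

```java
import java.util.List;
import java.util.Map;

public class AllowedProtocolsExample {
    public static void main(String[] args) {
        // Assumed shape: element -> attribute -> allowed URL schemes.
        Map<String, Map<String, List<String>>> allowedProtocols = Map.of(
                "a", Map.of("href", List.of("http", "https", "mailto")),
                "img", Map.of("src", List.of("http", "https"))
        );
        allowedProtocols.forEach((tag, attrs) ->
                attrs.forEach((attr, schemes) ->
                        System.out.println(tag + "." + attr + " -> " + schemes)));
    }
}
```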
backend/src/main/java/com/storycove/dto/LibraryDto.java (new file, 61 lines)

package com.storycove.dto;

public class LibraryDto {
    private String id;
    private String name;
    private String description;
    private boolean isActive;
    private boolean isInitialized;

    // Constructors
    public LibraryDto() {}

    public LibraryDto(String id, String name, String description, boolean isActive, boolean isInitialized) {
        this.id = id;
        this.name = name;
        this.description = description;
        this.isActive = isActive;
        this.isInitialized = isInitialized;
    }

    // Getters and Setters
    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public String getDescription() { return description; }
    public void setDescription(String description) { this.description = description; }

    public boolean isActive() { return isActive; }
    public void setActive(boolean active) { isActive = active; }

    public boolean isInitialized() { return isInitialized; }
    public void setInitialized(boolean initialized) { isInitialized = initialized; }
}
(new file, 23 lines)

package com.storycove.dto;

import jakarta.validation.constraints.NotBlank;

public class ProcessContentImagesRequest {

    @NotBlank(message = "HTML content is required")
    private String htmlContent;

    public ProcessContentImagesRequest() {}

    public ProcessContentImagesRequest(String htmlContent) {
        this.htmlContent = htmlContent;
    }

    public String getHtmlContent() { return htmlContent; }
    public void setHtmlContent(String htmlContent) { this.htmlContent = htmlContent; }
}
backend/src/main/java/com/storycove/dto/ReadingPositionDto.java (new file, 124 lines)

package com.storycove.dto;

import java.time.LocalDateTime;
import java.util.UUID;

public class ReadingPositionDto {

    private UUID id;
    private UUID storyId;
    private Integer chapterIndex;
    private String chapterTitle;
    private Integer wordPosition;
    private Integer characterPosition;
    private Double percentageComplete;
    private String epubCfi;
    private String contextBefore;
    private String contextAfter;
    private LocalDateTime createdAt;
    private LocalDateTime updatedAt;

    public ReadingPositionDto() {}

    public ReadingPositionDto(UUID storyId, Integer chapterIndex, Integer wordPosition) {
        this.storyId = storyId;
        this.chapterIndex = chapterIndex;
        this.wordPosition = wordPosition;
    }

    public UUID getId() { return id; }
    public void setId(UUID id) { this.id = id; }

    public UUID getStoryId() { return storyId; }
    public void setStoryId(UUID storyId) { this.storyId = storyId; }

    public Integer getChapterIndex() { return chapterIndex; }
    public void setChapterIndex(Integer chapterIndex) { this.chapterIndex = chapterIndex; }

    public String getChapterTitle() { return chapterTitle; }
    public void setChapterTitle(String chapterTitle) { this.chapterTitle = chapterTitle; }

    public Integer getWordPosition() { return wordPosition; }
    public void setWordPosition(Integer wordPosition) { this.wordPosition = wordPosition; }

    public Integer getCharacterPosition() { return characterPosition; }
    public void setCharacterPosition(Integer characterPosition) { this.characterPosition = characterPosition; }

    public Double getPercentageComplete() { return percentageComplete; }
    public void setPercentageComplete(Double percentageComplete) { this.percentageComplete = percentageComplete; }

    public String getEpubCfi() { return epubCfi; }
    public void setEpubCfi(String epubCfi) { this.epubCfi = epubCfi; }

    public String getContextBefore() { return contextBefore; }
    public void setContextBefore(String contextBefore) { this.contextBefore = contextBefore; }

    public String getContextAfter() { return contextAfter; }
    public void setContextAfter(String contextAfter) { this.contextAfter = contextAfter; }

    public LocalDateTime getCreatedAt() { return createdAt; }
    public void setCreatedAt(LocalDateTime createdAt) { this.createdAt = createdAt; }

    public LocalDateTime getUpdatedAt() { return updatedAt; }
    public void setUpdatedAt(LocalDateTime updatedAt) { this.updatedAt = updatedAt; }
}
(new file, 23 lines)

package com.storycove.dto;

import jakarta.validation.constraints.Min;

public class ReadingProgressRequest {

    @Min(value = 0, message = "Reading position must be non-negative")
    private Integer position;

    public ReadingProgressRequest() {}

    public ReadingProgressRequest(Integer position) {
        this.position = position;
    }

    public Integer getPosition() { return position; }
    public void setPosition(Integer position) { this.position = position; }
}
(new file, 23 lines)

package com.storycove.dto;

import jakarta.validation.constraints.NotNull;

public class ReadingStatusRequest {

    @NotNull(message = "Reading status is required")
    private Boolean isRead;

    public ReadingStatusRequest() {}

    public ReadingStatusRequest(Boolean isRead) {
        this.isRead = isRead;
    }

    public Boolean getIsRead() { return isRead; }
    public void setIsRead(Boolean isRead) { this.isRead = isRead; }
}
@@ -1,6 +1,7 @@
 package com.storycove.dto;

 import java.util.List;
+import java.util.Map;

 public class SearchResultDto<T> {

@@ -10,6 +11,7 @@ public class SearchResultDto<T> {
     private int perPage;
     private String query;
     private long searchTimeMs;
+    private Map<String, List<FacetCountDto>> facets;

     public SearchResultDto() {}

@@ -22,6 +24,16 @@ public class SearchResultDto<T> {
         this.searchTimeMs = searchTimeMs;
     }

+    public SearchResultDto(List<T> results, long totalHits, int page, int perPage, String query, long searchTimeMs, Map<String, List<FacetCountDto>> facets) {
+        this.results = results;
+        this.totalHits = totalHits;
+        this.page = page;
+        this.perPage = perPage;
+        this.query = query;
+        this.searchTimeMs = searchTimeMs;
+        this.facets = facets;
+    }
+
     // Getters and Setters
     public List<T> getResults() {
         return results;
@@ -70,4 +82,12 @@ public class SearchResultDto<T> {
     public void setSearchTimeMs(long searchTimeMs) {
         this.searchTimeMs = searchTimeMs;
     }
+
+    public Map<String, List<FacetCountDto>> getFacets() {
+        return facets;
+    }
+
+    public void setFacets(Map<String, List<FacetCountDto>> facets) {
+        this.facets = facets;
+    }
 }
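A brief sketch of the new seven-argument constructor carrying facet counts alongside the usual paging data; the facet keys and counts here are illustrative, not values produced by the search service.

```java
import com.storycove.dto.FacetCountDto;
import com.storycove.dto.SearchResultDto;
import com.storycove.dto.StorySearchDto;

import java.util.List;
import java.util.Map;

public class FacetedSearchResultExample {
    public static void main(String[] args) {
        // Facet counts as they might come back from the search engine (illustrative values).
        Map<String, List<FacetCountDto>> facets = Map.of(
                "tags", List.of(new FacetCountDto("fantasy", 42), new FacetCountDto("sci-fi", 17)),
                "authorName", List.of(new FacetCountDto("Jane Doe", 9)));

        // Facets ride along with the paging metadata in the new constructor.
        SearchResultDto<StorySearchDto> page = new SearchResultDto<>(
                List.of(), 59, 1, 20, "dragons", 12, facets);

        page.getFacets().get("tags")
                .forEach(f -> System.out.println(f.getValue() + " (" + f.getCount() + ")"));
    }
}
```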
@@ -21,13 +21,18 @@ public class StoryDto {
     private String description;

     private String contentHtml;
-    private String contentPlain;
+    // contentPlain removed for performance - use StoryReadingDto when content is needed
     private String sourceUrl;
     private String coverPath;
     private Integer wordCount;
     private Integer rating;
     private Integer volume;

+    // Reading progress fields
+    private Boolean isRead;
+    private Integer readingPosition;
+    private LocalDateTime lastReadAt;
+
     // Related entities as simple references
     private UUID authorId;
     private String authorName;
@@ -85,13 +90,6 @@ public class StoryDto {
         this.contentHtml = contentHtml;
     }

-    public String getContentPlain() {
-        return contentPlain;
-    }
-
-    public void setContentPlain(String contentPlain) {
-        this.contentPlain = contentPlain;
-    }
-
     public String getSourceUrl() {
         return sourceUrl;
@@ -133,6 +131,30 @@ public class StoryDto {
         this.volume = volume;
     }

+    public Boolean getIsRead() {
+        return isRead;
+    }
+
+    public void setIsRead(Boolean isRead) {
+        this.isRead = isRead;
+    }
+
+    public Integer getReadingPosition() {
+        return readingPosition;
+    }
+
+    public void setReadingPosition(Integer readingPosition) {
+        this.readingPosition = readingPosition;
+    }
+
+    public LocalDateTime getLastReadAt() {
+        return lastReadAt;
+    }
+
+    public void setLastReadAt(LocalDateTime lastReadAt) {
+        this.lastReadAt = lastReadAt;
+    }
+
     public UUID getAuthorId() {
         return authorId;
     }
backend/src/main/java/com/storycove/dto/StoryReadingDto.java (new file, 202 lines)

package com.storycove.dto;

import java.time.LocalDateTime;
import java.util.List;
import java.util.UUID;

/**
 * Story DTO specifically for reading view.
 * Contains contentHtml but excludes contentPlain for performance.
 */
public class StoryReadingDto {

    private UUID id;
    private String title;
    private String summary;
    private String description;
    private String contentHtml; // For reading - includes HTML
    // contentPlain excluded for performance
    private String sourceUrl;
    private String coverPath;
    private Integer wordCount;
    private Integer rating;
    private Integer volume;

    // Reading progress fields
    private Boolean isRead;
    private Integer readingPosition;
    private LocalDateTime lastReadAt;

    // Related entities as simple references
    private UUID authorId;
    private String authorName;
    private UUID seriesId;
    private String seriesName;
    private List<TagDto> tags;

    private LocalDateTime createdAt;
    private LocalDateTime updatedAt;

    public StoryReadingDto() {}

    // Getters and Setters
    public UUID getId() { return id; }
    public void setId(UUID id) { this.id = id; }

    public String getTitle() { return title; }
    public void setTitle(String title) { this.title = title; }

    public String getSummary() { return summary; }
    public void setSummary(String summary) { this.summary = summary; }

    public String getDescription() { return description; }
    public void setDescription(String description) { this.description = description; }

    public String getContentHtml() { return contentHtml; }
    public void setContentHtml(String contentHtml) { this.contentHtml = contentHtml; }

    public String getSourceUrl() { return sourceUrl; }
    public void setSourceUrl(String sourceUrl) { this.sourceUrl = sourceUrl; }

    public String getCoverPath() { return coverPath; }
    public void setCoverPath(String coverPath) { this.coverPath = coverPath; }

    public Integer getWordCount() { return wordCount; }
    public void setWordCount(Integer wordCount) { this.wordCount = wordCount; }

    public Integer getRating() { return rating; }
    public void setRating(Integer rating) { this.rating = rating; }

    public Integer getVolume() { return volume; }
    public void setVolume(Integer volume) { this.volume = volume; }

    public Boolean getIsRead() { return isRead; }
    public void setIsRead(Boolean isRead) { this.isRead = isRead; }

    public Integer getReadingPosition() { return readingPosition; }
    public void setReadingPosition(Integer readingPosition) { this.readingPosition = readingPosition; }

    public LocalDateTime getLastReadAt() { return lastReadAt; }
    public void setLastReadAt(LocalDateTime lastReadAt) { this.lastReadAt = lastReadAt; }

    public UUID getAuthorId() { return authorId; }
    public void setAuthorId(UUID authorId) { this.authorId = authorId; }

    public String getAuthorName() { return authorName; }
    public void setAuthorName(String authorName) { this.authorName = authorName; }

    public UUID getSeriesId() { return seriesId; }
    public void setSeriesId(UUID seriesId) { this.seriesId = seriesId; }

    public String getSeriesName() { return seriesName; }
    public void setSeriesName(String seriesName) { this.seriesName = seriesName; }

    public List<TagDto> getTags() { return tags; }
    public void setTags(List<TagDto> tags) { this.tags = tags; }

    public LocalDateTime getCreatedAt() { return createdAt; }
    public void setCreatedAt(LocalDateTime createdAt) { this.createdAt = createdAt; }

    public LocalDateTime getUpdatedAt() { return updatedAt; }
    public void setUpdatedAt(LocalDateTime updatedAt) { this.updatedAt = updatedAt; }
}
@@ -9,13 +9,16 @@ public class StorySearchDto {
     private UUID id;
     private String title;
     private String description;
-    private String contentPlain;
     private String sourceUrl;
     private String coverPath;
     private Integer wordCount;
     private Integer rating;
     private Integer volume;

+    // Reading status
+    private Boolean isRead;
+    private LocalDateTime lastReadAt;
+
     // Author info
     private UUID authorId;
     private String authorName;
@@ -61,13 +64,6 @@ public class StorySearchDto {
         this.description = description;
     }

-    public String getContentPlain() {
-        return contentPlain;
-    }
-
-    public void setContentPlain(String contentPlain) {
-        this.contentPlain = contentPlain;
-    }
-
     public String getSourceUrl() {
         return sourceUrl;
@@ -109,6 +105,22 @@ public class StorySearchDto {
         this.volume = volume;
     }

+    public Boolean getIsRead() {
+        return isRead;
+    }
+
+    public void setIsRead(Boolean isRead) {
+        this.isRead = isRead;
+    }
+
+    public LocalDateTime getLastReadAt() {
+        return lastReadAt;
+    }
+
+    public void setLastReadAt(LocalDateTime lastReadAt) {
+        this.lastReadAt = lastReadAt;
+    }
+
     public UUID getAuthorId() {
         return authorId;
     }
@@ -20,6 +20,11 @@ public class StorySummaryDto {
     private Integer rating;
     private Integer volume;

+    // Reading progress fields
+    private Boolean isRead;
+    private Integer readingPosition;
+    private LocalDateTime lastReadAt;
+
     // Related entities as simple references
     private UUID authorId;
     private String authorName;
@@ -106,6 +111,30 @@ public class StorySummaryDto {
         this.volume = volume;
     }

+    public Boolean getIsRead() {
+        return isRead;
+    }
+
+    public void setIsRead(Boolean isRead) {
+        this.isRead = isRead;
+    }
+
+    public Integer getReadingPosition() {
+        return readingPosition;
+    }
+
+    public void setReadingPosition(Integer readingPosition) {
+        this.readingPosition = readingPosition;
+    }
+
+    public LocalDateTime getLastReadAt() {
+        return lastReadAt;
+    }
+
+    public void setLastReadAt(LocalDateTime lastReadAt) {
+        this.lastReadAt = lastReadAt;
+    }
+
     public UUID getAuthorId() {
         return authorId;
     }
backend/src/main/java/com/storycove/dto/TagAliasDto.java (new file, 77 lines)

package com.storycove.dto;

import jakarta.validation.constraints.NotBlank;
import jakarta.validation.constraints.Size;

import java.time.LocalDateTime;
import java.util.UUID;

public class TagAliasDto {

    private UUID id;

    @NotBlank(message = "Alias name is required")
    @Size(max = 100, message = "Alias name must not exceed 100 characters")
    private String aliasName;

    private UUID canonicalTagId;
    private String canonicalTagName; // For convenience in frontend
    private Boolean createdFromMerge;
    private LocalDateTime createdAt;

    public TagAliasDto() {}

    public TagAliasDto(String aliasName, UUID canonicalTagId) {
        this.aliasName = aliasName;
        this.canonicalTagId = canonicalTagId;
    }

    // Getters and Setters
    public UUID getId() { return id; }
    public void setId(UUID id) { this.id = id; }

    public String getAliasName() { return aliasName; }
    public void setAliasName(String aliasName) { this.aliasName = aliasName; }

    public UUID getCanonicalTagId() { return canonicalTagId; }
    public void setCanonicalTagId(UUID canonicalTagId) { this.canonicalTagId = canonicalTagId; }

    public String getCanonicalTagName() { return canonicalTagName; }
    public void setCanonicalTagName(String canonicalTagName) { this.canonicalTagName = canonicalTagName; }

    public Boolean getCreatedFromMerge() { return createdFromMerge; }
    public void setCreatedFromMerge(Boolean createdFromMerge) { this.createdFromMerge = createdFromMerge; }

    public LocalDateTime getCreatedAt() { return createdAt; }
    public void setCreatedAt(LocalDateTime createdAt) { this.createdAt = createdAt; }
}
@@ -4,6 +4,7 @@ import jakarta.validation.constraints.NotBlank;
 import jakarta.validation.constraints.Size;

 import java.time.LocalDateTime;
+import java.util.List;
 import java.util.UUID;

 public class TagDto {
@@ -14,7 +15,16 @@ public class TagDto {
     @Size(max = 100, message = "Tag name must not exceed 100 characters")
     private String name;

+    @Size(max = 7, message = "Color must be a valid hex color code")
+    private String color;
+
+    @Size(max = 500, message = "Description must not exceed 500 characters")
+    private String description;
+
     private Integer storyCount;
+    private Integer collectionCount;
+    private Integer aliasCount;
+    private List<TagAliasDto> aliases;
     private LocalDateTime createdAt;
     private LocalDateTime updatedAt;

@@ -41,6 +51,22 @@ public class TagDto {
         this.name = name;
     }

+    public String getColor() {
+        return color;
+    }
+
+    public void setColor(String color) {
+        this.color = color;
+    }
+
+    public String getDescription() {
+        return description;
+    }
+
+    public void setDescription(String description) {
+        this.description = description;
+    }
+
     public Integer getStoryCount() {
         return storyCount;
     }
@@ -49,6 +75,30 @@ public class TagDto {
         this.storyCount = storyCount;
     }

+    public Integer getCollectionCount() {
+        return collectionCount;
+    }
+
+    public void setCollectionCount(Integer collectionCount) {
+        this.collectionCount = collectionCount;
+    }
+
+    public Integer getAliasCount() {
+        return aliasCount;
+    }
+
+    public void setAliasCount(Integer aliasCount) {
+        this.aliasCount = aliasCount;
+    }
+
+    public List<TagAliasDto> getAliases() {
+        return aliases;
+    }
+
+    public void setAliases(List<TagAliasDto> aliases) {
+        this.aliases = aliases;
+    }
+
     public LocalDateTime getCreatedAt() {
         return createdAt;
     }
@@ -52,6 +52,10 @@ public class Collection {
     )
     private Set<Tag> tags = new HashSet<>();

+    // Transient field for search results - tag names only to avoid lazy loading issues
+    @Transient
+    private List<String> tagNames;
+
     @CreationTimestamp
     @Column(name = "created_at", nullable = false, updatable = false)
     private LocalDateTime createdAt;
@@ -192,6 +196,14 @@ public class Collection {
         this.tags = tags;
     }

+    public List<String> getTagNames() {
+        return tagNames;
+    }
+
+    public void setTagNames(List<String> tagNames) {
+        this.tagNames = tagNames;
+    }
+
     public LocalDateTime getCreatedAt() {
         return createdAt;
     }
backend/src/main/java/com/storycove/entity/Library.java (new file, 93 lines)

package com.storycove.entity;

public class Library {
    private String id;
    private String name;
    private String description;
    private String passwordHash;
    private String dbName;
    private String typesenseCollection;
    private String imagePath;
    private boolean initialized;

    // Constructors
    public Library() {}

    public Library(String id, String name, String description, String passwordHash, String dbName) {
        this.id = id;
        this.name = name;
        this.description = description;
        this.passwordHash = passwordHash;
        this.dbName = dbName;
        this.typesenseCollection = "stories_" + id;
        this.imagePath = "/images/" + id;
        this.initialized = false;
    }

    // Getters and Setters
    public String getId() { return id; }

    public void setId(String id) {
        this.id = id;
        this.typesenseCollection = "stories_" + id;
        this.imagePath = "/images/" + id;
    }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public String getDescription() { return description; }
    public void setDescription(String description) { this.description = description; }

    public String getPasswordHash() { return passwordHash; }
    public void setPasswordHash(String passwordHash) { this.passwordHash = passwordHash; }

    public String getDbName() { return dbName; }
    public void setDbName(String dbName) { this.dbName = dbName; }

    public String getTypesenseCollection() { return typesenseCollection; }
    public void setTypesenseCollection(String typesenseCollection) { this.typesenseCollection = typesenseCollection; }

    public String getImagePath() { return imagePath; }
    public void setImagePath(String imagePath) { this.imagePath = imagePath; }

    public boolean isInitialized() { return initialized; }
    public void setInitialized(boolean initialized) { this.initialized = initialized; }
}
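The library id drives both the per-library Typesense collection name and the image directory, in the constructor and again in setId. A minimal sketch; the id, names, password hash, and database name below are placeholders, not values from the repository.

```java
import com.storycove.entity.Library;

public class LibraryNamingExample {
    public static void main(String[] args) {
        // Placeholder values; only the naming scheme is taken from the entity above.
        Library library = new Library("fanfic", "Fan Fiction", "Imported fan fiction",
                "$2a$10$hashedPasswordPlaceholder", "storycove_fanfic");

        System.out.println(library.getTypesenseCollection()); // stories_fanfic
        System.out.println(library.getImagePath());           // /images/fanfic

        // Re-pointing the id keeps the derived values in sync.
        library.setId("archive");
        System.out.println(library.getTypesenseCollection()); // stories_archive
    }
}
```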
backend/src/main/java/com/storycove/entity/ReadingPosition.java (new file, 230 lines)

package com.storycove.entity;

import jakarta.persistence.*;
import jakarta.validation.constraints.NotNull;
import org.hibernate.annotations.CreationTimestamp;
import org.hibernate.annotations.UpdateTimestamp;
import com.fasterxml.jackson.annotation.JsonBackReference;

import java.time.LocalDateTime;
import java.util.UUID;

@Entity
@Table(name = "reading_positions", indexes = {
    @Index(name = "idx_reading_position_story", columnList = "story_id")
})
public class ReadingPosition {

    @Id
    @GeneratedValue(strategy = GenerationType.UUID)
    private UUID id;

    @NotNull
    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "story_id", nullable = false)
    @JsonBackReference("story-reading-positions")
    private Story story;

    @Column(name = "chapter_index")
    private Integer chapterIndex;

    @Column(name = "chapter_title")
    private String chapterTitle;

    @Column(name = "word_position")
    private Integer wordPosition;

    @Column(name = "character_position")
    private Integer characterPosition;

    @Column(name = "percentage_complete")
    private Double percentageComplete;

    @Column(name = "epub_cfi", columnDefinition = "TEXT")
    private String epubCfi;

    @Column(name = "context_before", length = 500)
    private String contextBefore;

    @Column(name = "context_after", length = 500)
    private String contextAfter;

    @CreationTimestamp
    @Column(name = "created_at", nullable = false, updatable = false)
    private LocalDateTime createdAt;

    @UpdateTimestamp
    @Column(name = "updated_at", nullable = false)
    private LocalDateTime updatedAt;

    public ReadingPosition() {}

    public ReadingPosition(Story story) {
        this.story = story;
        this.chapterIndex = 0;
        this.wordPosition = 0;
        this.characterPosition = 0;
        this.percentageComplete = 0.0;
    }

    public ReadingPosition(Story story, Integer chapterIndex, Integer wordPosition) {
        this.story = story;
        this.chapterIndex = chapterIndex;
        this.wordPosition = wordPosition;
        this.characterPosition = 0;
        this.percentageComplete = 0.0;
    }

    public void updatePosition(Integer chapterIndex, Integer wordPosition, Integer characterPosition) {
        this.chapterIndex = chapterIndex;
        this.wordPosition = wordPosition;
        this.characterPosition = characterPosition;
        calculatePercentageComplete();
    }

    public void updatePositionWithCfi(String epubCfi, Integer chapterIndex, Integer wordPosition) {
        this.epubCfi = epubCfi;
        this.chapterIndex = chapterIndex;
        this.wordPosition = wordPosition;
        calculatePercentageComplete();
    }

    private void calculatePercentageComplete() {
        if (story != null && story.getWordCount() != null && story.getWordCount() > 0) {
            int totalWords = story.getWordCount();
            int currentPosition = (chapterIndex != null ? chapterIndex * 1000 : 0) +
                                  (wordPosition != null ? wordPosition : 0);
            this.percentageComplete = Math.min(100.0, (double) currentPosition / totalWords * 100);
        }
    }

    public boolean isAtBeginning() {
        return (chapterIndex == null || chapterIndex == 0) &&
               (wordPosition == null || wordPosition == 0);
    }

    public boolean isCompleted() {
        return percentageComplete != null && percentageComplete >= 95.0;
    }

    // Getters and Setters
    public UUID getId() { return id; }
    public void setId(UUID id) { this.id = id; }

    public Story getStory() { return story; }
    public void setStory(Story story) { this.story = story; }

    public Integer getChapterIndex() { return chapterIndex; }
    public void setChapterIndex(Integer chapterIndex) { this.chapterIndex = chapterIndex; }

    public String getChapterTitle() { return chapterTitle; }
    public void setChapterTitle(String chapterTitle) { this.chapterTitle = chapterTitle; }

    public Integer getWordPosition() { return wordPosition; }
    public void setWordPosition(Integer wordPosition) { this.wordPosition = wordPosition; }

    public Integer getCharacterPosition() { return characterPosition; }
    public void setCharacterPosition(Integer characterPosition) { this.characterPosition = characterPosition; }

    public Double getPercentageComplete() { return percentageComplete; }
    public void setPercentageComplete(Double percentageComplete) { this.percentageComplete = percentageComplete; }

    public String getEpubCfi() { return epubCfi; }
    public void setEpubCfi(String epubCfi) { this.epubCfi = epubCfi; }

    public String getContextBefore() { return contextBefore; }
    public void setContextBefore(String contextBefore) { this.contextBefore = contextBefore; }

    public String getContextAfter() { return contextAfter; }
    public void setContextAfter(String contextAfter) { this.contextAfter = contextAfter; }

    public LocalDateTime getCreatedAt() { return createdAt; }
    public void setCreatedAt(LocalDateTime createdAt) { this.createdAt = createdAt; }

    public LocalDateTime getUpdatedAt() { return updatedAt; }
    public void setUpdatedAt(LocalDateTime updatedAt) { this.updatedAt = updatedAt; }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof ReadingPosition)) return false;
        ReadingPosition that = (ReadingPosition) o;
        return id != null && id.equals(that.id);
    }

    @Override
    public int hashCode() {
        return getClass().hashCode();
    }

    @Override
    public String toString() {
        return "ReadingPosition{" +
                "id=" + id +
                ", storyId=" + (story != null ? story.getId() : null) +
                ", chapterIndex=" + chapterIndex +
                ", wordPosition=" + wordPosition +
                ", percentageComplete=" + percentageComplete +
                '}';
    }
}
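The percentage heuristic weighs each chapter as 1000 words and caps at 100%. A minimal in-memory sketch of the update methods; it assumes Story's usual no-argument constructor and wordCount setter (not shown in this excerpt), and the CFI string is only an illustrative value.

```java
import com.storycove.entity.ReadingPosition;
import com.storycove.entity.Story;

public class ReadingPositionExample {
    public static void main(String[] args) {
        // Minimal setup; in the application these objects are managed by JPA.
        Story story = new Story();
        story.setWordCount(20_000);

        ReadingPosition position = new ReadingPosition(story);
        System.out.println(position.isAtBeginning()); // true

        // Chapter 12, word 500: (12 * 1000 + 500) / 20000 = 62.5%.
        position.updatePositionWithCfi("epubcfi(/6/26!/4/2/1:0)", 12, 500);
        System.out.println(position.getPercentageComplete()); // 62.5
        System.out.println(position.isCompleted());           // false (threshold is 95%)
    }
}
```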
@@ -55,6 +55,15 @@ public class Story {
     @Column(name = "volume")
     private Integer volume;

+    @Column(name = "is_read")
+    private Boolean isRead = false;
+
+    @Column(name = "reading_position")
+    private Integer readingPosition = 0;
+
+    @Column(name = "last_read_at")
+    private LocalDateTime lastReadAt;
+
     @ManyToOne(fetch = FetchType.LAZY)
     @JoinColumn(name = "author_id")
     @JsonBackReference("author-stories")
@@ -212,6 +221,30 @@ public class Story {
         this.volume = volume;
     }

+    public Boolean getIsRead() {
+        return isRead;
+    }
+
+    public void setIsRead(Boolean isRead) {
+        this.isRead = isRead;
+    }
+
+    public Integer getReadingPosition() {
+        return readingPosition;
+    }
+
+    public void setReadingPosition(Integer readingPosition) {
+        this.readingPosition = readingPosition;
+    }
+
+    public LocalDateTime getLastReadAt() {
+        return lastReadAt;
+    }
+
+    public void setLastReadAt(LocalDateTime lastReadAt) {
+        this.lastReadAt = lastReadAt;
+    }
+
     public Author getAuthor() {
         return author;
     }
@@ -252,6 +285,37 @@ public class Story {
         this.updatedAt = updatedAt;
     }

+    /**
+     * Updates the reading progress and timestamp
+     */
+    public void updateReadingProgress(Integer position) {
+        this.readingPosition = position;
+        this.lastReadAt = LocalDateTime.now();
+    }
+
+    /**
+     * Marks the story as read and updates the reading position to the end
+     */
+    public void markAsRead() {
+        this.isRead = true;
+        this.lastReadAt = LocalDateTime.now();
+        // Set reading position to the end of content if available
+        if (contentPlain != null) {
+            this.readingPosition = contentPlain.length();
+        } else if (contentHtml != null) {
+            this.readingPosition = contentHtml.length();
+        }
+    }
+
+    /**
+     * Marks the story as unread and resets reading position
+     */
+    public void markAsUnread() {
+        this.isRead = false;
+        this.readingPosition = 0;
+        this.lastReadAt = null;
+    }
+
     @Override
     public boolean equals(Object o) {
         if (this == o) return true;
@@ -272,6 +336,8 @@ public class Story {
                 ", title='" + title + '\'' +
                 ", wordCount=" + wordCount +
                 ", rating=" + rating +
+                ", isRead=" + isRead +
+                ", readingPosition=" + readingPosition +
                 '}';
     }
 }
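A short sketch of the new reading-progress helpers on the entity. It assumes Story's no-argument constructor and the existing contentPlain setter, neither of which appears in this hunk; the sample text is arbitrary.

```java
import com.storycove.entity.Story;

public class StoryReadingProgressExample {
    public static void main(String[] args) {
        // Minimal in-memory setup; the content setter is assumed from the existing entity.
        Story story = new Story();
        story.setContentPlain("Once upon a time... the end.");

        story.updateReadingProgress(12);
        System.out.println(story.getReadingPosition()); // 12
        System.out.println(story.getIsRead());          // still false

        // markAsRead() jumps the position to the end of the plain-text content.
        story.markAsRead();
        System.out.println(story.getIsRead());          // true
        System.out.println(story.getReadingPosition()); // 28 (length of the plain content)

        story.markAsUnread();
        System.out.println(story.getReadingPosition()); // 0
    }
}
```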
@@ -5,6 +5,7 @@ import jakarta.validation.constraints.NotBlank;
 import jakarta.validation.constraints.Size;
 import org.hibernate.annotations.CreationTimestamp;
 import com.fasterxml.jackson.annotation.JsonBackReference;
+import com.fasterxml.jackson.annotation.JsonManagedReference;

 import java.time.LocalDateTime;
 import java.util.HashSet;
@@ -24,11 +25,27 @@ public class Tag {
     @Column(nullable = false, unique = true)
     private String name;

+    @Size(max = 7, message = "Color must be a valid hex color code")
+    @Column(length = 7)
+    private String color; // hex color like #3B82F6
+
+    @Size(max = 500, message = "Description must not exceed 500 characters")
+    @Column(length = 500)
+    private String description;
+
     @ManyToMany(mappedBy = "tags")
     @JsonBackReference("story-tags")
     private Set<Story> stories = new HashSet<>();

+    @ManyToMany(mappedBy = "tags")
+    @JsonBackReference("collection-tags")
+    private Set<Collection> collections = new HashSet<>();
+
+    @OneToMany(mappedBy = "canonicalTag", cascade = CascadeType.ALL, orphanRemoval = true)
+    @JsonManagedReference("tag-aliases")
+    private Set<TagAlias> aliases = new HashSet<>();
+
     @CreationTimestamp
     @Column(name = "created_at", nullable = false, updatable = false)
     private LocalDateTime createdAt;
@@ -39,6 +56,12 @@ public class Tag {
         this.name = name;
     }

+    public Tag(String name, String color, String description) {
+        this.name = name;
+        this.color = color;
+        this.description = description;
+    }
+
     // Getters and Setters
@@ -58,6 +81,22 @@ public class Tag {
         this.name = name;
     }

+    public String getColor() {
+        return color;
+    }
+
+    public void setColor(String color) {
+        this.color = color;
+    }
+
+    public String getDescription() {
+        return description;
+    }
+
+    public void setDescription(String description) {
+        this.description = description;
+    }
+
     public Set<Story> getStories() {
         return stories;
@@ -67,6 +106,22 @@ public class Tag {
         this.stories = stories;
     }

+    public Set<Collection> getCollections() {
+        return collections;
+    }
+
+    public void setCollections(Set<Collection> collections) {
+        this.collections = collections;
+    }
+
+    public Set<TagAlias> getAliases() {
+        return aliases;
+    }
+
+    public void setAliases(Set<TagAlias> aliases) {
+        this.aliases = aliases;
+    }
+
     public LocalDateTime getCreatedAt() {
         return createdAt;
     }
backend/src/main/java/com/storycove/entity/TagAlias.java — new file (113 lines)
@@ -0,0 +1,113 @@
package com.storycove.entity;

import jakarta.persistence.*;
import jakarta.validation.constraints.NotBlank;
import jakarta.validation.constraints.Size;
import org.hibernate.annotations.CreationTimestamp;
import com.fasterxml.jackson.annotation.JsonManagedReference;

import java.time.LocalDateTime;
import java.util.UUID;

@Entity
@Table(name = "tag_aliases")
public class TagAlias {

    @Id
    @GeneratedValue(strategy = GenerationType.UUID)
    private UUID id;

    @NotBlank(message = "Alias name is required")
    @Size(max = 100, message = "Alias name must not exceed 100 characters")
    @Column(name = "alias_name", nullable = false, unique = true)
    private String aliasName;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "canonical_tag_id", nullable = false)
    @JsonManagedReference("tag-aliases")
    private Tag canonicalTag;

    @Column(name = "created_from_merge", nullable = false)
    private Boolean createdFromMerge = false;

    @CreationTimestamp
    @Column(name = "created_at", nullable = false, updatable = false)
    private LocalDateTime createdAt;

    public TagAlias() {}

    public TagAlias(String aliasName, Tag canonicalTag) {
        this.aliasName = aliasName;
        this.canonicalTag = canonicalTag;
    }

    public TagAlias(String aliasName, Tag canonicalTag, Boolean createdFromMerge) {
        this.aliasName = aliasName;
        this.canonicalTag = canonicalTag;
        this.createdFromMerge = createdFromMerge;
    }

    // Getters and Setters
    public UUID getId() {
        return id;
    }

    public void setId(UUID id) {
        this.id = id;
    }

    public String getAliasName() {
        return aliasName;
    }

    public void setAliasName(String aliasName) {
        this.aliasName = aliasName;
    }

    public Tag getCanonicalTag() {
        return canonicalTag;
    }

    public void setCanonicalTag(Tag canonicalTag) {
        this.canonicalTag = canonicalTag;
    }

    public Boolean getCreatedFromMerge() {
        return createdFromMerge;
    }

    public void setCreatedFromMerge(Boolean createdFromMerge) {
        this.createdFromMerge = createdFromMerge;
    }

    public LocalDateTime getCreatedAt() {
        return createdAt;
    }

    public void setCreatedAt(LocalDateTime createdAt) {
        this.createdAt = createdAt;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof TagAlias)) return false;
        TagAlias tagAlias = (TagAlias) o;
        return id != null && id.equals(tagAlias.id);
    }

    @Override
    public int hashCode() {
        return getClass().hashCode();
    }

    @Override
    public String toString() {
        return "TagAlias{" +
                "id=" + id +
                ", aliasName='" + aliasName + '\'' +
                ", canonicalTag=" + (canonicalTag != null ? canonicalTag.getName() : null) +
                ", createdFromMerge=" + createdFromMerge +
                '}';
    }
}
@@ -52,4 +52,5 @@ public interface AuthorRepository extends JpaRepository<Author, UUID> {
 
     @Query(value = "SELECT author_rating FROM authors WHERE id = :id", nativeQuery = true)
     Integer findAuthorRatingById(@Param("id") UUID id);
+
 }
@@ -45,4 +45,11 @@ public interface CollectionRepository extends JpaRepository<Collection, UUID> {
      */
    @Query("SELECT c FROM Collection c WHERE c.isArchived = false ORDER BY c.updatedAt DESC")
    List<Collection> findAllActiveCollections();
+
+    /**
+     * Find all collections with tags for reindexing operations
+     */
+    @Query("SELECT c FROM Collection c LEFT JOIN FETCH c.tags ORDER BY c.updatedAt DESC")
+    List<Collection> findAllWithTags();
+
 }
New file: ReadingPositionRepository.java (57 lines)
@@ -0,0 +1,57 @@
package com.storycove.repository;

import com.storycove.entity.ReadingPosition;
import com.storycove.entity.Story;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.query.Param;
import org.springframework.stereotype.Repository;

import java.time.LocalDateTime;
import java.util.List;
import java.util.Optional;
import java.util.UUID;

@Repository
public interface ReadingPositionRepository extends JpaRepository<ReadingPosition, UUID> {

    Optional<ReadingPosition> findByStoryId(UUID storyId);

    Optional<ReadingPosition> findByStory(Story story);

    List<ReadingPosition> findByStoryIdIn(List<UUID> storyIds);

    @Query("SELECT rp FROM ReadingPosition rp WHERE rp.story.id = :storyId ORDER BY rp.updatedAt DESC")
    List<ReadingPosition> findByStoryIdOrderByUpdatedAtDesc(@Param("storyId") UUID storyId);

    @Query("SELECT rp FROM ReadingPosition rp WHERE rp.percentageComplete >= :minPercentage")
    List<ReadingPosition> findByMinimumPercentageComplete(@Param("minPercentage") Double minPercentage);

    @Query("SELECT rp FROM ReadingPosition rp WHERE rp.percentageComplete >= 95.0")
    List<ReadingPosition> findCompletedReadings();

    @Query("SELECT rp FROM ReadingPosition rp WHERE rp.percentageComplete > 0 AND rp.percentageComplete < 95.0")
    List<ReadingPosition> findInProgressReadings();

    @Query("SELECT rp FROM ReadingPosition rp WHERE rp.updatedAt >= :since ORDER BY rp.updatedAt DESC")
    List<ReadingPosition> findRecentlyUpdated(@Param("since") LocalDateTime since);

    @Query("SELECT rp FROM ReadingPosition rp ORDER BY rp.updatedAt DESC")
    List<ReadingPosition> findAllOrderByUpdatedAtDesc();

    @Query("SELECT COUNT(rp) FROM ReadingPosition rp WHERE rp.percentageComplete >= 95.0")
    long countCompletedReadings();

    @Query("SELECT COUNT(rp) FROM ReadingPosition rp WHERE rp.percentageComplete > 0 AND rp.percentageComplete < 95.0")
    long countInProgressReadings();

    @Query("SELECT AVG(rp.percentageComplete) FROM ReadingPosition rp WHERE rp.percentageComplete > 0")
    Double findAverageReadingProgress();

    @Query("SELECT rp FROM ReadingPosition rp WHERE rp.epubCfi IS NOT NULL")
    List<ReadingPosition> findPositionsWithEpubCfi();

    boolean existsByStoryId(UUID storyId);

    void deleteByStoryId(UUID storyId);
}
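A minimal upsert sketch (an assumption about intended usage, not code from this branch); the `ReadingPosition` constructor and setters are assumed based on the fields referenced elsewhere in this change set (`epubCfi`, `percentageComplete`, `chapterIndex`, `wordPosition`).

```java
// Hypothetical service method that creates or updates a reading position.
public ReadingPosition savePosition(UUID storyId, String epubCfi, double percentComplete) {
    ReadingPosition position = readingPositionRepository.findByStoryId(storyId)
            .orElseGet(ReadingPosition::new);         // story association omitted for brevity
    position.setEpubCfi(epubCfi);                     // assumed setter
    position.setPercentageComplete(percentComplete);  // assumed setter
    return readingPositionRepository.save(position);
}
```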
@@ -114,4 +114,130 @@ public interface StoryRepository extends JpaRepository<Story, UUID> {
            "LEFT JOIN FETCH s.series " +
            "LEFT JOIN FETCH s.tags")
     List<Story> findAllWithAssociations();
+
+    @Query("SELECT s FROM Story s WHERE UPPER(s.title) = UPPER(:title) AND UPPER(s.author.name) = UPPER(:authorName)")
+    List<Story> findByTitleAndAuthorNameIgnoreCase(@Param("title") String title, @Param("authorName") String authorName);
+
+    /**
+     * Count all stories for random selection (no filters)
+     */
+    @Query(value = "SELECT COUNT(*) FROM stories", nativeQuery = true)
+    long countAllStories();
+
+    /**
+     * Count stories matching tag name filter for random selection
+     */
+    @Query(value = "SELECT COUNT(DISTINCT s.id) FROM stories s " +
+           "JOIN story_tags st ON s.id = st.story_id " +
+           "JOIN tags t ON st.tag_id = t.id " +
+           "WHERE UPPER(t.name) = UPPER(?1)",
+           nativeQuery = true)
+    long countStoriesByTagName(String tagName);
+
+    /**
+     * Find a random story using offset (no filters)
+     */
+    @Query(value = "SELECT s.* FROM stories s ORDER BY s.id OFFSET ?1 LIMIT 1", nativeQuery = true)
+    Optional<Story> findRandomStory(long offset);
+
+    /**
+     * Find a random story matching tag name filter using offset
+     */
+    @Query(value = "SELECT s.* FROM stories s " +
+           "JOIN story_tags st ON s.id = st.story_id " +
+           "JOIN tags t ON st.tag_id = t.id " +
+           "WHERE UPPER(t.name) = UPPER(?1) " +
+           "ORDER BY s.id OFFSET ?2 LIMIT 1",
+           nativeQuery = true)
+    Optional<Story> findRandomStoryByTagName(String tagName, long offset);
+
+    /**
+     * Count stories matching multiple tags (ALL tags must be present)
+     */
+    @Query(value = "SELECT COUNT(*) FROM (" +
+           " SELECT DISTINCT s.id FROM stories s " +
+           " JOIN story_tags st ON s.id = st.story_id " +
+           " JOIN tags t ON st.tag_id = t.id " +
+           " WHERE UPPER(t.name) IN (?1) " +
+           " GROUP BY s.id " +
+           " HAVING COUNT(DISTINCT t.name) = ?2" +
+           ") as matched_stories",
+           nativeQuery = true)
+    long countStoriesByMultipleTags(List<String> upperCaseTagNames, int tagCount);
+
+    /**
+     * Find random story matching multiple tags (ALL tags must be present)
+     */
+    @Query(value = "SELECT s.* FROM stories s " +
+           "JOIN story_tags st ON s.id = st.story_id " +
+           "JOIN tags t ON st.tag_id = t.id " +
+           "WHERE UPPER(t.name) IN (?1) " +
+           "GROUP BY s.id, s.title, s.summary, s.description, s.content_html, s.content_plain, s.source_url, s.cover_path, s.word_count, s.rating, s.volume, s.is_read, s.reading_position, s.last_read_at, s.author_id, s.series_id, s.created_at, s.updated_at " +
+           "HAVING COUNT(DISTINCT t.name) = ?2 " +
+           "ORDER BY s.id OFFSET ?3 LIMIT 1",
+           nativeQuery = true)
+    Optional<Story> findRandomStoryByMultipleTags(List<String> upperCaseTagNames, int tagCount, long offset);
+
+    /**
+     * Count stories matching text search (title, author, tags)
+     */
+    @Query(value = "SELECT COUNT(DISTINCT s.id) FROM stories s " +
+           "LEFT JOIN authors a ON s.author_id = a.id " +
+           "LEFT JOIN story_tags st ON s.id = st.story_id " +
+           "LEFT JOIN tags t ON st.tag_id = t.id " +
+           "WHERE (UPPER(s.title) LIKE UPPER(?1) OR UPPER(a.name) LIKE UPPER(?1) OR UPPER(t.name) LIKE UPPER(?1))",
+           nativeQuery = true)
+    long countStoriesByTextSearch(String searchPattern);
+
+    /**
+     * Find random story matching text search (title, author, tags)
+     */
+    @Query(value = "SELECT DISTINCT s.* FROM stories s " +
+           "LEFT JOIN authors a ON s.author_id = a.id " +
+           "LEFT JOIN story_tags st ON s.id = st.story_id " +
+           "LEFT JOIN tags t ON st.tag_id = t.id " +
+           "WHERE (UPPER(s.title) LIKE UPPER(?1) OR UPPER(a.name) LIKE UPPER(?1) OR UPPER(t.name) LIKE UPPER(?1)) " +
+           "ORDER BY s.id OFFSET ?2 LIMIT 1",
+           nativeQuery = true)
+    Optional<Story> findRandomStoryByTextSearch(String searchPattern, long offset);
+
+    /**
+     * Count stories matching both text search AND tags
+     */
+    @Query(value = "SELECT COUNT(DISTINCT s.id) FROM stories s " +
+           "LEFT JOIN authors a ON s.author_id = a.id " +
+           "LEFT JOIN story_tags st ON s.id = st.story_id " +
+           "LEFT JOIN tags t ON st.tag_id = t.id " +
+           "WHERE (UPPER(s.title) LIKE UPPER(?1) OR UPPER(a.name) LIKE UPPER(?1) OR UPPER(t.name) LIKE UPPER(?1)) " +
+           "AND s.id IN (" +
+           " SELECT s2.id FROM stories s2 " +
+           " JOIN story_tags st2 ON s2.id = st2.story_id " +
+           " JOIN tags t2 ON st2.tag_id = t2.id " +
+           " WHERE UPPER(t2.name) IN (?2) " +
+           " GROUP BY s2.id " +
+           " HAVING COUNT(DISTINCT t2.name) = ?3" +
+           ")",
+           nativeQuery = true)
+    long countStoriesByTextSearchAndTags(String searchPattern, List<String> upperCaseTagNames, int tagCount);
+
+    /**
+     * Find random story matching both text search AND tags
+     */
+    @Query(value = "SELECT DISTINCT s.* FROM stories s " +
+           "LEFT JOIN authors a ON s.author_id = a.id " +
+           "LEFT JOIN story_tags st ON s.id = st.story_id " +
+           "LEFT JOIN tags t ON st.tag_id = t.id " +
+           "WHERE (UPPER(s.title) LIKE UPPER(?1) OR UPPER(a.name) LIKE UPPER(?1) OR UPPER(t.name) LIKE UPPER(?1)) " +
+           "AND s.id IN (" +
+           " SELECT s2.id FROM stories s2 " +
+           " JOIN story_tags st2 ON s2.id = st2.story_id " +
+           " JOIN tags t2 ON st2.tag_id = t2.id " +
+           " WHERE UPPER(t2.name) IN (?2) " +
+           " GROUP BY s2.id " +
+           " HAVING COUNT(DISTINCT t2.name) = ?3" +
+           ") " +
+           "ORDER BY s.id OFFSET ?4 LIMIT 1",
+           nativeQuery = true)
+    Optional<Story> findRandomStoryByTextSearchAndTags(String searchPattern, List<String> upperCaseTagNames, int tagCount, long offset);
+
 }
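These queries support a two-step count-then-offset pattern for random selection. A minimal caller sketch (an assumption about intended usage, not code from this branch):

```java
// Hypothetical service method: count the matching rows, pick a random offset, fetch one row.
public Optional<Story> pickRandomStoryByTag(String tagName) {
    long total = storyRepository.countStoriesByTagName(tagName);
    if (total == 0) {
        return Optional.empty();
    }
    long offset = java.util.concurrent.ThreadLocalRandom.current().nextLong(total); // 0 .. total-1
    return storyRepository.findRandomStoryByTagName(tagName, offset);
}
```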
New file: TagAliasRepository.java (60 lines)
@@ -0,0 +1,60 @@
package com.storycove.repository;

import com.storycove.entity.TagAlias;
import com.storycove.entity.Tag;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.query.Param;
import org.springframework.stereotype.Repository;

import java.util.List;
import java.util.Optional;
import java.util.UUID;

@Repository
public interface TagAliasRepository extends JpaRepository<TagAlias, UUID> {

    /**
     * Find alias by exact alias name (case-insensitive)
     */
    @Query("SELECT ta FROM TagAlias ta WHERE LOWER(ta.aliasName) = LOWER(:aliasName)")
    Optional<TagAlias> findByAliasNameIgnoreCase(@Param("aliasName") String aliasName);

    /**
     * Find all aliases for a specific canonical tag
     */
    List<TagAlias> findByCanonicalTag(Tag canonicalTag);

    /**
     * Find all aliases for a specific canonical tag ID
     */
    @Query("SELECT ta FROM TagAlias ta WHERE ta.canonicalTag.id = :tagId")
    List<TagAlias> findByCanonicalTagId(@Param("tagId") UUID tagId);

    /**
     * Find aliases created from merge operations
     */
    List<TagAlias> findByCreatedFromMergeTrue();

    /**
     * Check if an alias name already exists
     */
    boolean existsByAliasNameIgnoreCase(String aliasName);

    /**
     * Delete all aliases for a specific tag
     */
    void deleteByCanonicalTag(Tag canonicalTag);

    /**
     * Count aliases for a specific tag
     */
    @Query("SELECT COUNT(ta) FROM TagAlias ta WHERE ta.canonicalTag.id = :tagId")
    long countByCanonicalTagId(@Param("tagId") UUID tagId);

    /**
     * Find aliases that start with the given prefix (case-insensitive)
     */
    @Query("SELECT ta FROM TagAlias ta WHERE LOWER(ta.aliasName) LIKE LOWER(CONCAT(:prefix, '%'))")
    List<TagAlias> findByAliasNameStartingWithIgnoreCase(@Param("prefix") String prefix);
}
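A minimal resolution sketch (an assumption about intended usage, not code from this branch): look a user-supplied name up as an alias first, then fall back to the canonical tag table via the `findByNameIgnoreCase` added to `TagRepository` below.

```java
// Hypothetical helper that resolves a tag name to its canonical Tag.
public Optional<Tag> resolveTag(String name) {
    return tagAliasRepository.findByAliasNameIgnoreCase(name)
            .map(TagAlias::getCanonicalTag)                      // alias -> canonical tag
            .or(() -> tagRepository.findByNameIgnoreCase(name)); // direct match (Java 9+ Optional.or)
}
```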
@@ -17,8 +17,12 @@ public interface TagRepository extends JpaRepository<Tag, UUID> {
 
     Optional<Tag> findByName(String name);
 
+    Optional<Tag> findByNameIgnoreCase(String name);
+
     boolean existsByName(String name);
 
+    boolean existsByNameIgnoreCase(String name);
+
     List<Tag> findByNameContainingIgnoreCase(String name);
 
     Page<Tag> findByNameContainingIgnoreCase(String name, Pageable pageable);
@@ -54,4 +58,7 @@ public interface TagRepository extends JpaRepository<Tag, UUID> {
 
     @Query("SELECT COUNT(t) FROM Tag t WHERE SIZE(t.stories) > 0")
     long countUsedTags();
+
+    @Query("SELECT t FROM Tag t WHERE SIZE(t.collections) > 0 ORDER BY SIZE(t.collections) DESC, t.name ASC")
+    List<Tag> findTagsUsedByCollections();
 }
@@ -3,6 +3,7 @@ package com.storycove.security;
 import com.storycove.util.JwtUtil;
 import jakarta.servlet.FilterChain;
 import jakarta.servlet.ServletException;
+import jakarta.servlet.http.Cookie;
 import jakarta.servlet.http.HttpServletRequest;
 import jakarta.servlet.http.HttpServletResponse;
 import org.springframework.security.authentication.UsernamePasswordAuthenticationToken;
@@ -28,13 +29,27 @@ public class JwtAuthenticationFilter extends OncePerRequestFilter {
                                     HttpServletResponse response,
                                     FilterChain filterChain) throws ServletException, IOException {
 
-        String authHeader = request.getHeader("Authorization");
         String token = null;
 
+        // First try to get token from Authorization header
+        String authHeader = request.getHeader("Authorization");
         if (authHeader != null && authHeader.startsWith("Bearer ")) {
             token = authHeader.substring(7);
         }
 
+        // If no token in header, try to get from cookies
+        if (token == null) {
+            Cookie[] cookies = request.getCookies();
+            if (cookies != null) {
+                for (Cookie cookie : cookies) {
+                    if ("token".equals(cookie.getName())) {
+                        token = cookie.getValue();
+                        break;
+                    }
+                }
+            }
+        }
+
         if (token != null && jwtUtil.validateToken(token) && !jwtUtil.isTokenExpired(token)) {
             String subject = jwtUtil.getSubjectFromToken(token);
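For clarity, the same header-then-cookie lookup as a standalone helper; this is only a sketch (the filter above keeps the logic inline), assuming the cookie is named `token` as in the diff.

```java
// Equivalent token resolution, extracted for illustration (hypothetical helper, not in the branch).
static String resolveToken(HttpServletRequest request) {
    String header = request.getHeader("Authorization");
    if (header != null && header.startsWith("Bearer ")) {
        return header.substring(7);            // "Bearer <jwt>" -> "<jwt>"
    }
    Cookie[] cookies = request.getCookies();
    if (cookies != null) {
        for (Cookie cookie : cookies) {
            if ("token".equals(cookie.getName())) {
                return cookie.getValue();      // fallback: HttpOnly auth cookie
            }
        }
    }
    return null;
}
```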
@@ -31,7 +31,7 @@ public class AuthorService {
     private final TypesenseService typesenseService;
 
     @Autowired
-    public AuthorService(AuthorRepository authorRepository, TypesenseService typesenseService) {
+    public AuthorService(AuthorRepository authorRepository, @Autowired(required = false) TypesenseService typesenseService) {
         this.authorRepository = authorRepository;
         this.typesenseService = typesenseService;
     }
@@ -133,11 +133,13 @@ public class AuthorService {
         Author savedAuthor = authorRepository.save(author);
 
         // Index in Typesense
+        if (typesenseService != null) {
             try {
                 typesenseService.indexAuthor(savedAuthor);
             } catch (Exception e) {
                 logger.warn("Failed to index author in Typesense: " + savedAuthor.getName(), e);
             }
+        }
 
         return savedAuthor;
     }
@@ -155,11 +157,13 @@ public class AuthorService {
         Author savedAuthor = authorRepository.save(existingAuthor);
 
         // Update in Typesense
+        if (typesenseService != null) {
             try {
                 typesenseService.updateAuthor(savedAuthor);
             } catch (Exception e) {
                 logger.warn("Failed to update author in Typesense: " + savedAuthor.getName(), e);
             }
+        }
 
         return savedAuthor;
     }
@@ -175,12 +179,14 @@ public class AuthorService {
         authorRepository.delete(author);
 
         // Remove from Typesense
+        if (typesenseService != null) {
             try {
                 typesenseService.deleteAuthor(id.toString());
             } catch (Exception e) {
                 logger.warn("Failed to delete author from Typesense: " + author.getName(), e);
             }
+        }
     }
 
     public Author addUrl(UUID id, String url) {
         Author author = findById(id);
@@ -188,11 +194,13 @@ public class AuthorService {
         Author savedAuthor = authorRepository.save(author);
 
         // Update in Typesense
+        if (typesenseService != null) {
             try {
                 typesenseService.updateAuthor(savedAuthor);
             } catch (Exception e) {
                 logger.warn("Failed to update author in Typesense after adding URL: " + savedAuthor.getName(), e);
             }
+        }
 
         return savedAuthor;
     }
@@ -203,11 +211,13 @@ public class AuthorService {
         Author savedAuthor = authorRepository.save(author);
 
         // Update in Typesense
+        if (typesenseService != null) {
             try {
                 typesenseService.updateAuthor(savedAuthor);
             } catch (Exception e) {
                 logger.warn("Failed to update author in Typesense after removing URL: " + savedAuthor.getName(), e);
             }
+        }
 
         return savedAuthor;
     }
@@ -232,7 +242,7 @@ public class AuthorService {
                 rating, author.getName(), author.getAuthorRating());
 
         author.setAuthorRating(rating);
-        Author savedAuthor = authorRepository.save(author);
+        authorRepository.save(author);
 
         // Flush and refresh to ensure the entity is up-to-date
         authorRepository.flush();
@@ -242,11 +252,13 @@ public class AuthorService {
                 refreshedAuthor.getAuthorRating(), refreshedAuthor.getName());
 
         // Update in Typesense
+        if (typesenseService != null) {
             try {
                 typesenseService.updateAuthor(refreshedAuthor);
             } catch (Exception e) {
                 logger.warn("Failed to update author in Typesense after rating: " + refreshedAuthor.getName(), e);
             }
+        }
 
         return refreshedAuthor;
     }
@@ -290,11 +302,13 @@ public class AuthorService {
         Author savedAuthor = authorRepository.save(author);
 
         // Update in Typesense
+        if (typesenseService != null) {
             try {
                 typesenseService.updateAuthor(savedAuthor);
             } catch (Exception e) {
                 logger.warn("Failed to update author in Typesense after setting avatar: " + savedAuthor.getName(), e);
             }
+        }
 
         return savedAuthor;
     }
@@ -305,11 +319,13 @@ public class AuthorService {
         Author savedAuthor = authorRepository.save(author);
 
         // Update in Typesense
+        if (typesenseService != null) {
             try {
                 typesenseService.updateAuthor(savedAuthor);
             } catch (Exception e) {
                 logger.warn("Failed to update author in Typesense after removing avatar: " + savedAuthor.getName(), e);
             }
+        }
 
         return savedAuthor;
     }
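The same null-guard plus try/catch wrapper is repeated for every Typesense call above. One possible consolidation, shown only as a sketch (the branch keeps the checks inline):

```java
// Hypothetical helper to centralize the optional-Typesense guard used throughout AuthorService.
private void withTypesense(String actionDescription, Runnable action) {
    if (typesenseService == null) {
        return; // search indexing disabled, skip silently
    }
    try {
        action.run();
    } catch (Exception e) {
        logger.warn("Failed to " + actionDescription + " in Typesense", e);
    }
}

// Example call site:
// withTypesense("index author " + savedAuthor.getName(),
//         () -> typesenseService.indexAuthor(savedAuthor));
```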
@@ -10,6 +10,7 @@ public class CollectionSearchResult extends Collection {
 
     private Integer storedStoryCount;
     private Integer storedTotalWordCount;
+    private int wordsPerMinute = 200; // Default, can be overridden
 
     public CollectionSearchResult(Collection collection) {
         this.setId(collection.getId());
@@ -20,6 +21,7 @@ public class CollectionSearchResult extends Collection {
         this.setCreatedAt(collection.getCreatedAt());
         this.setUpdatedAt(collection.getUpdatedAt());
         this.setCoverImagePath(collection.getCoverImagePath());
+        this.setTagNames(collection.getTagNames()); // Copy tag names for search results
         // Note: don't copy collectionStories or tags to avoid lazy loading issues
     }
 
@@ -31,6 +33,10 @@ public class CollectionSearchResult extends Collection {
         this.storedTotalWordCount = totalWordCount;
     }
 
+    public void setWordsPerMinute(int wordsPerMinute) {
+        this.wordsPerMinute = wordsPerMinute;
+    }
+
     @Override
     public int getStoryCount() {
         return storedStoryCount != null ? storedStoryCount : 0;
@@ -43,8 +49,7 @@ public class CollectionSearchResult extends Collection {
 
     @Override
     public int getEstimatedReadingTime() {
-        // Assuming 200 words per minute reading speed
-        return Math.max(1, getTotalWordCount() / 200);
+        return Math.max(1, getTotalWordCount() / wordsPerMinute);
     }
 
     @Override
@@ -1,6 +1,8 @@
 package com.storycove.service;
 
 import com.storycove.dto.SearchResultDto;
+import com.storycove.dto.StoryReadingDto;
+import com.storycove.dto.TagDto;
 import com.storycove.entity.Collection;
 import com.storycove.entity.CollectionStory;
 import com.storycove.entity.Story;
@@ -9,14 +11,10 @@ import com.storycove.repository.CollectionRepository;
 import com.storycove.repository.CollectionStoryRepository;
 import com.storycove.repository.StoryRepository;
 import com.storycove.repository.TagRepository;
-import com.storycove.service.exception.DuplicateResourceException;
 import com.storycove.service.exception.ResourceNotFoundException;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 import org.springframework.beans.factory.annotation.Autowired;
-import org.springframework.data.domain.Page;
-import org.springframework.data.domain.PageRequest;
-import org.springframework.data.domain.Pageable;
 import org.springframework.stereotype.Service;
 import org.springframework.transaction.annotation.Transactional;
 
@@ -34,18 +32,21 @@ public class CollectionService {
     private final StoryRepository storyRepository;
     private final TagRepository tagRepository;
     private final TypesenseService typesenseService;
+    private final ReadingTimeService readingTimeService;
 
     @Autowired
     public CollectionService(CollectionRepository collectionRepository,
                              CollectionStoryRepository collectionStoryRepository,
                              StoryRepository storyRepository,
                              TagRepository tagRepository,
-                             @Autowired(required = false) TypesenseService typesenseService) {
+                             @Autowired(required = false) TypesenseService typesenseService,
+                             ReadingTimeService readingTimeService) {
         this.collectionRepository = collectionRepository;
         this.collectionStoryRepository = collectionStoryRepository;
         this.storyRepository = storyRepository;
         this.tagRepository = tagRepository;
         this.typesenseService = typesenseService;
+        this.readingTimeService = readingTimeService;
     }
 
     /**
@@ -78,6 +79,13 @@ public class CollectionService {
                 .orElseThrow(() -> new ResourceNotFoundException("Collection not found with id: " + id));
     }
 
+    /**
+     * Find all collections with tags for reindexing
+     */
+    public List<Collection> findAllWithTags() {
+        return collectionRepository.findAllWithTags();
+    }
+
     /**
      * Create a new collection with optional initial stories
      */
@@ -254,7 +262,7 @@ public class CollectionService {
      */
     @Transactional
     public void reorderStories(UUID collectionId, List<Map<String, Object>> storyOrders) {
-        Collection collection = findByIdBasic(collectionId);
+        findByIdBasic(collectionId); // Validate collection exists
 
         // Two-phase update to avoid unique constraint violations:
         // Phase 1: Set all positions to negative values (temporary)
@@ -326,7 +334,7 @@ public class CollectionService {
         );
 
         return Map.of(
-                "story", story,
+                "story", convertToReadingDto(story),
                 "collection", collectionContext
         );
     }
@@ -344,7 +352,7 @@ public class CollectionService {
         int totalWordCount = collectionStories.stream()
                 .mapToInt(cs -> cs.getStory().getWordCount() != null ? cs.getStory().getWordCount() : 0)
                 .sum();
-        int estimatedReadingTime = Math.max(1, totalWordCount / 200); // 200 words per minute
+        int estimatedReadingTime = readingTimeService.calculateReadingTime(totalWordCount);
 
         double averageStoryRating = collectionStories.stream()
                 .filter(cs -> cs.getStory().getRating() != null)
@@ -420,4 +428,49 @@ public class CollectionService {
     public List<Collection> findAllForIndexing() {
         return collectionRepository.findAllActiveCollections();
     }
+
+    private StoryReadingDto convertToReadingDto(Story story) {
+        StoryReadingDto dto = new StoryReadingDto();
+        dto.setId(story.getId());
+        dto.setTitle(story.getTitle());
+        dto.setSummary(story.getSummary());
+        dto.setDescription(story.getDescription());
+        dto.setContentHtml(story.getContentHtml());
+        dto.setSourceUrl(story.getSourceUrl());
+        dto.setCoverPath(story.getCoverPath());
+        dto.setWordCount(story.getWordCount());
+        dto.setRating(story.getRating());
+        dto.setVolume(story.getVolume());
+        dto.setCreatedAt(story.getCreatedAt());
+        dto.setUpdatedAt(story.getUpdatedAt());
+
+        // Reading progress fields
+        dto.setIsRead(story.getIsRead());
+        dto.setReadingPosition(story.getReadingPosition());
+        dto.setLastReadAt(story.getLastReadAt());
+
+        if (story.getAuthor() != null) {
+            dto.setAuthorId(story.getAuthor().getId());
+            dto.setAuthorName(story.getAuthor().getName());
+        }
+
+        if (story.getSeries() != null) {
+            dto.setSeriesId(story.getSeries().getId());
+            dto.setSeriesName(story.getSeries().getName());
+        }
+
+        dto.setTags(story.getTags().stream()
+                .map(this::convertTagToDto)
+                .collect(Collectors.toList()));
+
+        return dto;
+    }
+
+    private TagDto convertTagToDto(Tag tag) {
+        TagDto dto = new TagDto();
+        dto.setId(tag.getId());
+        dto.setName(tag.getName());
+        dto.setStoryCount(tag.getStories().size());
+        return dto;
+    }
 }
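The inline `totalWordCount / 200` estimate is now delegated to `ReadingTimeService`, whose implementation is not shown in this compare. Based on the code it replaces, the arithmetic is presumably the same; a worked example under that assumption:

```java
// Assumed shape of the calculation ReadingTimeService performs (not shown in this diff).
int calculateReadingTime(int totalWordCount, int wordsPerMinute) {
    return Math.max(1, totalWordCount / wordsPerMinute); // integer division, clamped to >= 1 minute
}
// 4500 words at 200 wpm -> 22 minutes; 150 words at 200 wpm -> 1 minute.
```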
File diff suppressed because it is too large
New file: EPUBExportService.java (584 lines)
@@ -0,0 +1,584 @@
package com.storycove.service;

import com.storycove.dto.EPUBExportRequest;
import com.storycove.entity.Collection;
import com.storycove.entity.ReadingPosition;
import com.storycove.entity.Story;
import com.storycove.repository.ReadingPositionRepository;
import com.storycove.service.exception.ResourceNotFoundException;

import nl.siegmann.epublib.domain.*;
import nl.siegmann.epublib.epub.EpubWriter;

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;
import org.jsoup.select.Elements;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.io.ByteArrayResource;
import org.springframework.core.io.Resource;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import java.util.UUID;
import java.util.stream.Collectors;

@Service
@Transactional
public class EPUBExportService {

    private final StoryService storyService;
    private final ReadingPositionRepository readingPositionRepository;
    private final CollectionService collectionService;

    @Autowired
    public EPUBExportService(StoryService storyService,
                             ReadingPositionRepository readingPositionRepository,
                             CollectionService collectionService) {
        this.storyService = storyService;
        this.readingPositionRepository = readingPositionRepository;
        this.collectionService = collectionService;
    }

    public Resource exportStoryAsEPUB(EPUBExportRequest request) throws IOException {
        Story story = storyService.findById(request.getStoryId());

        Book book = createEPUBBook(story, request);

        ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
        EpubWriter epubWriter = new EpubWriter();
        epubWriter.write(book, outputStream);

        return new ByteArrayResource(outputStream.toByteArray());
    }

    public Resource exportCollectionAsEPUB(UUID collectionId, EPUBExportRequest request) throws IOException {
        Collection collection = collectionService.findById(collectionId);
        List<Story> stories = collection.getCollectionStories().stream()
                .sorted((cs1, cs2) -> Integer.compare(cs1.getPosition(), cs2.getPosition()))
                .map(cs -> cs.getStory())
                .collect(Collectors.toList());

        if (stories.isEmpty()) {
            throw new ResourceNotFoundException("Collection contains no stories to export");
        }

        Book book = createCollectionEPUBBook(collection, stories, request);

        ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
        EpubWriter epubWriter = new EpubWriter();
        epubWriter.write(book, outputStream);

        return new ByteArrayResource(outputStream.toByteArray());
    }

    private Book createEPUBBook(Story story, EPUBExportRequest request) throws IOException {
        Book book = new Book();

        setupMetadata(book, story, request);

        addCoverImage(book, story, request);

        addContent(book, story, request);

        addReadingPosition(book, story, request);

        return book;
    }

    private Book createCollectionEPUBBook(Collection collection, List<Story> stories, EPUBExportRequest request) throws IOException {
        Book book = new Book();

        setupCollectionMetadata(book, collection, stories, request);

        addCollectionCoverImage(book, collection, request);

        addCollectionContent(book, stories, request);

        return book;
    }

    private void setupMetadata(Book book, Story story, EPUBExportRequest request) {
        Metadata metadata = book.getMetadata();

        String title = request.getCustomTitle() != null ?
                request.getCustomTitle() : story.getTitle();
        metadata.addTitle(title);

        String authorName = request.getCustomAuthor() != null ?
                request.getCustomAuthor() :
                (story.getAuthor() != null ? story.getAuthor().getName() : "Unknown Author");
        metadata.addAuthor(new Author(authorName));

        metadata.setLanguage(request.getLanguage() != null ? request.getLanguage() : "en");

        metadata.addIdentifier(new Identifier("storycove", story.getId().toString()));

        if (story.getDescription() != null) {
            metadata.addDescription(story.getDescription());
        }

        if (request.getIncludeMetadata()) {
            metadata.addDate(new Date(java.util.Date.from(
                    story.getCreatedAt().atZone(java.time.ZoneId.systemDefault()).toInstant()
            ), Date.Event.CREATION));

            if (story.getSeries() != null) {
                // Add series and metadata info to description instead of using addMeta
                StringBuilder description = new StringBuilder();
                if (story.getDescription() != null) {
                    description.append(story.getDescription()).append("\n\n");
                }

                description.append("Series: ").append(story.getSeries().getName());
                if (story.getVolume() != null) {
                    description.append(" (Volume ").append(story.getVolume()).append(")");
                }
                description.append("\n");

                if (story.getWordCount() != null) {
                    description.append("Word Count: ").append(story.getWordCount()).append("\n");
                }

                if (story.getRating() != null) {
                    description.append("Rating: ").append(story.getRating()).append("/5\n");
                }

                if (!story.getTags().isEmpty()) {
                    String tags = story.getTags().stream()
                            .map(tag -> tag.getName())
                            .reduce((a, b) -> a + ", " + b)
                            .orElse("");
                    description.append("Tags: ").append(tags).append("\n");
                }

                description.append("\nGenerated by StoryCove on ")
                        .append(LocalDateTime.now().format(DateTimeFormatter.ISO_LOCAL_DATE_TIME));

                metadata.addDescription(description.toString());
            }
        }

        if (request.getCustomMetadata() != null && !request.getCustomMetadata().isEmpty()) {
            // Add custom metadata to description since addMeta doesn't exist
            StringBuilder customDesc = new StringBuilder();
            for (String customMeta : request.getCustomMetadata()) {
                String[] parts = customMeta.split(":", 2);
                if (parts.length == 2) {
                    customDesc.append(parts[0].trim()).append(": ").append(parts[1].trim()).append("\n");
                }
            }
            if (customDesc.length() > 0) {
                String existingDesc = metadata.getDescriptions().isEmpty() ? "" : metadata.getDescriptions().get(0);
                metadata.addDescription(existingDesc + "\n" + customDesc.toString());
            }
        }
    }

    private void addCoverImage(Book book, Story story, EPUBExportRequest request) {
        if (!request.getIncludeCoverImage() || story.getCoverPath() == null) {
            return;
        }

        try {
            Path coverPath = Paths.get(story.getCoverPath());
            if (Files.exists(coverPath)) {
                byte[] coverImageData = Files.readAllBytes(coverPath);
                String mimeType = Files.probeContentType(coverPath);
                if (mimeType == null) {
                    mimeType = "image/jpeg";
                }

                nl.siegmann.epublib.domain.Resource coverResource =
                        new nl.siegmann.epublib.domain.Resource(coverImageData, "cover.jpg");

                book.setCoverImage(coverResource);
            }
        } catch (IOException e) {
            // Skip cover image on error
        }
    }

    private void addContent(Book book, Story story, EPUBExportRequest request) {
        String content = story.getContentHtml();
        if (content == null) {
            content = story.getContentPlain() != null ?
                    "<p>" + story.getContentPlain().replace("\n", "</p><p>") + "</p>" :
                    "<p>No content available</p>";
        }

        if (request.getSplitByChapters()) {
            addChapterizedContent(book, content, request);
        } else {
            addSingleChapterContent(book, content, story);
        }
    }

    private void addSingleChapterContent(Book book, String content, Story story) {
        String html = createChapterHTML(story.getTitle(), content);

        nl.siegmann.epublib.domain.Resource chapterResource =
                new nl.siegmann.epublib.domain.Resource(html.getBytes(), "chapter.html");

        book.addSection(story.getTitle(), chapterResource);
    }

    private void addChapterizedContent(Book book, String content, EPUBExportRequest request) {
        Document doc = Jsoup.parse(content);
        Elements chapters = doc.select("div.chapter, h1, h2, h3");

        if (chapters.isEmpty()) {
            List<String> paragraphs = splitByWords(content,
                    request.getMaxWordsPerChapter() != null ? request.getMaxWordsPerChapter() : 2000);

            for (int i = 0; i < paragraphs.size(); i++) {
                String chapterTitle = "Chapter " + (i + 1);
                String html = createChapterHTML(chapterTitle, paragraphs.get(i));

                nl.siegmann.epublib.domain.Resource chapterResource =
                        new nl.siegmann.epublib.domain.Resource(html.getBytes(), "chapter" + (i + 1) + ".html");

                book.addSection(chapterTitle, chapterResource);
            }
        } else {
            for (int i = 0; i < chapters.size(); i++) {
                Element chapter = chapters.get(i);
                String chapterTitle = chapter.text();
                if (chapterTitle.trim().isEmpty()) {
                    chapterTitle = "Chapter " + (i + 1);
                }

                String chapterContent = chapter.html();
                String html = createChapterHTML(chapterTitle, chapterContent);

                nl.siegmann.epublib.domain.Resource chapterResource =
                        new nl.siegmann.epublib.domain.Resource(html.getBytes(), "chapter" + (i + 1) + ".html");

                book.addSection(chapterTitle, chapterResource);
            }
        }
    }

    private List<String> splitByWords(String content, int maxWordsPerChapter) {
        String[] words = content.split("\\s+");
        List<String> chapters = new ArrayList<>();
        StringBuilder currentChapter = new StringBuilder();
        int wordCount = 0;

        for (String word : words) {
            currentChapter.append(word).append(" ");
            wordCount++;

            if (wordCount >= maxWordsPerChapter) {
                chapters.add(currentChapter.toString().trim());
                currentChapter = new StringBuilder();
                wordCount = 0;
            }
        }

        if (currentChapter.length() > 0) {
            chapters.add(currentChapter.toString().trim());
        }

        return chapters;
    }
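A quick illustration (not part of the file) of what `splitByWords` produces: a 5,000-word text with the default 2,000-word cap yields three chapters of roughly 2,000, 2,000, and 1,000 words.

```java
// Hypothetical check of the word-based chapter splitting shown above.
String text = String.join(" ", java.util.Collections.nCopies(5000, "word"));
List<String> parts = splitByWords(text, 2000);
// parts.size() == 3; the first two chunks hold 2000 words each, the last the remaining 1000.
```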
    private String createChapterHTML(String title, String content) {
        return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>" +
               "<!DOCTYPE html PUBLIC \"-//W3C//DTD XHTML 1.1//EN\" " +
               "\"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd\">" +
               "<html xmlns=\"http://www.w3.org/1999/xhtml\">" +
               "<head>" +
               "<title>" + escapeHtml(title) + "</title>" +
               "<style type=\"text/css\">" +
               "body { font-family: serif; margin: 1em; }" +
               "h1 { text-align: center; }" +
               "p { text-indent: 1em; margin: 0.5em 0; }" +
               "</style>" +
               "</head>" +
               "<body>" +
               "<h1>" + escapeHtml(title) + "</h1>" +
               fixHtmlForXhtml(content) +
               "</body>" +
               "</html>";
    }

    private void addReadingPosition(Book book, Story story, EPUBExportRequest request) {
        if (!request.getIncludeReadingPosition()) {
            return;
        }

        Optional<ReadingPosition> positionOpt = readingPositionRepository.findByStoryId(story.getId());
        if (positionOpt.isPresent()) {
            ReadingPosition position = positionOpt.get();
            Metadata metadata = book.getMetadata();

            // Add reading position to description since addMeta doesn't exist
            StringBuilder positionDesc = new StringBuilder();
            if (position.getEpubCfi() != null) {
                positionDesc.append("EPUB CFI: ").append(position.getEpubCfi()).append("\n");
            }

            if (position.getChapterIndex() != null && position.getWordPosition() != null) {
                positionDesc.append("Reading Position: Chapter ")
                        .append(position.getChapterIndex())
                        .append(", Word ").append(position.getWordPosition()).append("\n");
            }

            if (position.getPercentageComplete() != null) {
                positionDesc.append("Reading Progress: ")
                        .append(String.format("%.1f%%", position.getPercentageComplete())).append("\n");
            }

            positionDesc.append("Last Read: ")
                    .append(position.getUpdatedAt().format(DateTimeFormatter.ISO_LOCAL_DATE_TIME));

            String existingDesc = metadata.getDescriptions().isEmpty() ? "" : metadata.getDescriptions().get(0);
            metadata.addDescription(existingDesc + "\n\n--- Reading Position ---\n" + positionDesc.toString());
        }
    }

    private String fixHtmlForXhtml(String html) {
        if (html == null) return "";

        // Fix common XHTML validation issues
        String fixed = html
                // Fix self-closing tags to be XHTML compliant
                .replaceAll("<br>", "<br />")
                .replaceAll("<hr>", "<hr />")
                .replaceAll("<img([^>]*)>", "<img$1 />")
                .replaceAll("<input([^>]*)>", "<input$1 />")
                .replaceAll("<area([^>]*)>", "<area$1 />")
                .replaceAll("<base([^>]*)>", "<base$1 />")
                .replaceAll("<col([^>]*)>", "<col$1 />")
                .replaceAll("<embed([^>]*)>", "<embed$1 />")
                .replaceAll("<link([^>]*)>", "<link$1 />")
                .replaceAll("<meta([^>]*)>", "<meta$1 />")
                .replaceAll("<param([^>]*)>", "<param$1 />")
                .replaceAll("<source([^>]*)>", "<source$1 />")
                .replaceAll("<track([^>]*)>", "<track$1 />")
                .replaceAll("<wbr([^>]*)>", "<wbr$1 />");

        return fixed;
    }

    private String escapeHtml(String text) {
        if (text == null) return "";
        return text.replace("&", "&amp;")
                .replace("<", "&lt;")
                .replace(">", "&gt;")
                .replace("\"", "&quot;")
                .replace("'", "&#39;");
    }

    public String getEPUBFilename(Story story) {
        StringBuilder filename = new StringBuilder();

        if (story.getAuthor() != null) {
            filename.append(sanitizeFilename(story.getAuthor().getName()))
                    .append(" - ");
        }

        filename.append(sanitizeFilename(story.getTitle()));

        if (story.getSeries() != null && story.getVolume() != null) {
            filename.append(" (")
                    .append(sanitizeFilename(story.getSeries().getName()))
                    .append(" ")
                    .append(story.getVolume())
                    .append(")");
        }

        filename.append(".epub");

        return filename.toString();
    }

    private String sanitizeFilename(String filename) {
        if (filename == null) return "unknown";
        return filename.replaceAll("[^a-zA-Z0-9._\\- ]", "")
                .trim()
                .replaceAll("\\s+", "_");
    }
|
    private void setupCollectionMetadata(Book book, Collection collection, List<Story> stories, EPUBExportRequest request) {
        Metadata metadata = book.getMetadata();

        String title = request.getCustomTitle() != null ?
                request.getCustomTitle() : collection.getName();
        metadata.addTitle(title);

        // Use collection creator as author, or combine story authors
        String authorName = "Collection";
        if (stories.size() == 1) {
            Story story = stories.get(0);
            authorName = story.getAuthor() != null ? story.getAuthor().getName() : "Unknown Author";
        } else {
            // For multiple stories, use "Various Authors" or collection name
            authorName = "Various Authors";
        }

        if (request.getCustomAuthor() != null) {
            authorName = request.getCustomAuthor();
        }

        metadata.addAuthor(new Author(authorName));
        metadata.setLanguage(request.getLanguage() != null ? request.getLanguage() : "en");
        metadata.addIdentifier(new Identifier("storycove-collection", collection.getId().toString()));

        // Create description from collection description and story list
        StringBuilder description = new StringBuilder();
        if (collection.getDescription() != null && !collection.getDescription().trim().isEmpty()) {
            description.append(collection.getDescription()).append("\n\n");
        }

        description.append("This collection contains ").append(stories.size()).append(" stories:\n");
        for (int i = 0; i < stories.size() && i < 10; i++) {
            Story story = stories.get(i);
            description.append((i + 1)).append(". ").append(story.getTitle());
            if (story.getAuthor() != null) {
                description.append(" by ").append(story.getAuthor().getName());
            }
            description.append("\n");
        }
        if (stories.size() > 10) {
            description.append("... and ").append(stories.size() - 10).append(" more stories.");
        }

        metadata.addDescription(description.toString());

        if (request.getIncludeMetadata()) {
            metadata.addDate(new Date(java.util.Date.from(
                    collection.getCreatedAt().atZone(java.time.ZoneId.systemDefault()).toInstant()
            ), Date.Event.CREATION));

            // Add collection statistics to description
            int totalWordCount = stories.stream().mapToInt(s -> s.getWordCount() != null ? s.getWordCount() : 0).sum();
            description.append("\n\nTotal Word Count: ").append(totalWordCount);
            description.append("\nGenerated by StoryCove on ")
                    .append(LocalDateTime.now().format(DateTimeFormatter.ISO_LOCAL_DATE_TIME));

            metadata.addDescription(description.toString());
        }
    }

    private void addCollectionCoverImage(Book book, Collection collection, EPUBExportRequest request) {
        if (!request.getIncludeCoverImage()) {
            return;
        }

        try {
            // Try to use collection cover first
            if (collection.getCoverImagePath() != null) {
                Path coverPath = Paths.get(collection.getCoverImagePath());
                if (Files.exists(coverPath)) {
                    byte[] coverImageData = Files.readAllBytes(coverPath);
                    String mimeType = Files.probeContentType(coverPath);
                    if (mimeType == null) {
                        mimeType = "image/jpeg";
                    }

                    nl.siegmann.epublib.domain.Resource coverResource =
                            new nl.siegmann.epublib.domain.Resource(coverImageData, "collection-cover.jpg");

                    book.setCoverImage(coverResource);
                    return;
                }
            }

            // TODO: Could generate a composite cover from story covers
            // For now, skip cover if collection doesn't have one

        } catch (IOException e) {
            // Skip cover image on error
        }
    }

    private void addCollectionContent(Book book, List<Story> stories, EPUBExportRequest request) {
        // Create table of contents chapter
        StringBuilder tocContent = new StringBuilder();
        tocContent.append("<h1>Table of Contents</h1>\n<ul>\n");

        for (int i = 0; i < stories.size(); i++) {
            Story story = stories.get(i);
            tocContent.append("<li><a href=\"#story").append(i + 1).append("\">")
                    .append(escapeHtml(story.getTitle()));
            if (story.getAuthor() != null) {
                tocContent.append(" by ").append(escapeHtml(story.getAuthor().getName()));
            }
            tocContent.append("</a></li>\n");
        }

        tocContent.append("</ul>\n");

        String tocHtml = createChapterHTML("Table of Contents", tocContent.toString());
        nl.siegmann.epublib.domain.Resource tocResource =
                new nl.siegmann.epublib.domain.Resource(tocHtml.getBytes(), "toc.html");
        book.addSection("Table of Contents", tocResource);

        // Add each story as a chapter
        for (int i = 0; i < stories.size(); i++) {
            Story story = stories.get(i);
            String storyContent = story.getContentHtml();

            if (storyContent == null) {
                storyContent = story.getContentPlain() != null ?
                        "<p>" + story.getContentPlain().replace("\n", "</p><p>") + "</p>" :
                        "<p>No content available</p>";
            }

            // Add story metadata header
            StringBuilder storyHtml = new StringBuilder();
            storyHtml.append("<div id=\"story").append(i + 1).append("\">\n");
            storyHtml.append("<h1>").append(escapeHtml(story.getTitle())).append("</h1>\n");
            if (story.getAuthor() != null) {
                storyHtml.append("<p><em>by ").append(escapeHtml(story.getAuthor().getName())).append("</em></p>\n");
            }
            if (story.getDescription() != null && !story.getDescription().trim().isEmpty()) {
                storyHtml.append("<div class=\"summary\">\n")
                        .append("<p>").append(escapeHtml(story.getDescription())).append("</p>\n")
                        .append("</div>\n");
            }
            storyHtml.append("<hr />\n");
            storyHtml.append(fixHtmlForXhtml(storyContent));
            storyHtml.append("</div>\n");

            String chapterTitle = story.getTitle();
            if (story.getAuthor() != null) {
                chapterTitle += " by " + story.getAuthor().getName();
            }

            String html = createChapterHTML(chapterTitle, storyHtml.toString());
            nl.siegmann.epublib.domain.Resource storyResource =
                    new nl.siegmann.epublib.domain.Resource(html.getBytes(), "story" + (i + 1) + ".html");

            book.addSection(chapterTitle, storyResource);
        }
    }

    public boolean canExportStory(UUID storyId) {
        try {
            Story story = storyService.findById(storyId);
            return story.getContentHtml() != null || story.getContentPlain() != null;
        } catch (ResourceNotFoundException e) {
            return false;
        }
    }

    public String getCollectionEPUBFilename(Collection collection) {
        StringBuilder filename = new StringBuilder();
        filename.append(sanitizeFilename(collection.getName()));
        filename.append("_collection.epub");
        return filename.toString();
    }
}
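For orientation, a sketch of how a controller might stream a collection export and reuse `getCollectionEPUBFilename` for the download name. The endpoint path, the `collectionService` lookup, and the `exportCollection(...)` call are illustrative assumptions, not the project's actual API; only the filename helper above and epublib's `EpubWriter` are taken as given.

```java
// Sketch only: controller method fragment with assumed wiring.
@GetMapping("/api/collections/{id}/epub")
public ResponseEntity<byte[]> downloadCollectionEpub(@PathVariable UUID id) throws IOException {
    Collection collection = collectionService.findById(id);                      // assumed lookup
    Book book = epubExportService.exportCollection(id, new EPUBExportRequest());  // assumed method

    ByteArrayOutputStream out = new ByteArrayOutputStream();
    new EpubWriter().write(book, out);                                            // epublib serializer

    String filename = epubExportService.getCollectionEPUBFilename(collection);
    return ResponseEntity.ok()
            .header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"" + filename + "\"")
            .contentType(MediaType.parseMediaType("application/epub+zip"))
            .body(out.toByteArray());
}
```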
@@ -0,0 +1,520 @@
package com.storycove.service;

import com.storycove.dto.EPUBImportRequest;
import com.storycove.dto.EPUBImportResponse;
import com.storycove.dto.ReadingPositionDto;
import com.storycove.entity.*;
import com.storycove.repository.ReadingPositionRepository;
import com.storycove.service.exception.InvalidFileException;
import com.storycove.service.exception.ResourceNotFoundException;

import nl.siegmann.epublib.domain.Book;
import nl.siegmann.epublib.domain.Metadata;
import nl.siegmann.epublib.domain.Resource;
import nl.siegmann.epublib.domain.SpineReference;
import nl.siegmann.epublib.epub.EpubReader;

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.web.multipart.MultipartFile;

import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;

@Service
@Transactional
public class EPUBImportService {

    private final StoryService storyService;
    private final AuthorService authorService;
    private final SeriesService seriesService;
    private final TagService tagService;
    private final ReadingPositionRepository readingPositionRepository;
    private final HtmlSanitizationService sanitizationService;
    private final ImageService imageService;

    @Autowired
    public EPUBImportService(StoryService storyService,
                             AuthorService authorService,
                             SeriesService seriesService,
                             TagService tagService,
                             ReadingPositionRepository readingPositionRepository,
                             HtmlSanitizationService sanitizationService,
                             ImageService imageService) {
        this.storyService = storyService;
        this.authorService = authorService;
        this.seriesService = seriesService;
        this.tagService = tagService;
        this.readingPositionRepository = readingPositionRepository;
        this.sanitizationService = sanitizationService;
        this.imageService = imageService;
    }

    public EPUBImportResponse importEPUB(EPUBImportRequest request) {
        try {
            MultipartFile epubFile = request.getEpubFile();

            if (epubFile == null || epubFile.isEmpty()) {
                return EPUBImportResponse.error("EPUB file is required");
            }

            if (!isValidEPUBFile(epubFile)) {
                return EPUBImportResponse.error("Invalid EPUB file format");
            }

            Book book = parseEPUBFile(epubFile);

            Story story = createStoryFromEPUB(book, request);

            Story savedStory = storyService.create(story);

            EPUBImportResponse response = EPUBImportResponse.success(savedStory.getId(), savedStory.getTitle());
            response.setWordCount(savedStory.getWordCount());
            response.setTotalChapters(book.getSpine().size());

            if (request.getPreserveReadingPosition() != null && request.getPreserveReadingPosition()) {
                ReadingPosition readingPosition = extractReadingPosition(book, savedStory);
                if (readingPosition != null) {
                    ReadingPosition savedPosition = readingPositionRepository.save(readingPosition);
                    response.setReadingPosition(convertToDto(savedPosition));
                }
            }

            return response;

        } catch (Exception e) {
            return EPUBImportResponse.error("Failed to import EPUB: " + e.getMessage());
        }
    }

    private boolean isValidEPUBFile(MultipartFile file) {
        String filename = file.getOriginalFilename();
        if (filename == null || !filename.toLowerCase().endsWith(".epub")) {
            return false;
        }

        String contentType = file.getContentType();
        return "application/epub+zip".equals(contentType) ||
                "application/zip".equals(contentType) ||
                contentType == null;
    }

    private Book parseEPUBFile(MultipartFile epubFile) throws IOException {
        try (InputStream inputStream = epubFile.getInputStream()) {
            EpubReader epubReader = new EpubReader();
            return epubReader.readEpub(inputStream);
        } catch (Exception e) {
            throw new InvalidFileException("Failed to parse EPUB file: " + e.getMessage());
        }
    }

    private Story createStoryFromEPUB(Book book, EPUBImportRequest request) {
        Metadata metadata = book.getMetadata();

        String title = extractTitle(metadata);
        String authorName = extractAuthorName(metadata, request);
        String description = extractDescription(metadata);
        String content = extractContent(book);

        Story story = new Story();
        story.setTitle(title);
        story.setDescription(description);
        story.setContentHtml(sanitizationService.sanitize(content));

        // Extract and process cover image
        if (request.getExtractCover() == null || request.getExtractCover()) {
            String coverPath = extractAndSaveCoverImage(book);
            if (coverPath != null) {
                story.setCoverPath(coverPath);
            }
        }

        if (request.getAuthorId() != null) {
            try {
                Author author = authorService.findById(request.getAuthorId());
                story.setAuthor(author);
            } catch (ResourceNotFoundException e) {
                if (request.getCreateMissingAuthor()) {
                    Author newAuthor = createAuthor(authorName);
                    story.setAuthor(newAuthor);
                }
            }
        } else if (authorName != null && request.getCreateMissingAuthor()) {
            Author author = findOrCreateAuthor(authorName);
            story.setAuthor(author);
        }

        if (request.getSeriesId() != null && request.getSeriesVolume() != null) {
            try {
                Series series = seriesService.findById(request.getSeriesId());
                story.setSeries(series);
                story.setVolume(request.getSeriesVolume());
            } catch (ResourceNotFoundException e) {
                if (request.getCreateMissingSeries() && request.getSeriesName() != null) {
                    Series newSeries = createSeries(request.getSeriesName());
                    story.setSeries(newSeries);
                    story.setVolume(request.getSeriesVolume());
                }
            }
        }

        // Handle tags from request or extract from EPUB metadata
        List<String> allTags = new ArrayList<>();
        if (request.getTags() != null && !request.getTags().isEmpty()) {
            allTags.addAll(request.getTags());
        }

        // Extract subjects/keywords from EPUB metadata
        List<String> epubTags = extractTags(metadata);
        if (epubTags != null && !epubTags.isEmpty()) {
            allTags.addAll(epubTags);
        }

        // Remove duplicates and create tags
        allTags.stream()
                .distinct()
                .forEach(tagName -> {
                    Tag tag = tagService.findOrCreate(tagName.trim());
                    story.addTag(tag);
                });

        // Extract additional metadata for potential future use
        extractAdditionalMetadata(metadata, story);

        return story;
    }

    private String extractTitle(Metadata metadata) {
        List<String> titles = metadata.getTitles();
        if (titles != null && !titles.isEmpty()) {
            return titles.get(0);
        }
        return "Untitled EPUB";
    }

    private String extractAuthorName(Metadata metadata, EPUBImportRequest request) {
        if (request.getAuthorName() != null && !request.getAuthorName().trim().isEmpty()) {
            return request.getAuthorName().trim();
        }

        if (metadata.getAuthors() != null && !metadata.getAuthors().isEmpty()) {
            return metadata.getAuthors().get(0).getFirstname() + " " + metadata.getAuthors().get(0).getLastname();
        }

        return "Unknown Author";
    }

    private String extractDescription(Metadata metadata) {
        List<String> descriptions = metadata.getDescriptions();
        if (descriptions != null && !descriptions.isEmpty()) {
            return descriptions.get(0);
        }
        return null;
    }

    private List<String> extractTags(Metadata metadata) {
        List<String> tags = new ArrayList<>();

        // Extract subjects (main source of tags in EPUB)
        List<String> subjects = metadata.getSubjects();
        if (subjects != null && !subjects.isEmpty()) {
            tags.addAll(subjects);
        }

        // Extract keywords from meta tags
        String keywords = metadata.getMetaAttribute("keywords");
        if (keywords != null && !keywords.trim().isEmpty()) {
            String[] keywordArray = keywords.split("[,;]");
            for (String keyword : keywordArray) {
                String trimmed = keyword.trim();
                if (!trimmed.isEmpty()) {
                    tags.add(trimmed);
                }
            }
        }

        // Extract genre information
        String genre = metadata.getMetaAttribute("genre");
        if (genre != null && !genre.trim().isEmpty()) {
            tags.add(genre.trim());
        }

        return tags;
    }

    private void extractAdditionalMetadata(Metadata metadata, Story story) {
        // Extract language (could be useful for future i18n)
        String language = metadata.getLanguage();
        if (language != null && !language.trim().isEmpty()) {
            // Store as metadata in story description if needed
            // For now, we'll just log it for potential future use
            System.out.println("EPUB Language: " + language);
        }

        // Extract publisher information
        List<String> publishers = metadata.getPublishers();
        if (publishers != null && !publishers.isEmpty()) {
            String publisher = publishers.get(0);
            // Could append to description or store separately in future
            System.out.println("EPUB Publisher: " + publisher);
        }

        // Extract publication date
        List<nl.siegmann.epublib.domain.Date> dates = metadata.getDates();
        if (dates != null && !dates.isEmpty()) {
            for (nl.siegmann.epublib.domain.Date date : dates) {
                System.out.println("EPUB Date (" + date.getEvent() + "): " + date.getValue());
            }
        }

        // Extract ISBN or other identifiers
        List<nl.siegmann.epublib.domain.Identifier> identifiers = metadata.getIdentifiers();
        if (identifiers != null && !identifiers.isEmpty()) {
            for (nl.siegmann.epublib.domain.Identifier identifier : identifiers) {
                System.out.println("EPUB Identifier (" + identifier.getScheme() + "): " + identifier.getValue());
            }
        }
    }

    private String extractContent(Book book) {
        StringBuilder contentBuilder = new StringBuilder();

        List<SpineReference> spine = book.getSpine().getSpineReferences();
        for (SpineReference spineRef : spine) {
            try {
                Resource resource = spineRef.getResource();
                if (resource != null && resource.getData() != null) {
                    String html = new String(resource.getData(), "UTF-8");

                    Document doc = Jsoup.parse(html);
                    doc.select("script, style").remove();

                    String chapterContent = doc.body() != null ? doc.body().html() : doc.html();

                    contentBuilder.append("<div class=\"chapter\">")
                            .append(chapterContent)
                            .append("</div>");
                }
            } catch (Exception e) {
                // Skip this chapter on error
                continue;
            }
        }

        return contentBuilder.toString();
    }

    private Author findOrCreateAuthor(String authorName) {
        Optional<Author> existingAuthor = authorService.findByNameOptional(authorName);
        if (existingAuthor.isPresent()) {
            return existingAuthor.get();
        }
        return createAuthor(authorName);
    }

    private Author createAuthor(String authorName) {
        Author author = new Author();
        author.setName(authorName);
        return authorService.create(author);
    }

    private Series createSeries(String seriesName) {
        Series series = new Series();
        series.setName(seriesName);
        return seriesService.create(series);
    }

    private ReadingPosition extractReadingPosition(Book book, Story story) {
        try {
            Metadata metadata = book.getMetadata();

            String positionMeta = metadata.getMetaAttribute("reading-position");
            String cfiMeta = metadata.getMetaAttribute("epub-cfi");

            ReadingPosition position = new ReadingPosition(story);

            if (cfiMeta != null) {
                position.setEpubCfi(cfiMeta);
            }

            if (positionMeta != null) {
                try {
                    String[] parts = positionMeta.split(":");
                    if (parts.length >= 2) {
                        position.setChapterIndex(Integer.parseInt(parts[0]));
                        position.setWordPosition(Integer.parseInt(parts[1]));
                    }
                } catch (NumberFormatException e) {
                    // Ignore invalid position format
                }
            }

            return position;

        } catch (Exception e) {
            // Return null if no reading position found
            return null;
        }
    }

    private String extractAndSaveCoverImage(Book book) {
        try {
            Resource coverResource = book.getCoverImage();
            if (coverResource != null && coverResource.getData() != null) {
                // Create a temporary MultipartFile from the EPUB cover data
                byte[] imageData = coverResource.getData();
                String mediaType = coverResource.getMediaType() != null ?
                        coverResource.getMediaType().toString() : "image/jpeg";

                // Determine file extension from media type
                String extension = getExtensionFromMediaType(mediaType);
                String filename = "epub_cover_" + System.currentTimeMillis() + "." + extension;

                // Create a custom MultipartFile implementation for the cover image
                MultipartFile coverFile = new EPUBCoverMultipartFile(imageData, filename, mediaType);

                // Use ImageService to process and save the cover
                return imageService.uploadImage(coverFile, ImageService.ImageType.COVER);
            }
        } catch (Exception e) {
            // Log error but don't fail the import
            System.err.println("Failed to extract cover image: " + e.getMessage());
        }
        return null;
    }

    private String getExtensionFromMediaType(String mediaType) {
        switch (mediaType.toLowerCase()) {
            case "image/jpeg":
            case "image/jpg":
                return "jpg";
            case "image/png":
                return "png";
            case "image/gif":
                return "gif";
            case "image/webp":
                return "webp";
            default:
                return "jpg"; // Default fallback
        }
    }

    private ReadingPositionDto convertToDto(ReadingPosition position) {
        if (position == null) return null;

        ReadingPositionDto dto = new ReadingPositionDto();
        dto.setId(position.getId());
        dto.setStoryId(position.getStory().getId());
        dto.setChapterIndex(position.getChapterIndex());
        dto.setChapterTitle(position.getChapterTitle());
        dto.setWordPosition(position.getWordPosition());
        dto.setCharacterPosition(position.getCharacterPosition());
        dto.setPercentageComplete(position.getPercentageComplete());
        dto.setEpubCfi(position.getEpubCfi());
        dto.setContextBefore(position.getContextBefore());
        dto.setContextAfter(position.getContextAfter());
        dto.setCreatedAt(position.getCreatedAt());
        dto.setUpdatedAt(position.getUpdatedAt());

        return dto;
    }

    public List<String> validateEPUBFile(MultipartFile file) {
        List<String> errors = new ArrayList<>();

        if (file == null || file.isEmpty()) {
            errors.add("EPUB file is required");
            return errors;
        }

        if (!isValidEPUBFile(file)) {
            errors.add("Invalid EPUB file format. Only .epub files are supported");
        }

        if (file.getSize() > 100 * 1024 * 1024) { // 100MB limit
            errors.add("EPUB file size exceeds 100MB limit");
        }

        try {
            Book book = parseEPUBFile(file);
            if (book.getMetadata() == null) {
                errors.add("EPUB file contains no metadata");
            }
            if (book.getSpine() == null || book.getSpine().isEmpty()) {
                errors.add("EPUB file contains no readable content");
            }
        } catch (Exception e) {
            errors.add("Failed to parse EPUB file: " + e.getMessage());
        }

        return errors;
    }

    /**
     * Custom MultipartFile implementation for EPUB cover images
     */
    private static class EPUBCoverMultipartFile implements MultipartFile {
        private final byte[] data;
        private final String filename;
        private final String contentType;

        public EPUBCoverMultipartFile(byte[] data, String filename, String contentType) {
            this.data = data;
            this.filename = filename;
            this.contentType = contentType;
        }

        @Override
        public String getName() {
            return "coverImage";
        }

        @Override
        public String getOriginalFilename() {
            return filename;
        }

        @Override
        public String getContentType() {
            return contentType;
        }

        @Override
        public boolean isEmpty() {
            return data == null || data.length == 0;
        }

        @Override
        public long getSize() {
            return data != null ? data.length : 0;
        }

        @Override
        public byte[] getBytes() {
            return data;
        }

        @Override
        public InputStream getInputStream() {
            return new java.io.ByteArrayInputStream(data);
        }

        @Override
        public void transferTo(java.io.File dest) throws IOException {
            try (java.io.FileOutputStream fos = new java.io.FileOutputStream(dest)) {
                fos.write(data);
            }
        }

        @Override
        public void transferTo(java.nio.file.Path dest) throws IOException {
            java.nio.file.Files.write(dest, data);
        }
    }
}
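For orientation, a sketch of a multipart upload endpoint driving the import service above: `validateEPUBFile` first, then `importEPUB`. The request mapping, parameter name, and `EPUBImportRequest` setters are assumptions for illustration; the two service methods and `EPUBImportResponse.error(...)` are the ones defined above.

```java
// Sketch only: controller method fragment with assumed mapping and DTO setters.
@PostMapping(value = "/api/stories/import/epub", consumes = MediaType.MULTIPART_FORM_DATA_VALUE)
public ResponseEntity<EPUBImportResponse> importEpub(@RequestParam("file") MultipartFile file) {
    List<String> errors = epubImportService.validateEPUBFile(file);
    if (!errors.isEmpty()) {
        return ResponseEntity.badRequest().body(EPUBImportResponse.error(String.join("; ", errors)));
    }

    EPUBImportRequest request = new EPUBImportRequest();
    request.setEpubFile(file);                 // assumed setter on the DTO
    request.setPreserveReadingPosition(true);  // assumed setter
    request.setCreateMissingAuthor(true);      // assumed setter

    return ResponseEntity.ok(epubImportService.importEPUB(request));
}
```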
@@ -54,7 +54,7 @@ public class HtmlSanitizationService {
             "p", "br", "div", "span", "h1", "h2", "h3", "h4", "h5", "h6",
             "b", "strong", "i", "em", "u", "s", "strike", "del", "ins",
             "sup", "sub", "small", "big", "mark", "pre", "code",
-            "ul", "ol", "li", "dl", "dt", "dd", "a",
+            "ul", "ol", "li", "dl", "dt", "dd", "a", "img",
             "table", "thead", "tbody", "tfoot", "tr", "th", "td", "caption",
             "blockquote", "cite", "q", "hr"
     ));
@@ -65,7 +65,7 @@ public class HtmlSanitizationService {
     }
 
     private void createSafelist() {
-        this.allowlist = new Safelist();
+        this.allowlist = Safelist.relaxed();
 
         // Add allowed tags
         if (config.getAllowedTags() != null) {
@@ -83,7 +83,34 @@ public class HtmlSanitizationService {
             }
         }
 
-        // Remove specific attributes (like href from links for security)
+        // Special handling for img tags - allow all src attributes and validate later
+        allowlist.removeProtocols("img", "src", "http", "https");
+        // This is the key: preserve relative URLs by not restricting them
+        allowlist.preserveRelativeLinks(true);
+
+        // Configure allowed protocols for other attributes
+        if (config.getAllowedProtocols() != null) {
+            for (Map.Entry<String, Map<String, List<String>>> tagEntry : config.getAllowedProtocols().entrySet()) {
+                String tag = tagEntry.getKey();
+                Map<String, List<String>> attributeProtocols = tagEntry.getValue();
+
+                if (attributeProtocols != null) {
+                    for (Map.Entry<String, List<String>> attrEntry : attributeProtocols.entrySet()) {
+                        String attribute = attrEntry.getKey();
+                        List<String> protocols = attrEntry.getValue();
+
+                        if (protocols != null && !("img".equals(tag) && "src".equals(attribute))) {
+                            // Skip img src since we handled it above
+                            allowlist.addProtocols(tag, attribute, protocols.toArray(new String[0]));
+                        }
+                    }
+                }
+            }
+        }
+
+        logger.info("Configured Jsoup Safelist with preserveRelativeLinks=true for local image URLs");
+
+        // Remove specific attributes if needed (deprecated in favor of protocol control)
         if (config.getRemovedAttributes() != null) {
             for (Map.Entry<String, List<String>> entry : config.getRemovedAttributes().entrySet()) {
                 String tag = entry.getKey();
@@ -114,8 +141,10 @@ public class HtmlSanitizationService {
         if (html == null || html.trim().isEmpty()) {
             return "";
         }
-        return Jsoup.clean(html, allowlist);
+        logger.info("Content before sanitization: "+html);
+        String saniztedHtml = Jsoup.clean(html, allowlist.preserveRelativeLinks(true));
+        logger.info("Content after sanitization: "+saniztedHtml);
+        return saniztedHtml;
     }
 
     public String extractPlainText(String html) {
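The hunks above swap the hand-built allowlist for `Safelist.relaxed()`, drop the protocol restriction on `img src`, and enable `preserveRelativeLinks` so locally rewritten image URLs (e.g. `/api/files/images/...`) survive sanitization. A minimal standalone check of that jsoup behavior, with invented sample markup:

```java
import org.jsoup.Jsoup;
import org.jsoup.safety.Safelist;

public class RelativeImgSanitizeDemo {
    public static void main(String[] args) {
        String html = "<p>Hi<img src=\"/api/files/images/lib1/content/abc.jpg\">"
                + "<script>alert(1)</script></p>";

        // Same calls as the diff: removing http/https from img src lifts the
        // protocol restriction entirely, so the local relative src is kept,
        // while relaxed() still strips <script> and other unsafe markup.
        Safelist safelist = Safelist.relaxed()
                .removeProtocols("img", "src", "http", "https")
                .preserveRelativeLinks(true);

        System.out.println(Jsoup.clean(html, safelist));
    }
}
```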
@@ -1,5 +1,8 @@
 package com.storycove.service;
 
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.beans.factory.annotation.Value;
 import org.springframework.stereotype.Service;
 import org.springframework.web.multipart.MultipartFile;
@@ -7,28 +10,40 @@ import org.springframework.web.multipart.MultipartFile;
 import javax.imageio.ImageIO;
 import java.awt.*;
 import java.awt.image.BufferedImage;
-import java.io.ByteArrayInputStream;
-import java.io.ByteArrayOutputStream;
-import java.io.IOException;
+import java.io.*;
+import java.net.HttpURLConnection;
+import java.net.URL;
 import java.nio.file.Files;
 import java.nio.file.Path;
 import java.nio.file.Paths;
-import java.util.Set;
-import java.util.UUID;
+import java.util.*;
+import java.util.List;
+import java.util.regex.Matcher;
+import java.util.regex.Pattern;
 
 @Service
 public class ImageService {
 
+    private static final Logger logger = LoggerFactory.getLogger(ImageService.class);
+
     private static final Set<String> ALLOWED_CONTENT_TYPES = Set.of(
-            "image/jpeg", "image/jpg", "image/png", "image/webp"
+            "image/jpeg", "image/jpg", "image/png"
     );
 
     private static final Set<String> ALLOWED_EXTENSIONS = Set.of(
-            "jpg", "jpeg", "png", "webp"
+            "jpg", "jpeg", "png"
     );
 
     @Value("${storycove.images.upload-dir:/app/images}")
-    private String uploadDir;
+    private String baseUploadDir;
+
+    @Autowired
+    private LibraryService libraryService;
+
+    private String getUploadDir() {
+        String libraryPath = libraryService.getCurrentImagePath();
+        return baseUploadDir + libraryPath;
+    }
 
     @Value("${storycove.images.cover.max-width:800}")
     private int coverMaxWidth;
@@ -44,7 +59,8 @@ public class ImageService {
 
     public enum ImageType {
         COVER("covers"),
-        AVATAR("avatars");
+        AVATAR("avatars"),
+        CONTENT("content");
 
         private final String directory;
 
@@ -61,7 +77,7 @@ public class ImageService {
         validateFile(file);
 
         // Create directories if they don't exist
-        Path typeDir = Paths.get(uploadDir, imageType.getDirectory());
+        Path typeDir = Paths.get(getUploadDir(), imageType.getDirectory());
         Files.createDirectories(typeDir);
 
         // Generate unique filename
@@ -88,7 +104,7 @@ public class ImageService {
         }
 
         try {
-            Path fullPath = Paths.get(uploadDir, imagePath);
+            Path fullPath = Paths.get(getUploadDir(), imagePath);
             return Files.deleteIfExists(fullPath);
         } catch (IOException e) {
             return false;
@@ -96,7 +112,7 @@ public class ImageService {
     }
 
     public Path getImagePath(String imagePath) {
-        return Paths.get(uploadDir, imagePath);
+        return Paths.get(getUploadDir(), imagePath);
     }
 
     public boolean imageExists(String imagePath) {
@@ -107,6 +123,19 @@ public class ImageService {
         return Files.exists(getImagePath(imagePath));
     }
 
+    public boolean imageExistsInLibrary(String imagePath, String libraryId) {
+        if (imagePath == null || imagePath.trim().isEmpty() || libraryId == null) {
+            return false;
+        }
+
+        return Files.exists(getImagePathInLibrary(imagePath, libraryId));
+    }
+
+    public Path getImagePathInLibrary(String imagePath, String libraryId) {
+        String libraryPath = libraryService.getImagePathForLibrary(libraryId);
+        return Paths.get(baseUploadDir + libraryPath, imagePath);
+    }
+
     private void validateFile(MultipartFile file) throws IOException {
         if (file == null || file.isEmpty()) {
             throw new IllegalArgumentException("File is empty");
@@ -160,6 +189,9 @@ public class ImageService {
                 maxWidth = avatarMaxSize;
                 maxHeight = avatarMaxSize;
                 break;
+            case CONTENT:
+                // Content images are not resized
+                return new Dimension(originalWidth, originalHeight);
             default:
                 return new Dimension(originalWidth, originalHeight);
         }
@@ -206,4 +238,224 @@ public class ImageService {
         String extension = getFileExtension(filename);
         return ALLOWED_EXTENSIONS.contains(extension);
     }
+
+    // Content image processing methods
+
+    /**
+     * Process HTML content and download all referenced images, replacing URLs with local paths
+     */
+    public ContentImageProcessingResult processContentImages(String htmlContent, UUID storyId) {
+        logger.info("Processing content images for story: {}, content length: {}", storyId,
+                htmlContent != null ? htmlContent.length() : 0);
+
+        List<String> warnings = new ArrayList<>();
+        List<String> downloadedImages = new ArrayList<>();
+
+        if (htmlContent == null || htmlContent.trim().isEmpty()) {
+            logger.info("No content to process for story: {}", storyId);
+            return new ContentImageProcessingResult(htmlContent, warnings, downloadedImages);
+        }
+
+        // Find all img tags with src attributes
+        Pattern imgPattern = Pattern.compile("<img[^>]+src=[\"']([^\"']+)[\"'][^>]*>", Pattern.CASE_INSENSITIVE);
+        Matcher matcher = imgPattern.matcher(htmlContent);
+
+        int imageCount = 0;
+        int externalImageCount = 0;
+
+        StringBuffer processedContent = new StringBuffer();
+
+        while (matcher.find()) {
+            String fullImgTag = matcher.group(0);
+            String imageUrl = matcher.group(1);
+            imageCount++;
+
+            logger.info("Found image #{}: {} in tag: {}", imageCount, imageUrl, fullImgTag);
+
+            try {
+                // Skip if it's already a local path or data URL
+                if (imageUrl.startsWith("/") || imageUrl.startsWith("data:")) {
+                    logger.info("Skipping local/data URL: {}", imageUrl);
+                    matcher.appendReplacement(processedContent, Matcher.quoteReplacement(fullImgTag));
+                    continue;
+                }
+
+                externalImageCount++;
+                logger.info("Processing external image #{}: {}", externalImageCount, imageUrl);
+
+                // Download and store the image
+                String localPath = downloadImageFromUrl(imageUrl, storyId);
+                downloadedImages.add(localPath);
+
+                // Generate local URL
+                String localUrl = getLocalImageUrl(storyId, localPath);
+                logger.info("Downloaded image: {} -> {}", imageUrl, localUrl);
+
+                // Replace the src attribute with the local path - handle both single and double quotes
+                String newImgTag = fullImgTag
+                        .replaceFirst("src=\"" + Pattern.quote(imageUrl) + "\"", "src=\"" + localUrl + "\"")
+                        .replaceFirst("src='" + Pattern.quote(imageUrl) + "'", "src=\"" + localUrl + "\"");
+
+                // If replacement didn't work, try a more generic approach
+                if (newImgTag.equals(fullImgTag)) {
+                    logger.warn("Standard replacement failed for image URL: {}, trying generic replacement", imageUrl);
+                    newImgTag = fullImgTag.replaceAll("src\\s*=\\s*[\"']?" + Pattern.quote(imageUrl) + "[\"']?", "src=\"" + localUrl + "\"");
+                }
+
+                logger.info("Replaced img tag: {} -> {}", fullImgTag, newImgTag);
+                matcher.appendReplacement(processedContent, Matcher.quoteReplacement(newImgTag));
+
+            } catch (Exception e) {
+                logger.error("Failed to download image: {} - {}", imageUrl, e.getMessage(), e);
+                warnings.add("Failed to download image: " + imageUrl + " - " + e.getMessage());
+                // Keep original URL in case of failure
+                matcher.appendReplacement(processedContent, Matcher.quoteReplacement(fullImgTag));
+            }
+        }
+
+        matcher.appendTail(processedContent);
+
+        logger.info("Finished processing images for story: {}. Found {} total images, {} external. Downloaded {} images, {} warnings",
+                storyId, imageCount, externalImageCount, downloadedImages.size(), warnings.size());
+
+        return new ContentImageProcessingResult(processedContent.toString(), warnings, downloadedImages);
+    }
+
+    /**
+     * Download an image from a URL and store it locally
+     */
+    private String downloadImageFromUrl(String imageUrl, UUID storyId) throws IOException {
+        URL url = new URL(imageUrl);
+        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
+
+        // Set a reasonable user agent to avoid blocks
+        connection.setRequestProperty("User-Agent", "Mozilla/5.0 (StoryCove Image Processor)");
+        connection.setConnectTimeout(30000); // 30 seconds
+        connection.setReadTimeout(30000);
+
+        try (InputStream inputStream = connection.getInputStream()) {
+            // Get content type to determine file extension
+            String contentType = connection.getContentType();
+            String extension = getExtensionFromContentType(contentType);
+
+            if (extension == null) {
+                // Try to extract from URL
+                extension = getExtensionFromUrl(imageUrl);
+            }
+
+            if (extension == null || !ALLOWED_EXTENSIONS.contains(extension.toLowerCase())) {
+                throw new IllegalArgumentException("Unsupported image format: " + contentType);
+            }
+
+            // Create directories for content images
+            Path contentDir = Paths.get(getUploadDir(), ImageType.CONTENT.getDirectory(), storyId.toString());
+            Files.createDirectories(contentDir);
+
+            // Generate unique filename
+            String filename = UUID.randomUUID().toString() + "." + extension.toLowerCase();
+            Path filePath = contentDir.resolve(filename);
+
+            // Read and validate the image
+            byte[] imageData = inputStream.readAllBytes();
+            ByteArrayInputStream bais = new ByteArrayInputStream(imageData);
+            BufferedImage image = ImageIO.read(bais);
+
+            if (image == null) {
+                throw new IOException("Invalid image format");
+            }
+
+            // Save the image
+            Files.write(filePath, imageData);
+
+            // Return relative path
+            return ImageType.CONTENT.getDirectory() + "/" + storyId.toString() + "/" + filename;
+
+        } finally {
+            connection.disconnect();
+        }
+    }
+
+    /**
+     * Generate local image URL for serving
+     */
+    private String getLocalImageUrl(UUID storyId, String imagePath) {
+        String currentLibraryId = libraryService.getCurrentLibraryId();
+        if (currentLibraryId == null || currentLibraryId.trim().isEmpty()) {
+            logger.warn("Current library ID is null or empty when generating local image URL for story: {}", storyId);
+            return "/api/files/images/default/" + imagePath;
+        }
+        String localUrl = "/api/files/images/" + currentLibraryId + "/" + imagePath;
+        logger.info("Generated local image URL: {} for story: {}", localUrl, storyId);
+        return localUrl;
+    }
+
+    /**
+     * Get file extension from content type
+     */
+    private String getExtensionFromContentType(String contentType) {
+        if (contentType == null) return null;
+
+        switch (contentType.toLowerCase()) {
+            case "image/jpeg":
+            case "image/jpg":
+                return "jpg";
+            case "image/png":
+                return "png";
+            default:
+                return null;
+        }
+    }
+
+    /**
+     * Extract file extension from URL
+     */
+    private String getExtensionFromUrl(String url) {
+        try {
+            String path = new URL(url).getPath();
+            int lastDot = path.lastIndexOf('.');
+            if (lastDot > 0 && lastDot < path.length() - 1) {
+                return path.substring(lastDot + 1).toLowerCase();
+            }
+        } catch (Exception ignored) {
+        }
+        return null;
+    }
+
+    /**
+     * Clean up content images for a story
+     */
+    public void deleteContentImages(UUID storyId) {
+        try {
+            Path contentDir = Paths.get(getUploadDir(), ImageType.CONTENT.getDirectory(), storyId.toString());
+            if (Files.exists(contentDir)) {
+                Files.walk(contentDir)
+                        .sorted(Comparator.reverseOrder())
+                        .map(Path::toFile)
+                        .forEach(java.io.File::delete);
+            }
+        } catch (IOException e) {
+            // Log but don't throw - this is cleanup
+            System.err.println("Failed to clean up content images for story " + storyId + ": " + e.getMessage());
+        }
+    }
+
+    /**
+     * Result class for content image processing
+     */
+    public static class ContentImageProcessingResult {
+        private final String processedContent;
+        private final List<String> warnings;
+        private final List<String> downloadedImages;
+
+        public ContentImageProcessingResult(String processedContent, List<String> warnings, List<String> downloadedImages) {
+            this.processedContent = processedContent;
+            this.warnings = warnings;
+            this.downloadedImages = downloadedImages;
+        }
+
+        public String getProcessedContent() { return processedContent; }
+        public List<String> getWarnings() { return warnings; }
+        public List<String> getDownloadedImages() { return downloadedImages; }
+        public boolean hasWarnings() { return !warnings.isEmpty(); }
+    }
 }
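A sketch of how the new `processContentImages` helper might be wired into story creation, using only the API added above; the surrounding `story`, `imageService`, and `logger` variables are assumed context, not the project's actual call site:

```java
// Sketch only: assumes a story entity and logger are in scope at the call site.
ImageService.ContentImageProcessingResult result =
        imageService.processContentImages(story.getContentHtml(), story.getId());

story.setContentHtml(result.getProcessedContent());
if (result.hasWarnings()) {
    // Surface partial failures (e.g. unreachable image hosts) without aborting the save.
    result.getWarnings().forEach(w -> logger.warn("Content image import: {}", w));
}
```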
@@ -0,0 +1,73 @@
package com.storycove.service;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Component;

import javax.sql.DataSource;
import java.sql.Connection;
import java.sql.SQLException;

/**
 * Base service class that provides library-aware database access.
 *
 * This approach is safer than routing at the datasource level because:
 * 1. It doesn't interfere with Spring's initialization process
 * 2. It allows fine-grained control over which operations are library-aware
 * 3. It provides clear separation between authentication (uses default DB) and library operations
 */
@Component
public class LibraryAwareService {

    @Autowired
    private LibraryService libraryService;

    @Autowired
    @Qualifier("dataSource")
    private DataSource defaultDataSource;

    /**
     * Get a database connection for the current active library.
     * Falls back to default datasource if no library is active.
     */
    public Connection getCurrentLibraryConnection() throws SQLException {
        try {
            // Try to get library-specific connection
            DataSource libraryDataSource = libraryService.getCurrentDataSource();
            return libraryDataSource.getConnection();
        } catch (IllegalStateException e) {
            // No active library - use default datasource
            return defaultDataSource.getConnection();
        }
    }

    /**
     * Get a database connection for the default/fallback database.
     * Use this for authentication and system-level operations.
     */
    public Connection getDefaultConnection() throws SQLException {
        return defaultDataSource.getConnection();
    }

    /**
     * Check if a library is currently active
     */
    public boolean hasActiveLibrary() {
        try {
            return libraryService.getCurrentLibraryId() != null;
        } catch (Exception e) {
            return false;
        }
    }

    /**
     * Get the current active library ID, or null if none
     */
    public String getCurrentLibraryId() {
        try {
            return libraryService.getCurrentLibraryId();
        } catch (Exception e) {
            return null;
        }
    }
}
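A sketch of how a caller might use `LibraryAwareService` with plain JDBC, checking for an active library first; the `stories` table name is a placeholder, not taken from the actual schema:

```java
// Sketch only: "stories" is a placeholder table name.
public long countRowsInActiveLibrary(LibraryAwareService libraryAware) throws SQLException {
    if (!libraryAware.hasActiveLibrary()) {
        return 0L;
    }
    try (Connection conn = libraryAware.getCurrentLibraryConnection();
         PreparedStatement ps = conn.prepareStatement("SELECT COUNT(*) FROM stories");
         ResultSet rs = ps.executeQuery()) {
        return rs.next() ? rs.getLong(1) : 0L;
    }
}
```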
862 backend/src/main/java/com/storycove/service/LibraryService.java Normal file
@@ -0,0 +1,862 @@
|
|||||||
|
package com.storycove.service;
|
||||||
|
|
||||||
|
import com.storycove.entity.Library;
|
||||||
|
import com.storycove.dto.LibraryDto;
|
||||||
|
import com.fasterxml.jackson.core.type.TypeReference;
|
||||||
|
import com.fasterxml.jackson.databind.ObjectMapper;
|
||||||
|
import com.zaxxer.hikari.HikariConfig;
|
||||||
|
import com.zaxxer.hikari.HikariDataSource;
|
||||||
|
import org.slf4j.Logger;
|
||||||
|
import org.slf4j.LoggerFactory;
|
||||||
|
import org.springframework.beans.factory.annotation.Value;
|
||||||
|
import org.springframework.context.ApplicationContext;
|
||||||
|
import org.springframework.context.ApplicationContextAware;
|
||||||
|
import org.springframework.security.crypto.bcrypt.BCryptPasswordEncoder;
|
||||||
|
import org.springframework.stereotype.Service;
|
||||||
|
import org.typesense.api.Client;
|
||||||
|
import org.typesense.resources.Node;
|
||||||
|
|
||||||
|
import jakarta.annotation.PostConstruct;
|
||||||
|
import jakarta.annotation.PreDestroy;
|
||||||
|
import javax.sql.DataSource;
|
||||||
|
import java.io.File;
|
||||||
|
import java.io.IOException;
|
||||||
|
import java.nio.charset.StandardCharsets;
|
||||||
|
import java.nio.file.Files;
|
||||||
|
import java.nio.file.Path;
|
||||||
|
import java.nio.file.Paths;
|
||||||
|
import java.sql.SQLException;
|
||||||
|
import java.time.Duration;
|
||||||
|
import java.util.*;
|
||||||
|
import java.util.concurrent.ConcurrentHashMap;
|
||||||
|
|
||||||
|
@Service
|
||||||
|
public class LibraryService implements ApplicationContextAware {
|
||||||
|
private static final Logger logger = LoggerFactory.getLogger(LibraryService.class);
|
||||||
|
|
||||||
|
@Value("${spring.datasource.url}")
|
||||||
|
private String baseDbUrl;
|
||||||
|
|
||||||
|
@Value("${spring.datasource.username}")
|
||||||
|
private String dbUsername;
|
||||||
|
|
||||||
|
@Value("${spring.datasource.password}")
|
||||||
|
private String dbPassword;
|
||||||
|
|
||||||
|
@Value("${typesense.host}")
|
||||||
|
private String typesenseHost;
|
||||||
|
|
||||||
|
@Value("${typesense.port}")
|
||||||
|
private String typesensePort;
|
||||||
|
|
||||||
|
@Value("${typesense.api-key}")
|
||||||
|
private String typesenseApiKey;
|
||||||
|
|
||||||
|
private final ObjectMapper objectMapper = new ObjectMapper();
|
||||||
|
private final BCryptPasswordEncoder passwordEncoder = new BCryptPasswordEncoder();
|
||||||
|
private final Map<String, Library> libraries = new ConcurrentHashMap<>();
|
||||||
|
|
||||||
|
// Spring ApplicationContext for accessing other services without circular dependencies
|
||||||
|
private ApplicationContext applicationContext;
|
||||||
|
|
||||||
|
// Current active resources
|
||||||
|
private volatile String currentLibraryId;
|
||||||
|
private volatile Client currentTypesenseClient;
|
||||||
|
|
||||||
|
// Security: Track if user has explicitly authenticated in this session
|
||||||
|
private volatile boolean explicitlyAuthenticated = false;
|
||||||
|
|
||||||
|
private static final String LIBRARIES_CONFIG_PATH = "/app/config/libraries.json";
|
||||||
|
private static final Path libraryConfigDir = Paths.get("/app/config");
|
||||||
|
|
||||||
|
@Override
|
||||||
|
public void setApplicationContext(ApplicationContext applicationContext) {
|
||||||
|
this.applicationContext = applicationContext;
|
||||||
|
}
|
||||||
|
|
||||||
|
@PostConstruct
|
||||||
|
public void initialize() {
|
||||||
|
loadLibrariesFromFile();
|
||||||
|
|
||||||
|
// If no libraries exist, create a default one
|
||||||
|
if (libraries.isEmpty()) {
|
||||||
|
createDefaultLibrary();
|
||||||
|
}
|
||||||
|
|
||||||
|
// Security: Do NOT automatically switch to any library on startup
|
||||||
|
// Users must authenticate before accessing any library
|
||||||
|
explicitlyAuthenticated = false;
|
||||||
|
currentLibraryId = null;
|
||||||
|
|
||||||
|
if (!libraries.isEmpty()) {
|
||||||
|
logger.info("Loaded {} libraries. Authentication required to access any library.", libraries.size());
|
||||||
|
} else {
|
||||||
|
logger.info("No libraries found. A default library will be created on first authentication.");
|
||||||
|
}
|
||||||
|
|
||||||
|
logger.info("Security: Application startup completed. All users must re-authenticate.");
|
||||||
|
}

    @PreDestroy
    public void cleanup() {
        currentLibraryId = null;
        currentTypesenseClient = null;
        explicitlyAuthenticated = false;
    }

    /**
     * Clear authentication state (for logout)
     */
    public void clearAuthentication() {
        explicitlyAuthenticated = false;
        currentLibraryId = null;
        currentTypesenseClient = null;
        logger.info("Authentication cleared - user must re-authenticate to access libraries");
    }

    public String authenticateAndGetLibrary(String password) {
        for (Library library : libraries.values()) {
            if (passwordEncoder.matches(password, library.getPasswordHash())) {
                // Mark as explicitly authenticated for this session
                explicitlyAuthenticated = true;
                logger.info("User explicitly authenticated for library: {}", library.getId());
                return library.getId();
            }
        }
        return null; // Authentication failed
    }

    /**
     * Switch to library after authentication with forced reindexing
     * This ensures Typesense is always up-to-date after login
     */
    public synchronized void switchToLibraryAfterAuthentication(String libraryId) throws Exception {
        logger.info("Switching to library after authentication: {} (forcing reindex)", libraryId);
        switchToLibrary(libraryId, true);
    }

    public synchronized void switchToLibrary(String libraryId) throws Exception {
        switchToLibrary(libraryId, false);
    }

    public synchronized void switchToLibrary(String libraryId, boolean forceReindex) throws Exception {
        // Security: Only allow library switching after explicit authentication
        if (!explicitlyAuthenticated) {
            throw new IllegalStateException("Library switching requires explicit authentication. Please log in first.");
        }

        if (libraryId.equals(currentLibraryId) && !forceReindex) {
            return; // Already active and no forced reindex requested
        }

        Library library = libraries.get(libraryId);
        if (library == null) {
            throw new IllegalArgumentException("Library not found: " + libraryId);
        }

        String previousLibraryId = currentLibraryId;

        if (libraryId.equals(currentLibraryId) && forceReindex) {
            logger.info("Forcing reindex for current library: {} ({})", library.getName(), libraryId);
        } else {
            logger.info("Switching to library: {} ({})", library.getName(), libraryId);
        }

        // Close current resources
        closeCurrentResources();

        // Set new active library (datasource routing handled by SmartRoutingDataSource)
        currentLibraryId = libraryId;
        currentTypesenseClient = createTypesenseClient(library.getTypesenseCollection());

        // Initialize Typesense collections for this library
        try {
            TypesenseService typesenseService = applicationContext.getBean(TypesenseService.class);
            // First ensure collections exist
            typesenseService.initializeCollectionsForCurrentLibrary();
            logger.info("Completed Typesense initialization for library: {}", libraryId);
        } catch (Exception e) {
            logger.warn("Failed to initialize Typesense for library {}: {}", libraryId, e.getMessage());
            // Don't fail the switch - collections can be created later
        }

        logger.info("Successfully switched to library: {}", library.getName());

        // Perform complete reindex AFTER library switch is fully complete
        // This ensures database routing is properly established
        if (forceReindex || !libraryId.equals(previousLibraryId)) {
            logger.info("Starting post-switch Typesense reindex for library: {}", libraryId);

            // Run reindex asynchronously to avoid blocking authentication response
            // and allow time for database routing to fully stabilize
            String finalLibraryId = libraryId;
            new Thread(() -> {
                try {
                    // Give routing time to stabilize
                    Thread.sleep(500);
                    logger.info("Starting async Typesense reindex for library: {}", finalLibraryId);

                    TypesenseService typesenseService = applicationContext.getBean(TypesenseService.class);
                    typesenseService.performCompleteReindex();
                    logger.info("Completed async Typesense reindexing for library: {}", finalLibraryId);
                } catch (Exception e) {
                    logger.warn("Failed to async reindex Typesense for library {}: {}", finalLibraryId, e.getMessage());
                }
            }, "TypesenseReindex-" + libraryId).start();
        }
    }

    public DataSource getCurrentDataSource() {
        if (currentLibraryId == null) {
            throw new IllegalStateException("No active library - please authenticate first");
        }
        // Return the Spring-managed primary datasource which handles routing automatically
        try {
            return applicationContext.getBean("dataSource", DataSource.class);
        } catch (Exception e) {
            throw new IllegalStateException("Failed to get routing datasource", e);
        }
    }

    public Client getCurrentTypesenseClient() {
        if (currentTypesenseClient == null) {
            throw new IllegalStateException("No active library - please authenticate first");
        }
        return currentTypesenseClient;
    }

    public String getCurrentLibraryId() {
        return currentLibraryId;
    }

    public Library getCurrentLibrary() {
        if (currentLibraryId == null) {
            return null;
        }
        return libraries.get(currentLibraryId);
    }

    public List<LibraryDto> getAllLibraries() {
        List<LibraryDto> result = new ArrayList<>();
        for (Library library : libraries.values()) {
            boolean isActive = library.getId().equals(currentLibraryId);
            result.add(new LibraryDto(
                    library.getId(),
                    library.getName(),
                    library.getDescription(),
                    isActive,
                    library.isInitialized()
            ));
        }
        return result;
    }

    public LibraryDto getLibraryById(String libraryId) {
        Library library = libraries.get(libraryId);
        if (library != null) {
            boolean isActive = library.getId().equals(currentLibraryId);
            return new LibraryDto(
                    library.getId(),
                    library.getName(),
                    library.getDescription(),
                    isActive,
                    library.isInitialized()
            );
        }
        return null;
    }

    public String getCurrentImagePath() {
        Library current = getCurrentLibrary();
        return current != null ? current.getImagePath() : "/images/default";
    }

    public String getImagePathForLibrary(String libraryId) {
        if (libraryId == null) {
            return "/images/default";
        }

        Library library = libraries.get(libraryId);
        return library != null ? library.getImagePath() : "/images/default";
    }

    public boolean changeLibraryPassword(String libraryId, String currentPassword, String newPassword) {
        Library library = libraries.get(libraryId);
        if (library == null) {
            return false;
        }

        // Verify current password
        if (!passwordEncoder.matches(currentPassword, library.getPasswordHash())) {
            return false;
        }

        // Update password
        library.setPasswordHash(passwordEncoder.encode(newPassword));
        saveLibrariesToFile();

        logger.info("Password changed for library: {}", library.getName());
        return true;
    }

    public Library createNewLibrary(String name, String description, String password) {
        // Generate unique ID
        String id = name.toLowerCase().replaceAll("[^a-z0-9]", "");
        int counter = 1;
        String originalId = id;
        while (libraries.containsKey(id)) {
            id = originalId + counter++;
        }

        Library newLibrary = new Library(
                id,
                name,
                description,
                passwordEncoder.encode(password),
                "storycove_" + id
        );

        try {
            // Test database creation by creating a connection
            DataSource testDs = createDataSource(newLibrary.getDbName());
            testDs.getConnection().close(); // This will create the database and schema if it doesn't exist

            // Initialize library resources (image directories)
            initializeNewLibraryResources(id);

            newLibrary.setInitialized(true);
            logger.info("Database and resources created for library: {}", newLibrary.getDbName());
        } catch (Exception e) {
            logger.warn("Database/resource creation failed for library {}: {}", id, e.getMessage());
            // Continue anyway - resources will be created when needed
        }

        libraries.put(id, newLibrary);
        saveLibrariesToFile();

        logger.info("Created new library: {} ({})", name, id);
        return newLibrary;
    }
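
    // Illustrative example of the ID generation above (values are hypothetical, not from this changeset):
    // createNewLibrary("My Fantasy Shelf!", "cozy reads", "s3cret") slugs the name to the id "myfantasyshelf"
    // and the database name "storycove_myfantasyshelf"; if that id is already taken, the collision counter
    // yields "myfantasyshelf1", "myfantasyshelf2", and so on.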

    private void loadLibrariesFromFile() {
        try {
            File configFile = new File(LIBRARIES_CONFIG_PATH);
            if (configFile.exists()) {
                String content = Files.readString(Paths.get(LIBRARIES_CONFIG_PATH));
                Map<String, Object> config = objectMapper.readValue(content, new TypeReference<Map<String, Object>>() {});

                @SuppressWarnings("unchecked")
                Map<String, Map<String, Object>> librariesData = (Map<String, Map<String, Object>>) config.get("libraries");

                for (Map.Entry<String, Map<String, Object>> entry : librariesData.entrySet()) {
                    String id = entry.getKey();
                    Map<String, Object> data = entry.getValue();

                    Library library = new Library();
                    library.setId(id);
                    library.setName((String) data.get("name"));
                    library.setDescription((String) data.get("description"));
                    library.setPasswordHash((String) data.get("passwordHash"));
                    library.setDbName((String) data.get("dbName"));
                    library.setInitialized((Boolean) data.getOrDefault("initialized", false));

                    libraries.put(id, library);
                    logger.info("Loaded library: {} ({})", library.getName(), id);
                }
            } else {
                logger.info("No libraries configuration file found, will create default");
            }
        } catch (IOException e) {
            logger.error("Failed to load libraries configuration", e);
        }
    }

    private void createDefaultLibrary() {
        // Check if we're migrating from the old single-library system
        String existingDbName = extractDatabaseName(baseDbUrl);

        Library defaultLibrary = new Library(
                "main",
                "Main Library",
                "Your existing story collection (migrated)",
                passwordEncoder.encode("temp-password-change-me"), // Temporary password
                existingDbName // Use existing database name
        );
        defaultLibrary.setInitialized(true); // Mark as initialized since it has existing data

        libraries.put("main", defaultLibrary);
        saveLibrariesToFile();

        logger.warn("=".repeat(80));
        logger.warn("MIGRATION: Created 'Main Library' for your existing data");
        logger.warn("Temporary password: 'temp-password-change-me'");
        logger.warn("IMPORTANT: Please set a proper password in Settings > Library Settings");
        logger.warn("=".repeat(80));
    }

    private String extractDatabaseName(String jdbcUrl) {
        // Extract database name from JDBC URL like "jdbc:postgresql://db:5432/storycove"
        int lastSlash = jdbcUrl.lastIndexOf('/');
        if (lastSlash != -1 && lastSlash < jdbcUrl.length() - 1) {
            String dbPart = jdbcUrl.substring(lastSlash + 1);
            // Remove any query parameters
            int queryStart = dbPart.indexOf('?');
            return queryStart != -1 ? dbPart.substring(0, queryStart) : dbPart;
        }
        return "storycove"; // fallback
    }
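
    // Worked example for extractDatabaseName (the query string is a hypothetical illustration):
    // "jdbc:postgresql://db:5432/storycove?sslmode=disable" -> "storycove"
    // Only the last path segment before any '?' is kept; "storycove" is returned as the fallback.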

    private void saveLibrariesToFile() {
        try {
            Map<String, Object> config = new HashMap<>();
            Map<String, Map<String, Object>> librariesData = new HashMap<>();

            for (Library library : libraries.values()) {
                Map<String, Object> data = new HashMap<>();
                data.put("name", library.getName());
                data.put("description", library.getDescription());
                data.put("passwordHash", library.getPasswordHash());
                data.put("dbName", library.getDbName());
                data.put("initialized", library.isInitialized());

                librariesData.put(library.getId(), data);
            }

            config.put("libraries", librariesData);

            // Ensure config directory exists
            new File("/app/config").mkdirs();

            String json = objectMapper.writerWithDefaultPrettyPrinter().writeValueAsString(config);
            Files.writeString(Paths.get(LIBRARIES_CONFIG_PATH), json);

            logger.info("Saved libraries configuration");
        } catch (IOException e) {
            logger.error("Failed to save libraries configuration", e);
        }
    }
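
    // Resulting shape of /app/config/libraries.json, given the keys written above
    // (field values here are hypothetical):
    //
    // {
    //   "libraries" : {
    //     "main" : {
    //       "name" : "Main Library",
    //       "description" : "Your existing story collection (migrated)",
    //       "passwordHash" : "$2a$10$...",
    //       "dbName" : "storycove",
    //       "initialized" : true
    //     }
    //   }
    // }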

    private DataSource createDataSource(String dbName) {
        String url = baseDbUrl.replaceAll("/[^/]*$", "/" + dbName);
        logger.info("Creating DataSource for: {}", url);

        // First, ensure the database exists
        ensureDatabaseExists(dbName);

        HikariConfig config = new HikariConfig();
        config.setJdbcUrl(url);
        config.setUsername(dbUsername);
        config.setPassword(dbPassword);
        config.setDriverClassName("org.postgresql.Driver");
        config.setMaximumPoolSize(10);
        config.setConnectionTimeout(30000);

        return new HikariDataSource(config);
    }

    private void ensureDatabaseExists(String dbName) {
        // Connect to the 'postgres' database to create the new database
        String adminUrl = baseDbUrl.replaceAll("/[^/]*$", "/postgres");

        HikariConfig adminConfig = new HikariConfig();
        adminConfig.setJdbcUrl(adminUrl);
        adminConfig.setUsername(dbUsername);
        adminConfig.setPassword(dbPassword);
        adminConfig.setDriverClassName("org.postgresql.Driver");
        adminConfig.setMaximumPoolSize(1);
        adminConfig.setConnectionTimeout(30000);

        boolean databaseCreated = false;

        try (HikariDataSource adminDataSource = new HikariDataSource(adminConfig);
             var connection = adminDataSource.getConnection();
             var statement = connection.createStatement()) {

            // Check if database exists
            String checkQuery = "SELECT 1 FROM pg_database WHERE datname = ?";
            try (var preparedStatement = connection.prepareStatement(checkQuery)) {
                preparedStatement.setString(1, dbName);
                try (var resultSet = preparedStatement.executeQuery()) {
                    if (resultSet.next()) {
                        logger.info("Database {} already exists", dbName);
                        return; // Database exists, nothing to do
                    }
                }
            }

            // Create database if it doesn't exist
            // Note: Database names cannot be parameterized, but we validate the name is safe
            if (!dbName.matches("^[a-zA-Z][a-zA-Z0-9_]*$")) {
                throw new IllegalArgumentException("Invalid database name: " + dbName);
            }

            String createQuery = "CREATE DATABASE " + dbName;
            statement.executeUpdate(createQuery);
            logger.info("Created database: {}", dbName);
            databaseCreated = true;

        } catch (SQLException e) {
            logger.error("Failed to ensure database {} exists: {}", dbName, e.getMessage());
            throw new RuntimeException("Database creation failed", e);
        }

        // If we just created the database, initialize its schema
        if (databaseCreated) {
            initializeNewDatabaseSchema(dbName);
        }
    }
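
    // The name check above only admits identifiers matching ^[a-zA-Z][a-zA-Z0-9_]*$.
    // For example (hypothetical inputs): "storycove_myfantasyshelf" passes, while "my-library"
    // or "1library" would be rejected before being concatenated into the CREATE DATABASE statement.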

    private void initializeNewDatabaseSchema(String dbName) {
        logger.info("Initializing schema for new database: {}", dbName);

        // Create a temporary DataSource for the new database to initialize schema
        String newDbUrl = baseDbUrl.replaceAll("/[^/]*$", "/" + dbName);

        HikariConfig config = new HikariConfig();
        config.setJdbcUrl(newDbUrl);
        config.setUsername(dbUsername);
        config.setPassword(dbPassword);
        config.setDriverClassName("org.postgresql.Driver");
        config.setMaximumPoolSize(1);
        config.setConnectionTimeout(30000);

        try (HikariDataSource tempDataSource = new HikariDataSource(config)) {
            // Use Hibernate to create the schema
            // This mimics what Spring Boot does during startup
            createSchemaUsingHibernate(tempDataSource);
            logger.info("Schema initialized for database: {}", dbName);

        } catch (Exception e) {
            logger.error("Failed to initialize schema for database {}: {}", dbName, e.getMessage());
            throw new RuntimeException("Schema initialization failed", e);
        }
    }

    public void initializeNewLibraryResources(String libraryId) {
        Library library = libraries.get(libraryId);
        if (library == null) {
            throw new IllegalArgumentException("Library not found: " + libraryId);
        }

        try {
            logger.info("Initializing resources for new library: {}", library.getName());

            // 1. Create image directory structure
            initializeImageDirectories(library);

            // 2. Initialize Typesense collections (this will be done when switching to the library)
            // The TypesenseService.initializeCollections() will be called automatically

            logger.info("Successfully initialized resources for library: {}", library.getName());

        } catch (Exception e) {
            logger.error("Failed to initialize resources for library {}: {}", libraryId, e.getMessage());
            throw new RuntimeException("Library resource initialization failed", e);
        }
    }

    private void initializeImageDirectories(Library library) {
        try {
            // Create the library-specific image directory
            String imagePath = "/app/images/" + library.getId();
            java.nio.file.Path libraryImagePath = java.nio.file.Paths.get(imagePath);

            if (!java.nio.file.Files.exists(libraryImagePath)) {
                java.nio.file.Files.createDirectories(libraryImagePath);
                logger.info("Created image directory: {}", imagePath);

                // Create subdirectories for different image types
                java.nio.file.Files.createDirectories(libraryImagePath.resolve("stories"));
                java.nio.file.Files.createDirectories(libraryImagePath.resolve("authors"));
                java.nio.file.Files.createDirectories(libraryImagePath.resolve("collections"));

                logger.info("Created image subdirectories for library: {}", library.getId());
            } else {
                logger.info("Image directory already exists: {}", imagePath);
            }

        } catch (Exception e) {
            logger.error("Failed to create image directories for library {}: {}", library.getId(), e.getMessage());
            throw new RuntimeException("Image directory creation failed", e);
        }
    }
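
    // Resulting on-disk layout for a library with the hypothetical id "myfantasyshelf":
    //
    //     /app/images/myfantasyshelf/
    //         stories/
    //         authors/
    //         collections/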

    private void createSchemaUsingHibernate(DataSource dataSource) {
        // Create the essential tables manually using the same DDL that Hibernate would generate
        // This is simpler than setting up a full Hibernate configuration for schema creation

        String[] createTableStatements = {
                // Authors table
                """
                CREATE TABLE authors (
                    author_rating integer,
                    created_at timestamp(6) not null,
                    updated_at timestamp(6) not null,
                    id uuid not null,
                    avatar_image_path varchar(255),
                    name varchar(255) not null,
                    notes TEXT,
                    primary key (id)
                )
                """,

                // Author URLs table
                """
                CREATE TABLE author_urls (
                    author_id uuid not null,
                    url varchar(255)
                )
                """,

                // Series table
                """
                CREATE TABLE series (
                    created_at timestamp(6) not null,
                    id uuid not null,
                    description varchar(1000),
                    name varchar(255) not null,
                    primary key (id)
                )
                """,

                // Tags table
                """
                CREATE TABLE tags (
                    color varchar(7),
                    created_at timestamp(6) not null,
                    id uuid not null,
                    description varchar(500),
                    name varchar(255) not null unique,
                    primary key (id)
                )
                """,

                // Tag aliases table
                """
                CREATE TABLE tag_aliases (
                    created_from_merge boolean not null,
                    created_at timestamp(6) not null,
                    canonical_tag_id uuid not null,
                    id uuid not null,
                    alias_name varchar(255) not null unique,
                    primary key (id)
                )
                """,

                // Collections table
                """
                CREATE TABLE collections (
                    is_archived boolean not null,
                    rating integer,
                    created_at timestamp(6) not null,
                    updated_at timestamp(6) not null,
                    id uuid not null,
                    cover_image_path varchar(500),
                    name varchar(500) not null,
                    description TEXT,
                    primary key (id)
                )
                """,

                // Stories table
                """
                CREATE TABLE stories (
                    is_read boolean,
                    rating integer,
                    reading_position integer,
                    volume integer,
                    word_count integer,
                    created_at timestamp(6) not null,
                    last_read_at timestamp(6),
                    updated_at timestamp(6) not null,
                    author_id uuid,
                    id uuid not null,
                    series_id uuid,
                    description varchar(1000),
                    content_html TEXT,
                    content_plain TEXT,
                    cover_path varchar(255),
                    source_url varchar(255),
                    summary TEXT,
                    title varchar(255) not null,
                    primary key (id)
                )
                """,

                // Reading positions table
                """
                CREATE TABLE reading_positions (
                    chapter_index integer,
                    character_position integer,
                    percentage_complete float(53),
                    word_position integer,
                    created_at timestamp(6) not null,
                    updated_at timestamp(6) not null,
                    id uuid not null,
                    story_id uuid not null,
                    context_after varchar(500),
                    context_before varchar(500),
                    chapter_title varchar(255),
                    epub_cfi TEXT,
                    primary key (id)
                )
                """,

                // Junction tables
                """
                CREATE TABLE story_tags (
                    story_id uuid not null,
                    tag_id uuid not null,
                    primary key (story_id, tag_id)
                )
                """,

                """
                CREATE TABLE collection_stories (
                    position integer not null,
                    added_at timestamp(6) not null,
                    collection_id uuid not null,
                    story_id uuid not null,
                    primary key (collection_id, story_id),
                    unique (collection_id, position)
                )
                """,

                """
                CREATE TABLE collection_tags (
                    collection_id uuid not null,
                    tag_id uuid not null,
                    primary key (collection_id, tag_id)
                )
                """
        };

        String[] createIndexStatements = {
                "CREATE INDEX idx_reading_position_story ON reading_positions (story_id)"
        };

        String[] createConstraintStatements = {
                // Foreign key constraints
                "ALTER TABLE author_urls ADD CONSTRAINT FKdqhp51m0uveybsts098gd79uo FOREIGN KEY (author_id) REFERENCES authors",
                "ALTER TABLE stories ADD CONSTRAINT FKhwecpqeaxy40ftrctef1u7gw7 FOREIGN KEY (author_id) REFERENCES authors",
                "ALTER TABLE stories ADD CONSTRAINT FK1kulyvy7wwcolp2gkndt57cp7 FOREIGN KEY (series_id) REFERENCES series",
                "ALTER TABLE reading_positions ADD CONSTRAINT FKglfhdhflan3pgyr2u0gxi21i5 FOREIGN KEY (story_id) REFERENCES stories",
                "ALTER TABLE story_tags ADD CONSTRAINT FKmans33ijt0nf65t0sng2r848j FOREIGN KEY (tag_id) REFERENCES tags",
                "ALTER TABLE story_tags ADD CONSTRAINT FKq9guid7swnjxwdpgxj3jo1rsi FOREIGN KEY (story_id) REFERENCES stories",
                "ALTER TABLE tag_aliases ADD CONSTRAINT FKqfsawmcj3ey4yycb6958y24ch FOREIGN KEY (canonical_tag_id) REFERENCES tags",
                "ALTER TABLE collection_stories ADD CONSTRAINT FKr55ho4vhj0wp03x13iskr1jds FOREIGN KEY (collection_id) REFERENCES collections",
                "ALTER TABLE collection_stories ADD CONSTRAINT FK7n41tbbrt7r2e81hpu3612r1o FOREIGN KEY (story_id) REFERENCES stories",
                "ALTER TABLE collection_tags ADD CONSTRAINT FKceq7ggev8n8ibjui1x5yo4x67 FOREIGN KEY (tag_id) REFERENCES tags",
                "ALTER TABLE collection_tags ADD CONSTRAINT FKq9sa5s8csdpbphrvb48tts8jt FOREIGN KEY (collection_id) REFERENCES collections"
        };

        try (var connection = dataSource.getConnection();
             var statement = connection.createStatement()) {

            // Create tables
            for (String sql : createTableStatements) {
                statement.executeUpdate(sql);
            }

            // Create indexes
            for (String sql : createIndexStatements) {
                statement.executeUpdate(sql);
            }

            // Create constraints
            for (String sql : createConstraintStatements) {
                statement.executeUpdate(sql);
            }

            logger.info("Successfully created all database tables and constraints");

        } catch (SQLException e) {
            logger.error("Failed to create database schema", e);
            throw new RuntimeException("Schema creation failed", e);
        }
    }

    private Client createTypesenseClient(String collection) {
        logger.info("Creating Typesense client for collection: {}", collection);

        List<Node> nodes = Arrays.asList(
                new Node("http", typesenseHost, typesensePort)
        );

        org.typesense.api.Configuration configuration = new org.typesense.api.Configuration(nodes, Duration.ofSeconds(10), typesenseApiKey);
        return new Client(configuration);
    }

    private void closeCurrentResources() {
        // No need to close datasource - SmartRoutingDataSource handles this
        // Typesense client doesn't need explicit cleanup
        currentTypesenseClient = null;
        // Don't clear currentLibraryId here - only when explicitly switching
    }

    /**
     * Update library metadata (name and description)
     */
    public synchronized void updateLibraryMetadata(String libraryId, String newName, String newDescription) throws Exception {
        if (libraryId == null || libraryId.trim().isEmpty()) {
            throw new IllegalArgumentException("Library ID cannot be null or empty");
        }

        Library library = libraries.get(libraryId);
        if (library == null) {
            throw new IllegalArgumentException("Library not found: " + libraryId);
        }

        // Validate new name
        if (newName == null || newName.trim().isEmpty()) {
            throw new IllegalArgumentException("Library name cannot be null or empty");
        }

        String oldName = library.getName();
        String oldDescription = library.getDescription();

        // Update the library object
        library.setName(newName.trim());
        library.setDescription(newDescription != null ? newDescription.trim() : "");

        try {
            // Save to configuration file
            saveLibraryConfiguration(library);

            logger.info("Updated library metadata - ID: {}, Name: '{}' -> '{}', Description: '{}' -> '{}'",
                    libraryId, oldName, newName, oldDescription, library.getDescription());

        } catch (Exception e) {
            // Rollback changes on failure
            library.setName(oldName);
            library.setDescription(oldDescription);
            throw new RuntimeException("Failed to update library metadata: " + e.getMessage(), e);
        }
    }

    /**
     * Save library configuration to file
     */
    private void saveLibraryConfiguration(Library library) throws Exception {
        Path libraryConfigPath = libraryConfigDir.resolve(library.getId() + ".json");

        // Create library configuration object
        Map<String, Object> config = new HashMap<>();
        config.put("id", library.getId());
        config.put("name", library.getName());
        config.put("description", library.getDescription());
        config.put("passwordHash", library.getPasswordHash());
        config.put("dbName", library.getDbName());
        config.put("typesenseCollection", library.getTypesenseCollection());
        config.put("imagePath", library.getImagePath());
        config.put("initialized", library.isInitialized());

        // Write to file
        ObjectMapper mapper = new ObjectMapper();
        String configJson = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(config);
        Files.writeString(libraryConfigPath, configJson, StandardCharsets.UTF_8);

        logger.debug("Saved library configuration to: {}", libraryConfigPath);
    }
}
@@ -1,36 +1,83 @@
 package com.storycove.service;
 
-import org.springframework.beans.factory.annotation.Value;
+import com.storycove.util.JwtUtil;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.security.crypto.password.PasswordEncoder;
 import org.springframework.stereotype.Service;
 
 @Service
 public class PasswordAuthenticationService {
 
-    @Value("${storycove.auth.password}")
-    private String applicationPassword;
+    private static final Logger logger = LoggerFactory.getLogger(PasswordAuthenticationService.class);
 
     private final PasswordEncoder passwordEncoder;
+    private final LibraryService libraryService;
+    private final JwtUtil jwtUtil;
 
-    public PasswordAuthenticationService(PasswordEncoder passwordEncoder) {
+    @Autowired
+    public PasswordAuthenticationService(
+            PasswordEncoder passwordEncoder,
+            LibraryService libraryService,
+            JwtUtil jwtUtil) {
         this.passwordEncoder = passwordEncoder;
+        this.libraryService = libraryService;
+        this.jwtUtil = jwtUtil;
     }
 
-    public boolean authenticate(String providedPassword) {
+    /**
+     * Authenticate user and switch to the appropriate library
+     * Returns JWT token if authentication successful, null otherwise
+     */
+    public String authenticateAndSwitchLibrary(String providedPassword) {
         if (providedPassword == null || providedPassword.trim().isEmpty()) {
-            return false;
+            return null;
         }
 
-        // If application password starts with {bcrypt}, it's already encoded
-        if (applicationPassword.startsWith("{bcrypt}") || applicationPassword.startsWith("$2")) {
-            return passwordEncoder.matches(providedPassword, applicationPassword);
+        // Find which library this password belongs to
+        String libraryId = libraryService.authenticateAndGetLibrary(providedPassword);
+        if (libraryId == null) {
+            logger.warn("Authentication failed - invalid password");
+            return null;
         }
 
-        // Otherwise, compare directly (for development/testing)
-        return applicationPassword.equals(providedPassword);
+        try {
+            // Switch to the authenticated library with forced reindexing (may take 2-3 seconds)
+            libraryService.switchToLibraryAfterAuthentication(libraryId);
+
+            // Generate JWT token with library context
+            String token = jwtUtil.generateToken("user", libraryId);
+
+            logger.info("Successfully authenticated and switched to library: {}", libraryId);
+            return token;
+
+        } catch (Exception e) {
+            logger.error("Failed to switch to library: {}", libraryId, e);
+            return null;
+        }
+    }
+
+    /**
+     * Legacy method - kept for backward compatibility
+     */
+    @Deprecated
+    public boolean authenticate(String providedPassword) {
+        return authenticateAndSwitchLibrary(providedPassword) != null;
     }
 
     public String encodePassword(String rawPassword) {
         return passwordEncoder.encode(rawPassword);
     }
+
+    /**
+     * Get current library info for authenticated user
+     */
+    public String getCurrentLibraryInfo() {
+        var library = libraryService.getCurrentLibrary();
+        if (library != null) {
+            return String.format("Library: %s (%s)", library.getName(), library.getId());
+        }
+        return "No library active";
+    }
 }
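
// Hypothetical caller sketch (not taken from this changeset): a login endpoint would now exchange
// the library password for a JWT instead of a boolean. ResponseEntity/HttpStatus usage below is
// illustrative only; the actual controller is not shown in this diff.
//
//     String token = passwordAuthenticationService.authenticateAndSwitchLibrary(request.getPassword());
//     if (token == null) {
//         return ResponseEntity.status(HttpStatus.UNAUTHORIZED).build();
//     }
//     return ResponseEntity.ok(Map.of("token", token));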
@@ -0,0 +1,28 @@
package com.storycove.service;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;

@Service
public class ReadingTimeService {

    @Value("${app.reading.speed.default:200}")
    private int defaultWordsPerMinute;

    /**
     * Calculate estimated reading time in minutes for the given word count
     * @param wordCount the number of words to read
     * @return estimated reading time in minutes (minimum 1 minute)
     */
    public int calculateReadingTime(int wordCount) {
        return Math.max(1, wordCount / defaultWordsPerMinute);
    }

    /**
     * Get the current words per minute setting
     * @return words per minute reading speed
     */
    public int getWordsPerMinute() {
        return defaultWordsPerMinute;
    }
}
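
// A minimal usage sketch of ReadingTimeService (caller and values are hypothetical; 200 words per
// minute is the configured default shown above):
//
//     ReadingTimeService readingTime = ...; // injected by Spring
//     readingTime.calculateReadingTime(1500); // 1500 / 200 = 7 minutes
//     readingTime.calculateReadingTime(50);   // 50 / 200 = 0, clamped to the 1-minute minimum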
@@ -5,6 +5,8 @@ import com.storycove.repository.SeriesRepository;
 import com.storycove.service.exception.DuplicateResourceException;
 import com.storycove.service.exception.ResourceNotFoundException;
 import jakarta.validation.Valid;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.data.domain.Page;
 import org.springframework.data.domain.Pageable;
@@ -21,6 +23,8 @@ import java.util.UUID;
 @Transactional
 public class SeriesService {
 
+    private static final Logger logger = LoggerFactory.getLogger(SeriesService.class);
+
     private final SeriesRepository seriesRepository;
 
     @Autowired
@@ -4,13 +4,15 @@ import com.storycove.entity.Author;
 import com.storycove.entity.Series;
 import com.storycove.entity.Story;
 import com.storycove.entity.Tag;
+import com.storycove.repository.ReadingPositionRepository;
 import com.storycove.repository.StoryRepository;
 import com.storycove.repository.TagRepository;
 import com.storycove.service.exception.DuplicateResourceException;
 import com.storycove.service.exception.ResourceNotFoundException;
 import jakarta.validation.Valid;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 import org.springframework.beans.factory.annotation.Autowired;
-import org.springframework.boot.autoconfigure.condition.ConditionalOnBean;
 import org.springframework.data.domain.Page;
 import org.springframework.data.domain.Pageable;
 import org.springframework.stereotype.Service;
@@ -18,19 +20,24 @@ import org.springframework.transaction.annotation.Transactional;
 import org.springframework.validation.annotation.Validated;
 
 import java.time.LocalDateTime;
+import java.util.ArrayList;
 import java.util.HashSet;
 import java.util.List;
 import java.util.Optional;
 import java.util.Set;
 import java.util.UUID;
+import java.util.stream.Collectors;
 
 @Service
 @Validated
 @Transactional
 public class StoryService {
 
+    private static final Logger logger = LoggerFactory.getLogger(StoryService.class);
+
     private final StoryRepository storyRepository;
     private final TagRepository tagRepository;
+    private final ReadingPositionRepository readingPositionRepository;
     private final AuthorService authorService;
    private final TagService tagService;
     private final SeriesService seriesService;
@@ -40,6 +47,7 @@ public class StoryService {
     @Autowired
     public StoryService(StoryRepository storyRepository,
                         TagRepository tagRepository,
+                        ReadingPositionRepository readingPositionRepository,
                         AuthorService authorService,
                         TagService tagService,
                         SeriesService seriesService,
@@ -47,6 +55,7 @@ public class StoryService {
                         @Autowired(required = false) TypesenseService typesenseService) {
         this.storyRepository = storyRepository;
         this.tagRepository = tagRepository;
+        this.readingPositionRepository = readingPositionRepository;
         this.authorService = authorService;
         this.tagService = tagService;
         this.seriesService = seriesService;
@@ -75,11 +84,13 @@ public class StoryService {
                 .orElseThrow(() -> new ResourceNotFoundException("Story", id.toString()));
     }
 
+
     @Transactional(readOnly = true)
     public Optional<Story> findByIdOptional(UUID id) {
         return storyRepository.findById(id);
     }
 
+
     @Transactional(readOnly = true)
     public Optional<Story> findByTitle(String title) {
         return storyRepository.findByTitle(title);
@@ -114,7 +125,7 @@ public class StoryService {
 
     @Transactional(readOnly = true)
     public List<Story> findBySeries(UUID seriesId) {
-        Series series = seriesService.findById(seriesId);
+        seriesService.findById(seriesId); // Validate series exists
         return storyRepository.findBySeriesOrderByVolume(seriesId);
     }
 
@@ -271,6 +282,45 @@ public class StoryService {
         return savedStory;
     }
 
+    @Transactional
+    public Story updateReadingProgress(UUID id, Integer position) {
+        if (position != null && position < 0) {
+            throw new IllegalArgumentException("Reading position must be non-negative");
+        }
+
+        Story story = findById(id);
+        story.updateReadingProgress(position);
+        Story savedStory = storyRepository.save(story);
+
+        // Update Typesense index with new reading progress
+        if (typesenseService != null) {
+            typesenseService.updateStory(savedStory);
+        }
+
+        return savedStory;
+    }
+
+    @Transactional
+    public Story updateReadingStatus(UUID id, Boolean isRead) {
+        Story story = findById(id);
+
+        if (Boolean.TRUE.equals(isRead)) {
+            story.markAsRead();
+        } else {
+            story.setIsRead(false);
+            story.setLastReadAt(LocalDateTime.now());
+        }
+
+        Story savedStory = storyRepository.save(story);
+
+        // Update Typesense index with new reading status
+        if (typesenseService != null) {
+            typesenseService.updateStory(savedStory);
+        }
+
+        return savedStory;
+    }
+
     @Transactional(readOnly = true)
     public List<Story> findBySeriesOrderByVolume(UUID seriesId) {
         return storyRepository.findBySeriesOrderByVolume(seriesId);
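
// Hypothetical caller sketch for the new reading-progress methods above (the REST layer is not
// part of this hunk; identifiers below are illustrative only):
//
//     storyService.updateReadingProgress(storyId, 1250); // persists the position and reindexes in Typesense
//     storyService.updateReadingStatus(storyId, true);   // marks the story as read
//
// A negative position is rejected with IllegalArgumentException, as implemented above.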
@@ -393,13 +443,17 @@ public class StoryService {
     public void delete(UUID id) {
         Story story = findById(id);
 
+        // Clean up reading positions first (to avoid foreign key constraint violations)
+        readingPositionRepository.deleteByStoryId(id);
+
         // Remove from series if part of one
         if (story.getSeries() != null) {
             story.getSeries().removeStory(story);
         }
 
         // Remove tags (this will update tag usage counts)
-        story.getTags().forEach(tag -> story.removeTag(tag));
+        // Create a copy to avoid ConcurrentModificationException
+        new ArrayList<>(story.getTags()).forEach(tag -> story.removeTag(tag));
 
         // Delete from Typesense first (if available)
         if (typesenseService != null) {
@@ -562,13 +616,29 @@ public class StoryService {
             if (updateReq.getVolume() != null) {
                 story.setVolume(updateReq.getVolume());
             }
+            // Handle author - either by ID or by name
             if (updateReq.getAuthorId() != null) {
                 Author author = authorService.findById(updateReq.getAuthorId());
                 story.setAuthor(author);
             }
+            // Handle series - either by ID or by name
             if (updateReq.getSeriesId() != null) {
                 Series series = seriesService.findById(updateReq.getSeriesId());
                 story.setSeries(series);
+            } else if (updateReq.getSeriesName() != null) {
+                if (updateReq.getSeriesName().trim().isEmpty()) {
+                    // Empty series name means remove from series
+                    story.setSeries(null);
+                } else {
+                    // Find or create series by name
+                    Series series = seriesService.findByNameOptional(updateReq.getSeriesName().trim())
+                            .orElseGet(() -> {
+                                Series newSeries = new Series();
+                                newSeries.setName(updateReq.getSeriesName().trim());
+                                return seriesService.create(newSeries);
+                            });
+                    story.setSeries(series);
+                }
             }
         }
     }
@@ -593,4 +663,145 @@ public class StoryService {
             }
         }
     }
+
+    @Transactional(readOnly = true)
+    public List<Story> findPotentialDuplicates(String title, String authorName) {
+        if (title == null || title.trim().isEmpty() || authorName == null || authorName.trim().isEmpty()) {
+            return List.of();
+        }
+        return storyRepository.findByTitleAndAuthorNameIgnoreCase(title.trim(), authorName.trim());
+    }
+
+    /**
+     * Find a random story based on optional filters.
+     * Uses Typesense for consistency with Library search functionality.
+     * Supports text search and multiple tags using the same logic as the Library view.
+     * @param searchQuery Optional search query
+     * @param tags Optional list of tags to filter by
+     * @return Optional containing the random story if found
+     */
+    @Transactional(readOnly = true)
+    public Optional<Story> findRandomStory(String searchQuery, List<String> tags) {
+        return findRandomStory(searchQuery, tags, null, null, null, null, null, null, null,
+                null, null, null, null, null, null, null, null, null, null, null);
+    }
+
+    public Optional<Story> findRandomStory(String searchQuery, List<String> tags, Long seed) {
+        return findRandomStory(searchQuery, tags, seed, null, null, null, null, null, null,
+                null, null, null, null, null, null, null, null, null, null, null);
+    }
+
+    /**
+     * Find a random story based on optional filters with seed support.
+     * Uses Typesense for consistency with Library search functionality.
+     * Supports text search and multiple tags using the same logic as the Library view.
+     * @param searchQuery Optional search query
+     * @param tags Optional list of tags to filter by
+     * @param seed Optional seed for consistent randomization (null for truly random)
+     * @return Optional containing the random story if found
+     */
+    @Transactional(readOnly = true)
+    public Optional<Story> findRandomStory(String searchQuery, List<String> tags, Long seed,
+                                           Integer minWordCount, Integer maxWordCount,
+                                           String createdAfter, String createdBefore,
+                                           String lastReadAfter, String lastReadBefore,
+                                           Integer minRating, Integer maxRating, Boolean unratedOnly,
+                                           String readingStatus, Boolean hasReadingProgress,
+                                           Boolean hasCoverImage, String sourceDomain,
+                                           String seriesFilter, Integer minTagCount,
+                                           Boolean popularOnly, Boolean hiddenGemsOnly) {
+
+        // Use Typesense if available for consistency with Library search
+        if (typesenseService != null) {
+            try {
+                Optional<UUID> randomStoryId = typesenseService.getRandomStoryId(searchQuery, tags, seed,
+                        minWordCount, maxWordCount, createdAfter, createdBefore, lastReadAfter, lastReadBefore,
+                        minRating, maxRating, unratedOnly, readingStatus, hasReadingProgress, hasCoverImage,
+                        sourceDomain, seriesFilter, minTagCount, popularOnly, hiddenGemsOnly);
+                if (randomStoryId.isPresent()) {
+                    return storyRepository.findById(randomStoryId.get());
+                }
+                return Optional.empty();
+            } catch (Exception e) {
+                // Fallback to database queries if Typesense fails
+                logger.warn("Typesense random story lookup failed, falling back to database queries", e);
+            }
+        }
+
+        // Fallback to repository-based implementation (global routing handles library selection)
+        return findRandomStoryFromRepository(searchQuery, tags);
+    }
+
+    /**
+     * Find random story using repository methods (for default database or when library-aware fails)
+     */
+    private Optional<Story> findRandomStoryFromRepository(String searchQuery, List<String> tags) {
+        // Clean up inputs
+        String cleanSearchQuery = (searchQuery != null && !searchQuery.trim().isEmpty()) ? searchQuery.trim() : null;
+        List<String> cleanTags = (tags != null) ? tags.stream()
+                .filter(tag -> tag != null && !tag.trim().isEmpty())
+                .map(String::trim)
+                .collect(Collectors.toList()) : List.of();

+        long totalCount = 0;
+        Optional<Story> randomStory = Optional.empty();
+
+        if (cleanSearchQuery != null && !cleanTags.isEmpty()) {
+            // Both search query and tags
+            String searchPattern = "%" + cleanSearchQuery + "%";
+            List<String> upperCaseTags = cleanTags.stream()
+                    .map(String::toUpperCase)
+                    .collect(Collectors.toList());
+
+            totalCount = storyRepository.countStoriesByTextSearchAndTags(searchPattern, upperCaseTags, cleanTags.size());
+            if (totalCount > 0) {
+                long randomOffset = (long) (Math.random() * totalCount);
+                randomStory = storyRepository.findRandomStoryByTextSearchAndTags(searchPattern, upperCaseTags, cleanTags.size(), randomOffset);
+            }
+
+        } else if (cleanSearchQuery != null) {
+            // Only search query
+            String searchPattern = "%" + cleanSearchQuery + "%";
+            totalCount = storyRepository.countStoriesByTextSearch(searchPattern);
+            if (totalCount > 0) {
+                long randomOffset = (long) (Math.random() * totalCount);
+                randomStory = storyRepository.findRandomStoryByTextSearch(searchPattern, randomOffset);
+            }
+
+        } else if (!cleanTags.isEmpty()) {
+            // Only tags
+            if (cleanTags.size() == 1) {
+                // Single tag - use optimized single tag query
+                totalCount = storyRepository.countStoriesByTagName(cleanTags.get(0));
+                if (totalCount > 0) {
+                    long randomOffset = (long) (Math.random() * totalCount);
+                    randomStory = storyRepository.findRandomStoryByTagName(cleanTags.get(0), randomOffset);
+                }
+            } else {
+                // Multiple tags
+                List<String> upperCaseTags = cleanTags.stream()
+                        .map(String::toUpperCase)
+                        .collect(Collectors.toList());
+
+                totalCount = storyRepository.countStoriesByMultipleTags(upperCaseTags, cleanTags.size());
+                if (totalCount > 0) {
+                    long randomOffset = (long) (Math.random() * totalCount);
+                    randomStory = storyRepository.findRandomStoryByMultipleTags(upperCaseTags, cleanTags.size(), randomOffset);
+                }
+            }
+
+        } else {
+            // No filters - get random from all stories
+            totalCount = storyRepository.countAllStories();
+            if (totalCount > 0) {
+                long randomOffset = (long) (Math.random() * totalCount);
+                randomStory = storyRepository.findRandomStory(randomOffset);
+            }
+        }
+
+        return randomStory;
+    }
+
+
 }
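
// The repository fallback above uses a count-then-offset pattern for uniform random selection.
// A self-contained sketch of the same technique (hypothetical values, not part of the diff):
//
//     long totalCount = 10;                                    // e.g. countAllStories()
//     long randomOffset = (long) (Math.random() * totalCount); // uniform value in [0, totalCount)
//     // the repository query then applies an OFFSET of randomOffset with LIMIT 1 to pick one row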
backend/src/main/java/com/storycove/service/StoryService.java.backup (new file, 1192 lines)
File diff suppressed because it is too large
@@ -1,10 +1,15 @@
 package com.storycove.service;

+import com.storycove.entity.Story;
 import com.storycove.entity.Tag;
+import com.storycove.entity.TagAlias;
 import com.storycove.repository.TagRepository;
+import com.storycove.repository.TagAliasRepository;
 import com.storycove.service.exception.DuplicateResourceException;
 import com.storycove.service.exception.ResourceNotFoundException;
 import jakarta.validation.Valid;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.data.domain.Page;
 import org.springframework.data.domain.Pageable;
@@ -12,8 +17,11 @@ import org.springframework.stereotype.Service;
 import org.springframework.transaction.annotation.Transactional;
 import org.springframework.validation.annotation.Validated;

+import java.util.ArrayList;
+import java.util.HashSet;
 import java.util.List;
 import java.util.Optional;
+import java.util.Set;
 import java.util.UUID;

 @Service
@@ -21,11 +29,15 @@ import java.util.UUID;
 @Transactional
 public class TagService {

+    private static final Logger logger = LoggerFactory.getLogger(TagService.class);
+
     private final TagRepository tagRepository;
+    private final TagAliasRepository tagAliasRepository;

     @Autowired
-    public TagService(TagRepository tagRepository) {
+    public TagService(TagRepository tagRepository, TagAliasRepository tagAliasRepository) {
         this.tagRepository = tagRepository;
+        this.tagAliasRepository = tagAliasRepository;
     }

     @Transactional(readOnly = true)
@@ -192,6 +204,11 @@ public class TagService {
         return tagRepository.countUsedTags();
     }

+    @Transactional(readOnly = true)
+    public List<Tag> findTagsUsedByCollections() {
+        return tagRepository.findTagsUsedByCollections();
+    }
+
     private void validateTagForCreate(Tag tag) {
         if (existsByName(tag.getName())) {
             throw new DuplicateResourceException("Tag", tag.getName());
@@ -202,5 +219,273 @@ public class TagService {
         if (updates.getName() != null) {
             existing.setName(updates.getName());
         }
+        if (updates.getColor() != null) {
+            existing.setColor(updates.getColor());
+        }
+        if (updates.getDescription() != null) {
+            existing.setDescription(updates.getDescription());
+        }
+    }
+
+    // Tag alias management methods
+
+    public TagAlias addAlias(UUID tagId, String aliasName) {
+        Tag canonicalTag = findById(tagId);
+
+        // Check if alias already exists (case-insensitive)
+        if (tagAliasRepository.existsByAliasNameIgnoreCase(aliasName)) {
+            throw new DuplicateResourceException("Tag alias", aliasName);
+        }
+
+        // Check if alias name conflicts with existing tag names
+        if (tagRepository.existsByNameIgnoreCase(aliasName)) {
+            throw new DuplicateResourceException("Tag alias conflicts with existing tag name", aliasName);
+        }
+
+        TagAlias alias = new TagAlias();
+        alias.setAliasName(aliasName);
+        alias.setCanonicalTag(canonicalTag);
+        alias.setCreatedFromMerge(false);
+
+        return tagAliasRepository.save(alias);
+    }
+
+    public void removeAlias(UUID tagId, UUID aliasId) {
+        findById(tagId); // Validate tag exists
+        TagAlias alias = tagAliasRepository.findById(aliasId)
+                .orElseThrow(() -> new ResourceNotFoundException("Tag alias", aliasId.toString()));
+
+        // Verify the alias belongs to the specified tag
+        if (!alias.getCanonicalTag().getId().equals(tagId)) {
+            throw new IllegalArgumentException("Alias does not belong to the specified tag");
+        }
+
+        tagAliasRepository.delete(alias);
+    }
+
+    @Transactional(readOnly = true)
+    public Tag resolveTagByName(String name) {
+        // First try to find exact tag match
+        Optional<Tag> directMatch = tagRepository.findByNameIgnoreCase(name);
+        if (directMatch.isPresent()) {
+            return directMatch.get();
+        }
+
+        // Then try to find by alias
+        Optional<TagAlias> aliasMatch = tagAliasRepository.findByAliasNameIgnoreCase(name);
+        if (aliasMatch.isPresent()) {
+            return aliasMatch.get().getCanonicalTag();
+        }
+
+        return null;
+    }
+
+    @Transactional
+    public Tag mergeTags(List<UUID> sourceTagIds, UUID targetTagId) {
+        // Validate target tag exists
+        Tag targetTag = findById(targetTagId);
+
+        // Validate source tags exist and are different from target
+        List<Tag> sourceTags = sourceTagIds.stream()
+                .filter(id -> !id.equals(targetTagId)) // Don't merge tag with itself
+                .map(this::findById)
+                .toList();
+
+        if (sourceTags.isEmpty()) {
+            throw new IllegalArgumentException("No valid source tags to merge");
+        }
+
+        // Perform the merge atomically
+        for (Tag sourceTag : sourceTags) {
+            // Move all stories from source tag to target tag
+            // Create a copy to avoid ConcurrentModificationException
+            List<Story> storiesToMove = new ArrayList<>(sourceTag.getStories());
+            storiesToMove.forEach(story -> {
+                story.removeTag(sourceTag);
+                story.addTag(targetTag);
+            });

+            // Create alias for the source tag name
+            TagAlias alias = new TagAlias();
+            alias.setAliasName(sourceTag.getName());
+            alias.setCanonicalTag(targetTag);
+            alias.setCreatedFromMerge(true);
+            tagAliasRepository.save(alias);
+
+            // Delete the source tag
+            tagRepository.delete(sourceTag);
+        }
+
+        return tagRepository.save(targetTag);
+    }
+
+    @Transactional(readOnly = true)
+    public List<Tag> findByNameOrAliasStartingWith(String query, int limit) {
+        // Find tags that start with the query
+        List<Tag> directMatches = tagRepository.findByNameStartingWithIgnoreCase(query.toLowerCase());
+
+        // Find tags via aliases that start with the query
+        List<TagAlias> aliasMatches = tagAliasRepository.findByAliasNameStartingWithIgnoreCase(query.toLowerCase());
+        List<Tag> aliasTagMatches = aliasMatches.stream()
+                .map(TagAlias::getCanonicalTag)
+                .distinct()
+                .toList();
+
+        // Combine and deduplicate
+        Set<Tag> allMatches = new HashSet<>(directMatches);
+        allMatches.addAll(aliasTagMatches);
+
+        // Convert to list and limit results
+        return allMatches.stream()
+                .sorted((a, b) -> a.getName().compareToIgnoreCase(b.getName()))
+                .limit(limit)
+                .toList();
+    }
+
+    @Transactional(readOnly = true)
+    public com.storycove.controller.TagController.MergePreviewResponse previewMerge(List<UUID> sourceTagIds, UUID targetTagId) {
+        // Validate target tag exists
+        Tag targetTag = findById(targetTagId);
+
+        // Validate source tags exist and are different from target
+        List<Tag> sourceTags = sourceTagIds.stream()
+                .filter(id -> !id.equals(targetTagId))
+                .map(this::findById)
+                .toList();
+
+        if (sourceTags.isEmpty()) {
+            throw new IllegalArgumentException("No valid source tags to merge");
+        }
+
+        // Calculate preview data
+        int targetStoryCount = targetTag.getStories().size();
+
+        // Collect all unique stories from all tags (including target) to handle overlaps correctly
+        Set<Story> allUniqueStories = new HashSet<>(targetTag.getStories());
+        for (Tag sourceTag : sourceTags) {
+            allUniqueStories.addAll(sourceTag.getStories());
+        }
+        int totalStories = allUniqueStories.size();
+
+        List<String> aliasesToCreate = sourceTags.stream()
+                .map(Tag::getName)
+                .toList();
+
+        // Create response object using the controller's inner class
+        var preview = new com.storycove.controller.TagController.MergePreviewResponse();
+        preview.setTargetTagName(targetTag.getName());
+        preview.setTargetStoryCount(targetStoryCount);
+        preview.setTotalResultStoryCount(totalStories);
+        preview.setAliasesToCreate(aliasesToCreate);
+
+        return preview;
+    }
+
+    @Transactional(readOnly = true)
+    public List<com.storycove.controller.TagController.TagSuggestion> suggestTags(String title, String content, String summary, int limit) {
+        List<com.storycove.controller.TagController.TagSuggestion> suggestions = new ArrayList<>();
+
+        // Get all existing tags for matching
+        List<Tag> existingTags = findAll();
+
+        // Combine all text for analysis
+        String combinedText = (title != null ? title : "") + " " +
+                (summary != null ? summary : "") + " " +
+                (content != null ? stripHtml(content) : "");
+
+        if (combinedText.trim().isEmpty()) {
+            return suggestions;
+        }
+
+        String lowerText = combinedText.toLowerCase();
+
+        // Score each existing tag based on how well it matches the content
+        for (Tag tag : existingTags) {
+            double score = calculateTagRelevanceScore(tag, lowerText, title, summary);
+
+            if (score > 0.1) { // Only suggest tags with reasonable confidence
+                String reason = generateReason(tag, lowerText, title, summary);
+                suggestions.add(new com.storycove.controller.TagController.TagSuggestion(
+                        tag.getName(), score, reason
+                ));
+            }
+        }
+
+        // Sort by confidence score (descending) and limit results
+        return suggestions.stream()
+                .sorted((a, b) -> Double.compare(b.getConfidence(), a.getConfidence()))
+                .limit(limit)
+                .collect(java.util.stream.Collectors.toList());
+    }
+
+    private double calculateTagRelevanceScore(Tag tag, String lowerText, String title, String summary) {
+        String tagName = tag.getName().toLowerCase();
+        double score = 0.0;
+
+        // Exact matches get highest score
+        if (lowerText.contains(" " + tagName + " ") || lowerText.startsWith(tagName + " ") || lowerText.endsWith(" " + tagName)) {
+            score += 0.8;
+        }
+
+        // Partial matches in title get high score
+        if (title != null && title.toLowerCase().contains(tagName)) {
+            score += 0.6;
+        }
+
+        // Partial matches in summary get medium score
+        if (summary != null && summary.toLowerCase().contains(tagName)) {
+            score += 0.4;
+        }
+
+        // Word-based matching (split tag name and look for individual words)
+        String[] tagWords = tagName.split("[\\s-_]+");
+        int matchedWords = 0;
+        for (String word : tagWords) {
+            if (word.length() > 2 && lowerText.contains(word)) {
+                matchedWords++;
+            }
+        }
+        if (tagWords.length > 0) {
+            score += 0.3 * ((double) matchedWords / tagWords.length);
+        }
+
+        // Boost score based on tag popularity (more used tags are more likely to be relevant)
+        int storyCount = tag.getStories() != null ? tag.getStories().size() : 0;
+        if (storyCount > 0) {
+            score += Math.min(0.2, storyCount * 0.01); // Small boost, capped at 0.2
+        }
+
+        return Math.min(1.0, score); // Cap at 1.0
+    }
+
+    private String generateReason(Tag tag, String lowerText, String title, String summary) {
+        String tagName = tag.getName().toLowerCase();
+
+        if (title != null && title.toLowerCase().contains(tagName)) {
+            return "Found in title";
+        }
+
+        if (summary != null && summary.toLowerCase().contains(tagName)) {
+            return "Found in summary";
+        }
+
+        if (lowerText.contains(" " + tagName + " ") || lowerText.startsWith(tagName + " ") || lowerText.endsWith(" " + tagName)) {
+            return "Exact match in content";
+        }
+
+        String[] tagWords = tagName.split("[\\s-_]+");
+        for (String word : tagWords) {
+            if (word.length() > 2 && lowerText.contains(word)) {
+                return "Related keywords found";
+            }
+        }
+
+        return "Similar content";
+    }
+
+    private String stripHtml(String html) {
+        if (html == null) return "";
+        // Simple HTML tag removal - replace with a proper HTML parser if needed
+        return html.replaceAll("<[^>]+>", " ").replaceAll("\\s+", " ").trim();
     }
 }
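The new alias and merge methods are easiest to read end to end from a caller's perspective. A hypothetical usage sketch follows (the wrapper class, IDs, and tag names are invented for illustration; the method signatures are exactly the ones added in this diff):

```java
import com.storycove.entity.Tag;
import com.storycove.entity.TagAlias;

import java.util.List;
import java.util.UUID;

// Illustrative caller only, e.g. what a controller might do with the new TagService API.
class TagAliasUsageSketch {

    private final TagService tagService;

    TagAliasUsageSketch(TagService tagService) {
        this.tagService = tagService;
    }

    void demo(UUID scifiTagId, UUID sfTagId) {
        // Register "sci-fi" as an alias of the canonical tag.
        TagAlias alias = tagService.addAlias(scifiTagId, "sci-fi");

        // Lookups by either the canonical name or the alias resolve to the same Tag.
        Tag resolved = tagService.resolveTagByName("sci-fi");

        // Merging moves the source tag's stories to the target and records the source
        // tag's old name as an alias (createdFromMerge = true), as implemented above.
        Tag merged = tagService.mergeTags(List.of(sfTagId), scifiTagId);
    }
}
```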
File diff suppressed because it is too large
@@ -0,0 +1,12 @@
package com.storycove.service.exception;

public class InvalidFileException extends RuntimeException {

    public InvalidFileException(String message) {
        super(message);
    }

    public InvalidFileException(String message, Throwable cause) {
        super(message, cause);
    }
}
@@ -3,35 +3,64 @@ package com.storycove.util;
 import io.jsonwebtoken.Claims;
 import io.jsonwebtoken.Jwts;
 import io.jsonwebtoken.security.Keys;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 import org.springframework.beans.factory.annotation.Value;
 import org.springframework.stereotype.Component;

+import jakarta.annotation.PostConstruct;
 import javax.crypto.SecretKey;
+import java.security.SecureRandom;
+import java.util.Base64;
 import java.util.Date;

 @Component
 public class JwtUtil {

-    @Value("${storycove.jwt.secret}")
+    private static final Logger logger = LoggerFactory.getLogger(JwtUtil.class);
+
+    // Security: Generate new secret on each startup to invalidate all existing tokens
     private String secret;

     @Value("${storycove.jwt.expiration:86400000}") // 24 hours default
     private Long expiration;

+    @PostConstruct
+    public void initialize() {
+        // Generate a new random secret on startup to invalidate all existing JWT tokens
+        // This ensures users must re-authenticate after application restart
+        SecureRandom random = new SecureRandom();
+        byte[] secretBytes = new byte[64]; // 512 bits
+        random.nextBytes(secretBytes);
+        this.secret = Base64.getEncoder().encodeToString(secretBytes);
+
+        logger.info("JWT secret rotated on startup - all existing tokens invalidated");
+        logger.info("Users will need to re-authenticate after application restart for security");
+    }
+
     private SecretKey getSigningKey() {
         return Keys.hmacShaKeyFor(secret.getBytes());
     }

     public String generateToken() {
+        return generateToken("user", null);
+    }
+
+    public String generateToken(String subject, String libraryId) {
         Date now = new Date();
         Date expiryDate = new Date(now.getTime() + expiration);

-        return Jwts.builder()
-                .subject("user")
+        var builder = Jwts.builder()
+                .subject(subject)
                 .issuedAt(now)
-                .expiration(expiryDate)
-                .signWith(getSigningKey())
-                .compact();
+                .expiration(expiryDate);
+
+        // Add library context if provided
+        if (libraryId != null) {
+            builder.claim("libraryId", libraryId);
+        }
+
+        return builder.signWith(getSigningKey()).compact();
     }

     public boolean validateToken(String token) {
@@ -62,4 +91,13 @@ public class JwtUtil {
     public String getSubjectFromToken(String token) {
         return getClaimsFromToken(token).getSubject();
     }
+
+    public String getLibraryIdFromToken(String token) {
+        try {
+            Claims claims = getClaimsFromToken(token);
+            return claims.get("libraryId", String.class);
+        } catch (Exception e) {
+            return null;
+        }
+    }
 }
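A short sketch of how the rotated-secret `JwtUtil` is meant to be used from a caller (the surrounding service class is hypothetical; the `JwtUtil` methods are the ones shown in this diff):

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

// Illustrative only: demonstrates the libraryId claim round-trip added above.
@Service
class AuthTokenSketch {

    private final JwtUtil jwtUtil;

    @Autowired
    AuthTokenSketch(JwtUtil jwtUtil) {
        this.jwtUtil = jwtUtil;
    }

    String issueAndInspect(String libraryId) {
        // Issue a token carrying the optional libraryId claim.
        String token = jwtUtil.generateToken("user", libraryId);

        // Because the signing secret is regenerated in @PostConstruct, tokens issued
        // before an application restart will fail validateToken() afterwards.
        if (jwtUtil.validateToken(token)) {
            return jwtUtil.getLibraryIdFromToken(token); // null when no claim was set
        }
        return null;
    }
}
```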
@@ -16,8 +16,8 @@ spring:

   servlet:
     multipart:
-      max-file-size: 5MB
-      max-request-size: 10MB
+      max-file-size: 256MB # Increased for backup restore
+      max-request-size: 260MB # Slightly higher to account for form data

 server:
   port: 8080
@@ -28,10 +28,10 @@ storycove:
   cors:
     allowed-origins: ${STORYCOVE_CORS_ALLOWED_ORIGINS:http://localhost:3000,http://localhost:6925}
   jwt:
-    secret: ${JWT_SECRET:default-secret-key}
+    secret: ${JWT_SECRET} # REQUIRED: Must be at least 32 characters, no default for security
     expiration: 86400000 # 24 hours
   auth:
-    password: ${APP_PASSWORD:admin}
+    password: ${APP_PASSWORD} # REQUIRED: No default password for security
   typesense:
     api-key: ${TYPESENSE_API_KEY:xyz}
     host: ${TYPESENSE_HOST:localhost}
@@ -43,5 +43,7 @@ storycove:

 logging:
   level:
-    com.storycove: DEBUG
-    org.springframework.security: DEBUG
+    com.storycove: ${LOG_LEVEL:INFO} # Use INFO for production, DEBUG for development
+    org.springframework.security: WARN # Reduce security logging
+    org.springframework.web: WARN
+    org.hibernate.SQL: ${SQL_LOG_LEVEL:WARN} # Control SQL logging separately
@@ -4,7 +4,7 @@
     "b", "strong", "i", "em", "u", "s", "strike", "del", "ins",
     "sup", "sub", "small", "big", "mark", "pre", "code", "kbd", "samp", "var",
     "ul", "ol", "li", "dl", "dt", "dd",
-    "a", "table", "thead", "tbody", "tfoot", "tr", "th", "td", "caption", "colgroup", "col",
+    "a", "img", "table", "thead", "tbody", "tfoot", "tr", "th", "td", "caption", "colgroup", "col",
     "blockquote", "cite", "q", "hr", "details", "summary"
   ],
   "allowedAttributes": {
@@ -17,7 +17,8 @@
     "h4": ["class", "style"],
     "h5": ["class", "style"],
     "h6": ["class", "style"],
-    "a": ["class"],
+    "a": ["class", "href", "title"],
+    "img": ["src", "alt", "width", "height", "class", "style"],
     "table": ["class", "style"],
     "th": ["class", "style", "colspan", "rowspan"],
     "td": ["class", "style", "colspan", "rowspan"],
@@ -38,8 +39,13 @@
     "font-weight", "font-style", "text-align", "text-decoration", "margin",
     "padding", "text-indent", "line-height"
   ],
-  "removedAttributes": {
-    "a": ["href", "target"]
+  "allowedProtocols": {
+    "a": {
+      "href": ["http", "https", "#", "/"]
+    },
+    "img": {
+      "src": ["http", "https", "data", "/", "cid"]
+    }
   },
   "description": "HTML sanitization configuration for StoryCove story content. This configuration is shared between frontend (DOMPurify) and backend (Jsoup) to ensure consistency."
 }
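The backend side of this shared configuration is applied with Jsoup; the actual sanitizer class is not part of this diff. A minimal Jsoup sketch that mirrors the new `img` and protocol entries (the exact Safelist wiring in StoryCove is an assumption here, and the "cid" and "/" entries are approximated via relative-link preservation) could look like:

```java
import org.jsoup.Jsoup;
import org.jsoup.safety.Safelist;

// Sketch only: one way to express the JSON config above as a Jsoup Safelist.
class SanitizerSketch {

    static String clean(String html) {
        Safelist safelist = Safelist.relaxed()
                // tags added in this change
                .addTags("img", "details", "summary", "mark")
                // richer link and image attributes
                .addAttributes("a", "class", "href", "title")
                .addAttributes("img", "src", "alt", "width", "height", "class", "style")
                // protocol allow-lists; "#" permits in-page anchors
                .addProtocols("a", "href", "http", "https", "#")
                .addProtocols("img", "src", "http", "https", "data")
                // keep site-relative URLs such as /images/... ("/" in the JSON above)
                .preserveRelativeLinks(true);

        return Jsoup.clean(html, "", safelist);
    }
}
```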
@@ -15,10 +15,12 @@ public abstract class BaseRepositoryTest {
     private static final PostgreSQLContainer<?> postgres;

     static {
-        postgres = new PostgreSQLContainer<>("postgres:15-alpine")
+        @SuppressWarnings("resource") // Container is managed by shutdown hook
+        PostgreSQLContainer<?> container = new PostgreSQLContainer<>("postgres:15-alpine")
                 .withDatabaseName("storycove_test")
                 .withUsername("test")
                 .withPassword("test");
+        postgres = container;
         postgres.start();

         // Add shutdown hook to properly close the container
|
|||||||
@@ -1,6 +1,7 @@
 package com.storycove.service;

 import com.storycove.entity.Author;
+import com.storycove.entity.Story;
 import com.storycove.repository.AuthorRepository;
 import com.storycove.service.exception.DuplicateResourceException;
 import com.storycove.service.exception.ResourceNotFoundException;
@@ -8,7 +9,6 @@ import org.junit.jupiter.api.BeforeEach;
 import org.junit.jupiter.api.DisplayName;
 import org.junit.jupiter.api.Test;
 import org.junit.jupiter.api.extension.ExtendWith;
-import org.mockito.InjectMocks;
 import org.mockito.Mock;
 import org.mockito.junit.jupiter.MockitoExtension;
 import org.springframework.data.domain.Page;
@@ -22,8 +22,8 @@ import java.util.UUID;

 import static org.junit.jupiter.api.Assertions.*;
 import static org.mockito.ArgumentMatchers.any;
-import static org.mockito.ArgumentMatchers.anyString;
 import static org.mockito.Mockito.*;
+import static org.mockito.Mockito.times;

 @ExtendWith(MockitoExtension.class)
 @DisplayName("Author Service Unit Tests")
|
|||||||
@Mock
|
@Mock
|
||||||
private AuthorRepository authorRepository;
|
private AuthorRepository authorRepository;
|
||||||
|
|
||||||
@InjectMocks
|
|
||||||
private AuthorService authorService;
|
private AuthorService authorService;
|
||||||
|
|
||||||
private Author testAuthor;
|
private Author testAuthor;
|
||||||
@@ -44,6 +43,9 @@ class AuthorServiceTest {
         testAuthor = new Author("Test Author");
         testAuthor.setId(testId);
         testAuthor.setNotes("Test notes");
+
+        // Initialize service with null TypesenseService (which is allowed for tests)
+        authorService = new AuthorService(authorRepository, null);
     }

     @Test
@@ -172,7 +174,7 @@ class AuthorServiceTest {
         when(authorRepository.existsByName("Updated Author")).thenReturn(false);
         when(authorRepository.save(any(Author.class))).thenReturn(testAuthor);

-        Author result = authorService.update(testId, updates);
+        authorService.update(testId, updates);

         assertEquals("Updated Author", testAuthor.getName());
         assertEquals("Updated notes", testAuthor.getNotes());
@@ -307,4 +309,133 @@ class AuthorServiceTest {
         assertEquals(5L, count);
         verify(authorRepository).countRecentAuthors(any(java.time.LocalDateTime.class));
     }
+
+    @Test
+    @DisplayName("Should set author rating with validation")
+    void shouldSetAuthorRating() {
+        when(authorRepository.findById(testId)).thenReturn(Optional.of(testAuthor));
+        when(authorRepository.save(any(Author.class))).thenReturn(testAuthor);
+
+        authorService.setRating(testId, 4);
+
+        assertEquals(4, testAuthor.getAuthorRating());
+        verify(authorRepository, times(2)).findById(testId); // Called twice: once initially, once after flush
+        verify(authorRepository).save(testAuthor);
+        verify(authorRepository).flush();
+    }
+
+    @Test
+    @DisplayName("Should throw exception for invalid rating range")
+    void shouldThrowExceptionForInvalidRating() {
+        assertThrows(IllegalArgumentException.class, () -> authorService.setRating(testId, 0));
+        assertThrows(IllegalArgumentException.class, () -> authorService.setRating(testId, 6));
+
+        verify(authorRepository, never()).findById(any());
+        verify(authorRepository, never()).save(any());
+    }
+
+    @Test
+    @DisplayName("Should handle null rating")
+    void shouldHandleNullRating() {
+        when(authorRepository.findById(testId)).thenReturn(Optional.of(testAuthor));
+        when(authorRepository.save(any(Author.class))).thenReturn(testAuthor);
+
+        authorService.setRating(testId, null);
+
+        assertNull(testAuthor.getAuthorRating());
+        verify(authorRepository, times(2)).findById(testId); // Called twice: once initially, once after flush
+        verify(authorRepository).save(testAuthor);
+    }
+
+    @Test
+    @DisplayName("Should find all authors with stories")
+    void shouldFindAllAuthorsWithStories() {
+        List<Author> authors = List.of(testAuthor);
+        when(authorRepository.findAll()).thenReturn(authors);
+
+        List<Author> result = authorService.findAllWithStories();
+
+        assertEquals(1, result.size());
+        verify(authorRepository).findAll();
+    }
+
+    @Test
+    @DisplayName("Should get author rating from database")
+    void shouldGetAuthorRatingFromDb() {
+        when(authorRepository.findAuthorRatingById(testId)).thenReturn(4);
+
+        Integer rating = authorService.getAuthorRatingFromDb(testId);
+
+        assertEquals(4, rating);
+        verify(authorRepository).findAuthorRatingById(testId);
+    }
+
+    @Test
+    @DisplayName("Should calculate average story rating")
+    void shouldCalculateAverageStoryRating() {
+        // Setup test author with stories
+        Story story1 = new Story("Story 1");
+        story1.setRating(4);
+        Story story2 = new Story("Story 2");
+        story2.setRating(5);
+
+        testAuthor.getStories().add(story1);
+        testAuthor.getStories().add(story2);
+
+        when(authorRepository.findById(testId)).thenReturn(Optional.of(testAuthor));
+
+        Double avgRating = authorService.calculateAverageStoryRating(testId);
+
+        assertEquals(4.5, avgRating);
+        verify(authorRepository).findById(testId);
+    }
+
+    @Test
+    @DisplayName("Should find authors with stories using repository method")
+    void shouldFindAuthorsWithStoriesFromRepository() {
+        List<Author> authors = List.of(testAuthor);
+        when(authorRepository.findAuthorsWithStories()).thenReturn(authors);
+
+        List<Author> result = authorService.findAuthorsWithStories();
+
+        assertEquals(1, result.size());
+        verify(authorRepository).findAuthorsWithStories();
+    }
+
+    @Test
+    @DisplayName("Should find top rated authors")
+    void shouldFindTopRatedAuthors() {
+        List<Author> authors = List.of(testAuthor);
+        when(authorRepository.findTopRatedAuthors()).thenReturn(authors);
+
+        List<Author> result = authorService.findTopRatedAuthors();
+
+        assertEquals(1, result.size());
+        verify(authorRepository).findTopRatedAuthors();
+    }
+
+    @Test
+    @DisplayName("Should find most prolific authors")
+    void shouldFindMostProlificAuthors() {
+        List<Author> authors = List.of(testAuthor);
+        when(authorRepository.findMostProlificAuthors()).thenReturn(authors);
+
+        List<Author> result = authorService.findMostProlificAuthors();
+
+        assertEquals(1, result.size());
+        verify(authorRepository).findMostProlificAuthors();
+    }
+
+    @Test
+    @DisplayName("Should find authors by URL domain")
+    void shouldFindAuthorsByUrlDomain() {
+        List<Author> authors = List.of(testAuthor);
+        when(authorRepository.findByUrlDomain("example.com")).thenReturn(authors);
+
+        List<Author> result = authorService.findByUrlDomain("example.com");
+
+        assertEquals(1, result.size());
+        verify(authorRepository).findByUrlDomain("example.com");
+    }
+
 }
@@ -0,0 +1,221 @@
package com.storycove.service;

import com.storycove.entity.Story;
import com.storycove.repository.ReadingPositionRepository;
import com.storycove.repository.StoryRepository;
import com.storycove.repository.TagRepository;
import com.storycove.service.exception.ResourceNotFoundException;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;

import java.time.LocalDateTime;
import java.util.Optional;
import java.util.UUID;

import static org.junit.jupiter.api.Assertions.*;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.*;

@ExtendWith(MockitoExtension.class)
@DisplayName("Story Service Unit Tests - Reading Progress")
class StoryServiceTest {

    @Mock
    private StoryRepository storyRepository;

    @Mock
    private TagRepository tagRepository;

    @Mock
    private ReadingPositionRepository readingPositionRepository;

    private StoryService storyService;
    private Story testStory;
    private UUID testId;

    @BeforeEach
    void setUp() {
        testId = UUID.randomUUID();
        testStory = new Story("Test Story");
        testStory.setId(testId);
        testStory.setContentHtml("<p>Test content for reading progress tracking</p>");

        // Create StoryService with only required repositories, all services can be null for these tests
        storyService = new StoryService(
                storyRepository,
                tagRepository,
                readingPositionRepository, // added for foreign key constraint handling
                null, // authorService - not needed for reading progress tests
                null, // tagService - not needed for reading progress tests
                null, // seriesService - not needed for reading progress tests
                null, // sanitizationService - not needed for reading progress tests
                null  // typesenseService - will test both with and without
        );
    }

    @Test
    @DisplayName("Should update reading progress successfully")
    void shouldUpdateReadingProgress() {
        Integer position = 150;
        when(storyRepository.findById(testId)).thenReturn(Optional.of(testStory));
        when(storyRepository.save(any(Story.class))).thenReturn(testStory);

        Story result = storyService.updateReadingProgress(testId, position);

        assertEquals(position, result.getReadingPosition());
        assertNotNull(result.getLastReadAt());
        verify(storyRepository).findById(testId);
        verify(storyRepository).save(testStory);
    }

    @Test
    @DisplayName("Should update reading progress with zero position")
    void shouldUpdateReadingProgressWithZeroPosition() {
        Integer position = 0;
        when(storyRepository.findById(testId)).thenReturn(Optional.of(testStory));
        when(storyRepository.save(any(Story.class))).thenReturn(testStory);

        Story result = storyService.updateReadingProgress(testId, position);

        assertEquals(0, result.getReadingPosition());
        assertNotNull(result.getLastReadAt());
        verify(storyRepository).save(testStory);
    }

    @Test
    @DisplayName("Should throw exception for negative reading position")
    void shouldThrowExceptionForNegativeReadingPosition() {
        Integer position = -1;

        assertThrows(IllegalArgumentException.class,
                () -> storyService.updateReadingProgress(testId, position));

        verify(storyRepository, never()).findById(any());
        verify(storyRepository, never()).save(any());
    }

    @Test
    @DisplayName("Should handle null reading position")
    void shouldHandleNullReadingPosition() {
        Integer position = null;
        when(storyRepository.findById(testId)).thenReturn(Optional.of(testStory));
        when(storyRepository.save(any(Story.class))).thenReturn(testStory);

        Story result = storyService.updateReadingProgress(testId, position);

        assertNull(result.getReadingPosition());
        assertNotNull(result.getLastReadAt());
        verify(storyRepository).save(testStory);
    }

    @Test
    @DisplayName("Should throw exception when story not found for reading progress update")
    void shouldThrowExceptionWhenStoryNotFoundForReadingProgress() {
        Integer position = 100;
        when(storyRepository.findById(testId)).thenReturn(Optional.empty());

        assertThrows(ResourceNotFoundException.class,
                () -> storyService.updateReadingProgress(testId, position));

        verify(storyRepository).findById(testId);
        verify(storyRepository, never()).save(any());
    }

    @Test
    @DisplayName("Should mark story as read")
    void shouldMarkStoryAsRead() {
        Boolean isRead = true;
        when(storyRepository.findById(testId)).thenReturn(Optional.of(testStory));
        when(storyRepository.save(any(Story.class))).thenReturn(testStory);

        Story result = storyService.updateReadingStatus(testId, isRead);

        assertTrue(result.getIsRead());
        assertNotNull(result.getLastReadAt());
        // When marked as read, position should be set to content length
        assertTrue(result.getReadingPosition() > 0);
        verify(storyRepository).findById(testId);
        verify(storyRepository).save(testStory);
    }

    @Test
    @DisplayName("Should mark story as unread")
    void shouldMarkStoryAsUnread() {
        Boolean isRead = false;
        // First mark story as read to test transition
        testStory.markAsRead();

        when(storyRepository.findById(testId)).thenReturn(Optional.of(testStory));
        when(storyRepository.save(any(Story.class))).thenReturn(testStory);

        Story result = storyService.updateReadingStatus(testId, isRead);

        assertFalse(result.getIsRead());
        assertNotNull(result.getLastReadAt());
        verify(storyRepository).save(testStory);
    }

    @Test
    @DisplayName("Should handle null reading status")
    void shouldHandleNullReadingStatus() {
        Boolean isRead = null;
        when(storyRepository.findById(testId)).thenReturn(Optional.of(testStory));
        when(storyRepository.save(any(Story.class))).thenReturn(testStory);

        Story result = storyService.updateReadingStatus(testId, isRead);

        assertFalse(result.getIsRead());
        assertNotNull(result.getLastReadAt());
        verify(storyRepository).save(testStory);
    }

    @Test
    @DisplayName("Should throw exception when story not found for reading status update")
    void shouldThrowExceptionWhenStoryNotFoundForReadingStatus() {
        Boolean isRead = true;
        when(storyRepository.findById(testId)).thenReturn(Optional.empty());

        assertThrows(ResourceNotFoundException.class,
                () -> storyService.updateReadingStatus(testId, isRead));

        verify(storyRepository).findById(testId);
        verify(storyRepository, never()).save(any());
    }

    @Test
    @DisplayName("Should update lastReadAt timestamp when updating progress")
    void shouldUpdateLastReadAtWhenUpdatingProgress() {
        Integer position = 50;
        LocalDateTime beforeUpdate = LocalDateTime.now().minusMinutes(1);

        when(storyRepository.findById(testId)).thenReturn(Optional.of(testStory));
        when(storyRepository.save(any(Story.class))).thenReturn(testStory);

        Story result = storyService.updateReadingProgress(testId, position);

        assertNotNull(result.getLastReadAt());
        assertTrue(result.getLastReadAt().isAfter(beforeUpdate));
        verify(storyRepository).save(testStory);
    }

    @Test
    @DisplayName("Should update lastReadAt timestamp when updating status")
    void shouldUpdateLastReadAtWhenUpdatingStatus() {
        Boolean isRead = true;
        LocalDateTime beforeUpdate = LocalDateTime.now().minusMinutes(1);

        when(storyRepository.findById(testId)).thenReturn(Optional.of(testStory));
        when(storyRepository.save(any(Story.class))).thenReturn(testStory);

        Story result = storyService.updateReadingStatus(testId, isRead);

        assertNotNull(result.getLastReadAt());
        assertTrue(result.getLastReadAt().isAfter(beforeUpdate));
        verify(storyRepository).save(testStory);
    }
}
7  backend/test-fixed-export.epub  (new file)
@@ -0,0 +1,7 @@
<html>
<head><title>502 Bad Gateway</title></head>
<body>
<center><h1>502 Bad Gateway</h1></center>
<hr><center>nginx/1.29.0</center>
</body>
</html>
4  cookies.txt  (new file)
@@ -0,0 +1,4 @@
# Netscape HTTP Cookie File
# https://curl.se/docs/http-cookies.html
# This file was generated by libcurl! Edit at your own risk.

@@ -42,6 +42,7 @@ services:
       - STORYCOVE_CORS_ALLOWED_ORIGINS=${STORYCOVE_CORS_ALLOWED_ORIGINS:-http://localhost:3000,http://localhost:6925}
     volumes:
       - images_data:/app/images
+      - library_config:/app/config
     depends_on:
       - postgres
       - typesense
@@ -51,6 +52,8 @@ services:
   postgres:
     image: postgres:15-alpine
     # No port mapping - only accessible within the Docker network
+    ports:
+      - "5432:5432"
     environment:
       - POSTGRES_DB=storycove
       - POSTGRES_USER=storycove
@@ -61,7 +64,7 @@ services:
       - storycove-network

   typesense:
-    image: typesense/typesense:0.25.0
+    image: typesense/typesense:29.0
     # No port mapping - only accessible within the Docker network
     environment:
       - TYPESENSE_API_KEY=${TYPESENSE_API_KEY}
@@ -75,6 +78,7 @@ volumes:
   postgres_data:
   typesense_data:
   images_data:
+  library_config:

 configs:
   nginx_config:
@@ -91,7 +95,7 @@ configs:
     }
     server {
         listen 80;
-        client_max_body_size 10M;
+        client_max_body_size 256M;
         location / {
             proxy_pass http://frontend;
             proxy_http_version 1.1;
@@ -1,13 +1,58 @@
-FROM node:18-alpine
+# Multi-stage build for better layer caching and smaller final image
+FROM node:18-alpine AS deps
 WORKDIR /app

-COPY package*.json ./
-RUN npm ci --omit=dev
+# Install dumb-init early
+RUN apk add --no-cache dumb-init
+
+# Copy package files first to leverage Docker layer caching
+COPY package*.json ./
+
+# Install dependencies with optimized settings
+RUN npm ci --prefer-offline --no-audit --frozen-lockfile
+
+# Build stage
+FROM node:18-alpine AS builder
+WORKDIR /app
+
+# Copy dependencies from deps stage
+COPY --from=deps /app/node_modules ./node_modules
 COPY . .
+
+# Set Node.js memory limit for build
+ENV NODE_OPTIONS="--max-old-space-size=1024"
+ENV NEXT_TELEMETRY_DISABLED=1
+
+# Build the application
 RUN npm run build
+
+# Production stage
+FROM node:18-alpine AS runner
+WORKDIR /app
+
+ENV NODE_ENV=production
+ENV NEXT_TELEMETRY_DISABLED=1
+
+# Install dumb-init for proper signal handling
+RUN apk add --no-cache dumb-init
+
+# Create non-root user for security
+RUN addgroup -g 1001 -S nodejs
+RUN adduser -S nextjs -u 1001
+
+# Copy necessary files from builder stage
+COPY --from=builder /app/next.config.js* ./
+COPY --from=builder /app/public ./public
+COPY --from=builder /app/package.json ./package.json
+
+# Copy built application
+COPY --from=builder --chown=nextjs:nodejs /app/.next/standalone ./
+COPY --from=builder --chown=nextjs:nodejs /app/.next/static ./.next/static
+
+USER nextjs

 EXPOSE 3000

-CMD ["npm", "start"]
+# Use dumb-init to handle signals properly
+ENTRYPOINT ["dumb-init", "--"]
+CMD ["node", "server.js"]
42  frontend/Dockerfile.alternative  (new file)
@@ -0,0 +1,42 @@
# Multi-stage build for better caching and smaller final image
FROM node:18-alpine AS dependencies

WORKDIR /app
COPY package*.json ./
RUN npm ci

FROM node:18-alpine AS builder

WORKDIR /app
COPY --from=dependencies /app/node_modules ./node_modules
COPY . .

# Increase memory limit for build
ENV NODE_OPTIONS="--max-old-space-size=2048"

RUN npm run build

FROM node:18-alpine AS runner

WORKDIR /app

# Install dumb-init
RUN apk add --no-cache dumb-init

# Create non-root user
RUN addgroup -g 1001 -S nodejs
RUN adduser -S nextjs -u 1001

# Copy necessary files
COPY --from=builder /app/public ./public
COPY --from=builder /app/.next/standalone ./
COPY --from=builder /app/.next/static ./.next/static

# Set correct permissions
RUN chown -R nextjs:nodejs /app
USER nextjs

EXPOSE 3000

ENTRYPOINT ["dumb-init", "--"]
CMD ["node", "server.js"]
@@ -1,12 +1,21 @@
 /** @type {import('next').NextConfig} */
 const nextConfig = {
-  async rewrites() {
-    return [
-      {
-        source: '/api/:path*',
-        destination: 'http://backend:8080/api/:path*',
-      },
-    ];
+  // Enable standalone output for optimized Docker builds
+  output: 'standalone',
+  // Removed Next.js rewrites since nginx handles all API routing
+  webpack: (config, { isServer }) => {
+    // Exclude cheerio and its dependencies from client-side bundling
+    if (!isServer) {
+      config.resolve.fallback = {
+        ...config.resolve.fallback,
+        fs: false,
+        net: false,
+        tls: false,
+        'undici': false,
+      };
+      config.externals.push('cheerio', 'server-only');
+    }
+    return config;
   },
   images: {
     domains: ['localhost'],
237  frontend/package-lock.json  (generated)
@@ -8,14 +8,17 @@
|
|||||||
"name": "storycove-frontend",
|
"name": "storycove-frontend",
|
||||||
"version": "0.1.0",
|
"version": "0.1.0",
|
||||||
"dependencies": {
|
"dependencies": {
|
||||||
|
"@heroicons/react": "^2.2.0",
|
||||||
"autoprefixer": "^10.4.16",
|
"autoprefixer": "^10.4.16",
|
||||||
"axios": "^1.6.0",
|
"axios": "^1.11.0",
|
||||||
"dompurify": "^3.0.5",
|
"cheerio": "^1.0.0-rc.12",
|
||||||
|
"dompurify": "^3.2.6",
|
||||||
"next": "14.0.0",
|
"next": "14.0.0",
|
||||||
"postcss": "^8.4.31",
|
"postcss": "^8.4.31",
|
||||||
"react": "^18",
|
"react": "^18",
|
||||||
"react-dom": "^18",
|
"react-dom": "^18",
|
||||||
"react-dropzone": "^14.2.3",
|
"react-dropzone": "^14.2.3",
|
||||||
|
"server-only": "^0.0.1",
|
||||||
"tailwindcss": "^3.3.0"
|
"tailwindcss": "^3.3.0"
|
||||||
},
|
},
|
||||||
"devDependencies": {
|
"devDependencies": {
|
||||||
@@ -137,6 +140,15 @@
|
|||||||
"node": "^12.22.0 || ^14.17.0 || >=16.0.0"
|
"node": "^12.22.0 || ^14.17.0 || >=16.0.0"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
|
"node_modules/@heroicons/react": {
|
||||||
|
"version": "2.2.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/@heroicons/react/-/react-2.2.0.tgz",
|
||||||
|
"integrity": "sha512-LMcepvRaS9LYHJGsF0zzmgKCUim/X3N/DQKc4jepAXJ7l8QxJ1PmxJzqplF2Z3FE4PqBAIGyJAQ/w4B5dsqbtQ==",
|
||||||
|
"license": "MIT",
|
||||||
|
"peerDependencies": {
|
||||||
|
"react": ">= 16 || ^19.0.0-rc"
|
||||||
|
}
|
||||||
|
},
|
||||||
"node_modules/@humanwhocodes/config-array": {
|
"node_modules/@humanwhocodes/config-array": {
|
||||||
"version": "0.13.0",
|
"version": "0.13.0",
|
||||||
"resolved": "https://registry.npmjs.org/@humanwhocodes/config-array/-/config-array-0.13.0.tgz",
|
"resolved": "https://registry.npmjs.org/@humanwhocodes/config-array/-/config-array-0.13.0.tgz",
|
||||||
@@ -1360,13 +1372,13 @@
|
|||||||
}
|
}
|
||||||
},
|
},
|
||||||
"node_modules/axios": {
|
"node_modules/axios": {
|
||||||
"version": "1.10.0",
|
"version": "1.11.0",
|
||||||
"resolved": "https://registry.npmjs.org/axios/-/axios-1.10.0.tgz",
|
"resolved": "https://registry.npmjs.org/axios/-/axios-1.11.0.tgz",
|
||||||
"integrity": "sha512-/1xYAC4MP/HEG+3duIhFr4ZQXR4sQXOIe+o6sdqzeykGLx6Upp/1p8MHqhINOvGeP7xyNHe7tsiJByc4SSVUxw==",
|
"integrity": "sha512-1Lx3WLFQWm3ooKDYZD1eXmoGO9fxYQjrycfHFC8P0sCfQVXyROp0p9PFWBehewBOdCwHc+f/b8I0fMto5eSfwA==",
|
||||||
"license": "MIT",
|
"license": "MIT",
|
||||||
"dependencies": {
|
"dependencies": {
|
||||||
"follow-redirects": "^1.15.6",
|
"follow-redirects": "^1.15.6",
|
||||||
"form-data": "^4.0.0",
|
"form-data": "^4.0.4",
|
||||||
"proxy-from-env": "^1.1.0"
|
"proxy-from-env": "^1.1.0"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
@@ -1398,6 +1410,12 @@
|
|||||||
"url": "https://github.com/sponsors/sindresorhus"
|
"url": "https://github.com/sponsors/sindresorhus"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
|
"node_modules/boolbase": {
|
||||||
|
"version": "1.0.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/boolbase/-/boolbase-1.0.0.tgz",
|
||||||
|
"integrity": "sha512-JZOSA7Mo9sNGB8+UjSgzdLtokWAky1zbztM3WRLCbZ70/3cTANmQmOdR7y2g+J0e2WXywy1yS468tY+IruqEww==",
|
||||||
|
"license": "ISC"
|
||||||
|
},
|
||||||
"node_modules/brace-expansion": {
|
"node_modules/brace-expansion": {
|
||||||
"version": "1.1.12",
|
"version": "1.1.12",
|
||||||
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.12.tgz",
|
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.12.tgz",
|
||||||
@@ -1569,6 +1587,44 @@
|
|||||||
"url": "https://github.com/chalk/chalk?sponsor=1"
|
"url": "https://github.com/chalk/chalk?sponsor=1"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
|
"node_modules/cheerio": {
|
||||||
|
"version": "1.0.0-rc.12",
|
||||||
|
"resolved": "https://registry.npmjs.org/cheerio/-/cheerio-1.0.0-rc.12.tgz",
|
||||||
|
"integrity": "sha512-VqR8m68vM46BNnuZ5NtnGBKIE/DfN0cRIzg9n40EIq9NOv90ayxLBXA8fXC5gquFRGJSTRqBq25Jt2ECLR431Q==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"cheerio-select": "^2.1.0",
|
||||||
|
"dom-serializer": "^2.0.0",
|
||||||
|
"domhandler": "^5.0.3",
|
||||||
|
"domutils": "^3.0.1",
|
||||||
|
"htmlparser2": "^8.0.1",
|
||||||
|
"parse5": "^7.0.0",
|
||||||
|
"parse5-htmlparser2-tree-adapter": "^7.0.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 6"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/cheeriojs/cheerio?sponsor=1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/cheerio-select": {
|
||||||
|
"version": "2.1.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/cheerio-select/-/cheerio-select-2.1.0.tgz",
|
||||||
|
"integrity": "sha512-9v9kG0LvzrlcungtnJtpGNxY+fzECQKhK4EGJX2vByejiMX84MFNQw4UxPJl3bFbTMw+Dfs37XaIkCwTZfLh4g==",
|
||||||
|
"license": "BSD-2-Clause",
|
||||||
|
"dependencies": {
|
||||||
|
"boolbase": "^1.0.0",
|
||||||
|
"css-select": "^5.1.0",
|
||||||
|
"css-what": "^6.1.0",
|
||||||
|
"domelementtype": "^2.3.0",
|
||||||
|
"domhandler": "^5.0.3",
|
||||||
|
"domutils": "^3.0.1"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/sponsors/fb55"
|
||||||
|
}
|
||||||
|
},
|
||||||
"node_modules/chokidar": {
|
"node_modules/chokidar": {
|
||||||
"version": "3.6.0",
|
"version": "3.6.0",
|
||||||
"resolved": "https://registry.npmjs.org/chokidar/-/chokidar-3.6.0.tgz",
|
"resolved": "https://registry.npmjs.org/chokidar/-/chokidar-3.6.0.tgz",
|
||||||
@@ -1671,6 +1727,34 @@
|
|||||||
"node": ">= 8"
|
"node": ">= 8"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
|
"node_modules/css-select": {
|
||||||
|
"version": "5.2.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/css-select/-/css-select-5.2.2.tgz",
|
||||||
|
"integrity": "sha512-TizTzUddG/xYLA3NXodFM0fSbNizXjOKhqiQQwvhlspadZokn1KDy0NZFS0wuEubIYAV5/c1/lAr0TaaFXEXzw==",
|
||||||
|
"license": "BSD-2-Clause",
|
||||||
|
"dependencies": {
|
||||||
|
"boolbase": "^1.0.0",
|
||||||
|
"css-what": "^6.1.0",
|
||||||
|
"domhandler": "^5.0.2",
|
||||||
|
"domutils": "^3.0.1",
|
||||||
|
"nth-check": "^2.0.1"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/sponsors/fb55"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/css-what": {
|
||||||
|
"version": "6.2.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/css-what/-/css-what-6.2.2.tgz",
|
||||||
|
"integrity": "sha512-u/O3vwbptzhMs3L1fQE82ZSLHQQfto5gyZzwteVIEyeaY5Fc7R4dapF/BvRoSYFeqfBk4m0V1Vafq5Pjv25wvA==",
|
||||||
|
"license": "BSD-2-Clause",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 6"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/sponsors/fb55"
|
||||||
|
}
|
||||||
|
},
|
||||||
"node_modules/cssesc": {
|
"node_modules/cssesc": {
|
||||||
"version": "3.0.0",
|
"version": "3.0.0",
|
||||||
"resolved": "https://registry.npmjs.org/cssesc/-/cssesc-3.0.0.tgz",
|
"resolved": "https://registry.npmjs.org/cssesc/-/cssesc-3.0.0.tgz",
|
||||||
@@ -1859,6 +1943,47 @@
|
|||||||
"node": ">=6.0.0"
|
"node": ">=6.0.0"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
|
"node_modules/dom-serializer": {
|
||||||
|
"version": "2.0.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/dom-serializer/-/dom-serializer-2.0.0.tgz",
|
||||||
|
"integrity": "sha512-wIkAryiqt/nV5EQKqQpo3SToSOV9J0DnbJqwK7Wv/Trc92zIAYZ4FlMu+JPFW1DfGFt81ZTCGgDEabffXeLyJg==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"domelementtype": "^2.3.0",
|
||||||
|
"domhandler": "^5.0.2",
|
||||||
|
"entities": "^4.2.0"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/cheeriojs/dom-serializer?sponsor=1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/domelementtype": {
|
||||||
|
"version": "2.3.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/domelementtype/-/domelementtype-2.3.0.tgz",
|
||||||
|
"integrity": "sha512-OLETBj6w0OsagBwdXnPdN0cnMfF9opN69co+7ZrbfPGrdpPVNBUj02spi6B1N7wChLQiPn4CSH/zJvXw56gmHw==",
|
||||||
|
"funding": [
|
||||||
|
{
|
||||||
|
"type": "github",
|
||||||
|
"url": "https://github.com/sponsors/fb55"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"license": "BSD-2-Clause"
|
||||||
|
},
|
||||||
|
"node_modules/domhandler": {
|
||||||
|
"version": "5.0.3",
|
||||||
|
"resolved": "https://registry.npmjs.org/domhandler/-/domhandler-5.0.3.tgz",
|
||||||
|
"integrity": "sha512-cgwlv/1iFQiFnU96XXgROh8xTeetsnJiDsTc7TYCLFd9+/WNkIqPTxiM/8pSd8VIrhXGTf1Ny1q1hquVqDJB5w==",
|
||||||
|
"license": "BSD-2-Clause",
|
||||||
|
"dependencies": {
|
||||||
|
"domelementtype": "^2.3.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 4"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/fb55/domhandler?sponsor=1"
|
||||||
|
}
|
||||||
|
},
|
||||||
"node_modules/dompurify": {
|
"node_modules/dompurify": {
|
||||||
"version": "3.2.6",
|
"version": "3.2.6",
|
||||||
"resolved": "https://registry.npmjs.org/dompurify/-/dompurify-3.2.6.tgz",
|
"resolved": "https://registry.npmjs.org/dompurify/-/dompurify-3.2.6.tgz",
|
||||||
@@ -1868,6 +1993,20 @@
|
|||||||
"@types/trusted-types": "^2.0.7"
|
"@types/trusted-types": "^2.0.7"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
|
"node_modules/domutils": {
|
||||||
|
"version": "3.2.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/domutils/-/domutils-3.2.2.tgz",
|
||||||
|
"integrity": "sha512-6kZKyUajlDuqlHKVX1w7gyslj9MPIXzIFiz/rGu35uC1wMi+kMhQwGhl4lt9unC9Vb9INnY9Z3/ZA3+FhASLaw==",
|
||||||
|
"license": "BSD-2-Clause",
|
||||||
|
"dependencies": {
|
||||||
|
"dom-serializer": "^2.0.0",
|
||||||
|
"domelementtype": "^2.3.0",
|
||||||
|
"domhandler": "^5.0.3"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/fb55/domutils?sponsor=1"
|
||||||
|
}
|
||||||
|
},
|
||||||
"node_modules/dunder-proto": {
|
"node_modules/dunder-proto": {
|
||||||
"version": "1.0.1",
|
"version": "1.0.1",
|
||||||
"resolved": "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz",
|
"resolved": "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz",
|
||||||
@@ -1900,6 +2039,18 @@
|
|||||||
"integrity": "sha512-L18DaJsXSUk2+42pv8mLs5jJT2hqFkFE4j21wOmgbUqsZ2hL72NsUU785g9RXgo3s0ZNgVl42TiHp3ZtOv/Vyg==",
|
"integrity": "sha512-L18DaJsXSUk2+42pv8mLs5jJT2hqFkFE4j21wOmgbUqsZ2hL72NsUU785g9RXgo3s0ZNgVl42TiHp3ZtOv/Vyg==",
|
||||||
"license": "MIT"
|
"license": "MIT"
|
||||||
},
|
},
|
||||||
|
"node_modules/entities": {
|
||||||
|
"version": "4.5.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/entities/-/entities-4.5.0.tgz",
|
||||||
|
"integrity": "sha512-V0hjH4dGPh9Ao5p0MoRY6BVqtwCjhz6vI5LT8AJ55H+4g9/4vbHx1I54fS0XuclLhDHArPQCiMjDxjaL8fPxhw==",
|
||||||
|
"license": "BSD-2-Clause",
|
||||||
|
"engines": {
|
||||||
|
"node": ">=0.12"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/fb55/entities?sponsor=1"
|
||||||
|
}
|
||||||
|
},
|
||||||
"node_modules/es-abstract": {
|
"node_modules/es-abstract": {
|
||||||
"version": "1.24.0",
|
"version": "1.24.0",
|
||||||
"resolved": "https://registry.npmjs.org/es-abstract/-/es-abstract-1.24.0.tgz",
|
"resolved": "https://registry.npmjs.org/es-abstract/-/es-abstract-1.24.0.tgz",
|
||||||
@@ -3096,6 +3247,25 @@
|
|||||||
"node": ">= 0.4"
|
"node": ">= 0.4"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
|
"node_modules/htmlparser2": {
|
||||||
|
"version": "8.0.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/htmlparser2/-/htmlparser2-8.0.2.tgz",
|
||||||
|
"integrity": "sha512-GYdjWKDkbRLkZ5geuHs5NY1puJ+PXwP7+fHPRz06Eirsb9ugf6d8kkXav6ADhcODhFFPMIXyxkxSuMf3D6NCFA==",
|
||||||
|
"funding": [
|
||||||
|
"https://github.com/fb55/htmlparser2?sponsor=1",
|
||||||
|
{
|
||||||
|
"type": "github",
|
||||||
|
"url": "https://github.com/sponsors/fb55"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"domelementtype": "^2.3.0",
|
||||||
|
"domhandler": "^5.0.3",
|
||||||
|
"domutils": "^3.0.1",
|
||||||
|
"entities": "^4.4.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
"node_modules/ignore": {
|
"node_modules/ignore": {
|
||||||
"version": "5.3.2",
|
"version": "5.3.2",
|
||||||
"resolved": "https://registry.npmjs.org/ignore/-/ignore-5.3.2.tgz",
|
"resolved": "https://registry.npmjs.org/ignore/-/ignore-5.3.2.tgz",
|
||||||
@@ -4063,6 +4233,18 @@
|
|||||||
"node": ">=0.10.0"
|
"node": ">=0.10.0"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
|
"node_modules/nth-check": {
|
||||||
|
"version": "2.1.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/nth-check/-/nth-check-2.1.1.tgz",
|
||||||
|
"integrity": "sha512-lqjrjmaOoAnWfMmBPL+XNnynZh2+swxiX3WUE0s4yEHI6m+AwrK2UZOimIRl3X/4QctVqS8AiZjFqyOGrMXb/w==",
|
||||||
|
"license": "BSD-2-Clause",
|
||||||
|
"dependencies": {
|
||||||
|
"boolbase": "^1.0.0"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/fb55/nth-check?sponsor=1"
|
||||||
|
}
|
||||||
|
},
|
||||||
"node_modules/object-assign": {
|
"node_modules/object-assign": {
|
||||||
"version": "4.1.1",
|
"version": "4.1.1",
|
||||||
"resolved": "https://registry.npmjs.org/object-assign/-/object-assign-4.1.1.tgz",
|
"resolved": "https://registry.npmjs.org/object-assign/-/object-assign-4.1.1.tgz",
|
||||||
@@ -4291,6 +4473,43 @@
|
|||||||
"node": ">=6"
|
"node": ">=6"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
|
"node_modules/parse5": {
|
||||||
|
"version": "7.3.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/parse5/-/parse5-7.3.0.tgz",
|
||||||
|
"integrity": "sha512-IInvU7fabl34qmi9gY8XOVxhYyMyuH2xUNpb2q8/Y+7552KlejkRvqvD19nMoUW/uQGGbqNpA6Tufu5FL5BZgw==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"entities": "^6.0.0"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/inikulin/parse5?sponsor=1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/parse5-htmlparser2-tree-adapter": {
|
||||||
|
"version": "7.1.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/parse5-htmlparser2-tree-adapter/-/parse5-htmlparser2-tree-adapter-7.1.0.tgz",
|
||||||
|
"integrity": "sha512-ruw5xyKs6lrpo9x9rCZqZZnIUntICjQAd0Wsmp396Ul9lN/h+ifgVV1x1gZHi8euej6wTfpqX8j+BFQxF0NS/g==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"domhandler": "^5.0.3",
|
||||||
|
"parse5": "^7.0.0"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/inikulin/parse5?sponsor=1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/parse5/node_modules/entities": {
|
||||||
|
"version": "6.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/entities/-/entities-6.0.1.tgz",
|
||||||
|
"integrity": "sha512-aN97NXWF6AWBTahfVOIrB/NShkzi5H7F9r1s9mD3cDj4Ko5f2qhhVoYMibXF7GlLveb/D2ioWay8lxI97Ven3g==",
|
||||||
|
"license": "BSD-2-Clause",
|
||||||
|
"engines": {
|
||||||
|
"node": ">=0.12"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/fb55/entities?sponsor=1"
|
||||||
|
}
|
||||||
|
},
|
||||||
"node_modules/path-exists": {
|
"node_modules/path-exists": {
|
||||||
"version": "4.0.0",
|
"version": "4.0.0",
|
||||||
"resolved": "https://registry.npmjs.org/path-exists/-/path-exists-4.0.0.tgz",
|
"resolved": "https://registry.npmjs.org/path-exists/-/path-exists-4.0.0.tgz",
|
||||||
@@ -4843,6 +5062,12 @@
|
|||||||
"node": ">=10"
|
"node": ">=10"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
|
"node_modules/server-only": {
|
||||||
|
"version": "0.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/server-only/-/server-only-0.0.1.tgz",
|
||||||
|
"integrity": "sha512-qepMx2JxAa5jjfzxG79yPPq+8BuFToHd1hm7kI+Z4zAq1ftQiP7HcxMhDDItrbtwVeLg/cY2JnKnrcFkmiswNA==",
|
||||||
|
"license": "MIT"
|
||||||
|
},
|
||||||
"node_modules/set-function-length": {
|
"node_modules/set-function-length": {
|
||||||
"version": "1.2.2",
|
"version": "1.2.2",
|
||||||
"resolved": "https://registry.npmjs.org/set-function-length/-/set-function-length-1.2.2.tgz",
|
"resolved": "https://registry.npmjs.org/set-function-length/-/set-function-length-1.2.2.tgz",
|
||||||
|
|||||||
@@ -10,23 +10,26 @@
|
|||||||
"type-check": "tsc --noEmit"
|
"type-check": "tsc --noEmit"
|
||||||
},
|
},
|
||||||
"dependencies": {
|
"dependencies": {
|
||||||
|
"@heroicons/react": "^2.2.0",
|
||||||
|
"autoprefixer": "^10.4.16",
|
||||||
|
"axios": "^1.11.0",
|
||||||
|
"cheerio": "^1.0.0-rc.12",
|
||||||
|
"dompurify": "^3.2.6",
|
||||||
"next": "14.0.0",
|
"next": "14.0.0",
|
||||||
|
"postcss": "^8.4.31",
|
||||||
"react": "^18",
|
"react": "^18",
|
||||||
"react-dom": "^18",
|
"react-dom": "^18",
|
||||||
"axios": "^1.6.0",
|
|
||||||
"dompurify": "^3.0.5",
|
|
||||||
"react-dropzone": "^14.2.3",
|
"react-dropzone": "^14.2.3",
|
||||||
"tailwindcss": "^3.3.0",
|
"server-only": "^0.0.1",
|
||||||
"autoprefixer": "^10.4.16",
|
"tailwindcss": "^3.3.0"
|
||||||
"postcss": "^8.4.31"
|
|
||||||
},
|
},
|
||||||
"devDependencies": {
|
"devDependencies": {
|
||||||
"typescript": "^5",
|
"@types/dompurify": "^3.0.5",
|
||||||
"@types/node": "^20",
|
"@types/node": "^20",
|
||||||
"@types/react": "^18",
|
"@types/react": "^18",
|
||||||
"@types/react-dom": "^18",
|
"@types/react-dom": "^18",
|
||||||
"@types/dompurify": "^3.0.5",
|
|
||||||
"eslint": "^8",
|
"eslint": "^8",
|
||||||
"eslint-config-next": "14.0.0"
|
"eslint-config-next": "14.0.0",
|
||||||
|
"typescript": "^5"
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
AddStoryPage

@@ -1,32 +1,177 @@
Swap the useRef, AppLayout and { storyApi } imports for useEffect, useSearchParams, useAuth, ImportLayout, AuthorSelector, SeriesSelector and { storyApi, authorApi }. Add authorId and seriesId to the form state, plus processingImages, duplicateWarning and checkingDuplicates state. Add useEffect hooks that pre-fill the author from an authorId URL parameter, load story data handed over from the URL importer (from=url-import) or from a pending bulk-combine story in localStorage, and run a 500 ms debounced duplicate check through storyApi.checkDuplicate for authenticated users once title and author are filled in.
@@ -53,6 +198,32 @@ export default function AddStoryPage() {
Add handleAuthorChange and handleSeriesChange handlers that store both the entered name and the selected id (undefined when creating a new author/series) and clear the related validation error.
@@ -80,6 +251,25 @@ export default function AddStoryPage() {
Add a hasExternalImages helper that parses the content HTML and reports whether any <img> src points at an http:// or https:// URL.
@@ -97,19 +287,57 @@ export default function AddStoryPage() {
In handleSubmit, send authorId/seriesId when an existing author or series is selected, falling back to authorName/seriesName for new ones. After storyApi.createStory, if the content contains external images, call storyApi.processContentImages, persist the rewritten content via storyApi.updateStory, and log downloaded images and warnings; image-processing failures no longer abort the save. Redirect to /stories/{id}/detail instead of /stories/{id}.
@@ -120,14 +348,16 @@ export default function AddStoryPage() {
Replace the AppLayout wrapper and page heading with ImportLayout (title "Add New Story", description "Add a story to your personal collection") and show a success banner when errors.success is set.
@@ -140,16 +370,56 @@ export default function AddStoryPage() {
Replace the author Input with AuthorSelector wired to handleAuthorChange, add a duplicate-warning panel listing existing stories with the same title and author, and show a "Checking for duplicates..." spinner while the check runs.
@@ -173,7 +443,7 @@ export default function AddStoryPage() {
Restrict the cover ImageUpload accept list to "image/jpeg,image/png" (drops image/webp).
@@ -190,7 +460,11 @@ export default function AddStoryPage() {
Pass enableImageProcessing={false} to RichTextEditor and add a tip that pasted images are downloaded and stored locally when the story is saved.
@@ -207,12 +481,13 @@ export default function AddStoryPage() {
Replace the series Input with SeriesSelector wired to handleSeriesChange and the selected authorId.
@@ -235,6 +510,18 @@ export default function AddStoryPage() {
Add a "Processing and downloading images..." indicator shown while processingImages is true.
@@ -258,11 +545,10 @@ export default function AddStoryPage() {
Show "Processing Images..." on the submit button during image processing and close the page with ImportLayout instead of AppLayout.
EditAuthorPage

@@ -269,7 +269,7 @@ export default function EditAuthorPage() {
           </label>
           <ImageUpload
             onImageSelect={setAvatarImage}
-            accept="image/jpeg,image/png,image/webp"
+            accept="image/jpeg,image/png"
             maxSizeMB={5}
             aspectRatio="1:1"
             placeholder="Drop an avatar image here or click to select"
AuthorDetailPage

@@ -207,9 +207,14 @@ export default function AuthorDetailPage() {
         <div className="lg:col-span-2 space-y-6">
           <div className="flex items-center justify-between">
             <h2 className="text-2xl font-semibold theme-header">Stories</h2>
+            <div className="flex items-center gap-4">
               <p className="theme-text">
                 {stories.length} {stories.length === 1 ? 'story' : 'stories'}
               </p>
+              <Button href={`/add-story?authorId=${authorId}`}>
+                Add Story
+              </Button>
+            </div>
           </div>

           {stories.length === 0 ? (
AuthorsPage

@@ -14,6 +14,7 @@ export default function AuthorsPage() {
   const [authors, setAuthors] = useState<Author[]>([]);
   const [filteredAuthors, setFilteredAuthors] = useState<Author[]>([]);
   const [loading, setLoading] = useState(true);
+  const [searchLoading, setSearchLoading] = useState(false);
   const [searchQuery, setSearchQuery] = useState('');
   const [viewMode, setViewMode] = useState<'grid' | 'list'>('grid');
   const [sortBy, setSortBy] = useState('name');
@@ -24,9 +25,16 @@
   const ITEMS_PER_PAGE = 50; // Safe limit under Typesense's 250 limit

   useEffect(() => {
+    const debounceTimer = setTimeout(() => {
     const loadAuthors = async () => {
       try {
+        // Use searchLoading for background search, loading only for initial load
+        const isInitialLoad = authors.length === 0 && !searchQuery && currentPage === 0;
+        if (isInitialLoad) {
         setLoading(true);
+        } else {
+          setSearchLoading(true);
+        }
         const searchResults = await authorApi.searchAuthorsTypesense({
           q: searchQuery || '*',
           page: currentPage,
@@ -64,10 +72,14 @@
       }
     } finally {
       setLoading(false);
+      setSearchLoading(false);
     }
   };

   loadAuthors();
+    }, searchQuery ? 500 : 0); // 500ms debounce for search, immediate for other changes
+
+    return () => clearTimeout(debounceTimer);
   }, [searchQuery, sortBy, sortOrder, currentPage]);

   // Reset pagination when search or sort changes
@@ -133,13 +145,18 @@

       {/* Search and Sort Controls */}
       <div className="flex flex-col md:flex-row gap-4">
-        <div className="flex-1 max-w-md">
+        <div className="flex-1 max-w-md relative">
           <Input
             type="search"
             placeholder="Search authors..."
             value={searchQuery}
             onChange={(e) => setSearchQuery(e.target.value)}
           />
+          {searchLoading && (
+            <div className="absolute right-3 top-1/2 transform -translate-y-1/2">
+              <div className="animate-spin h-4 w-4 border-2 border-theme-accent border-t-transparent rounded-full"></div>
+            </div>
+          )}
         </div>

         <div className="flex gap-2">
CollectionsPage

@@ -26,19 +26,27 @@ export default function CollectionsPage() {
Drop the standalone tag-loading effect (tagApi.getTags with a 1000-item page) in favour of an extractTagsFromResults helper that counts tagNames across the current collection results and returns { id: tagName, name: tagName, collectionCount } entries.
@@ -55,9 +63,14 @@ export default function CollectionsPage() {
After each collection search, keep the results in currentCollections and rebuild the tag filter from extractTagsFromResults(currentCollections), so the tags always reflect the current result set.
@@ -223,6 +236,7 @@ export default function CollectionsPage() {
Pass showCollectionCount={true} to the tag filter component.
Global stylesheet (reading-content / editor-content styles)

@@ -85,13 +85,28 @@
Replace the single grouped .reading-content h1–h6 rule with per-level rules: h1 text-2xl bold, h2 text-xl bold, h3 text-lg semibold, h4 text-base semibold, h5 text-sm semibold, and h6 text-xs semibold uppercase tracking-wide, each with theme-header and stepped top/bottom margins.
@@ -118,4 +133,107 @@
Add image rules for .reading-content (block, centred, rounded, shadowed, max-height 80vh; img[align="left"] and img[align="right"] float with max-width 50%; img[align="center"] centred), a parallel .editor-content rule set mirroring the reading styles for headings, paragraphs, blockquotes, lists, strong/em and images (editor images capped at 60vh with a border), and an .image-processing-placeholder class with an animated pulse, dashed border, min-height 200px and a "🖼️ Processing image..." ::before label.
380
frontend/src/app/import/bulk/page.tsx
Normal file
380
frontend/src/app/import/bulk/page.tsx
Normal file
@@ -0,0 +1,380 @@
|
|||||||
|
'use client';
|
||||||
|
|
||||||
|
import { useState } from 'react';
|
||||||
|
import { useRouter } from 'next/navigation';
|
||||||
|
import Link from 'next/link';
|
||||||
|
import BulkImportProgress from '@/components/BulkImportProgress';
|
||||||
|
import ImportLayout from '@/components/layout/ImportLayout';
|
||||||
|
import Button from '@/components/ui/Button';
|
||||||
|
import { Textarea } from '@/components/ui/Input';
|
||||||
|
|
||||||
|
interface ImportResult {
|
||||||
|
url: string;
|
||||||
|
status: 'imported' | 'skipped' | 'error';
|
||||||
|
reason?: string;
|
||||||
|
title?: string;
|
||||||
|
author?: string;
|
||||||
|
error?: string;
|
||||||
|
storyId?: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface BulkImportResponse {
|
||||||
|
results: ImportResult[];
|
||||||
|
summary: {
|
||||||
|
total: number;
|
||||||
|
imported: number;
|
||||||
|
skipped: number;
|
||||||
|
errors: number;
|
||||||
|
};
|
||||||
|
combinedStory?: {
|
||||||
|
title: string;
|
||||||
|
author: string;
|
||||||
|
content: string;
|
||||||
|
summary?: string;
|
||||||
|
sourceUrl: string;
|
||||||
|
tags?: string[];
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
export default function BulkImportPage() {
|
||||||
|
const router = useRouter();
|
||||||
|
const [urls, setUrls] = useState('');
|
||||||
|
const [combineIntoOne, setCombineIntoOne] = useState(false);
|
||||||
|
const [isLoading, setIsLoading] = useState(false);
|
||||||
|
const [results, setResults] = useState<BulkImportResponse | null>(null);
|
||||||
|
const [error, setError] = useState<string | null>(null);
|
||||||
|
const [sessionId, setSessionId] = useState<string | null>(null);
|
||||||
|
const [showProgress, setShowProgress] = useState(false);
|
||||||
|
|
||||||
|
const handleSubmit = async (e: React.FormEvent) => {
|
||||||
|
e.preventDefault();
|
||||||
|
|
||||||
|
if (!urls.trim()) {
|
||||||
|
setError('Please enter at least one URL');
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
setIsLoading(true);
|
||||||
|
setError(null);
|
||||||
|
setResults(null);
|
||||||
|
|
||||||
|
try {
|
||||||
|
// Parse URLs from textarea (one per line)
|
||||||
|
const urlList = urls
|
||||||
|
.split('\n')
|
||||||
|
.map(url => url.trim())
|
||||||
|
.filter(url => url.length > 0);
|
||||||
|
|
||||||
|
if (urlList.length === 0) {
|
||||||
|
setError('Please enter at least one valid URL');
|
||||||
|
setIsLoading(false);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (urlList.length > 200) {
|
||||||
|
setError('Maximum 200 URLs allowed per bulk import');
|
||||||
|
setIsLoading(false);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Generate session ID for progress tracking
|
||||||
|
const newSessionId = `bulk-import-${Date.now()}-${Math.random().toString(36).substr(2, 9)}`;
|
||||||
|
setSessionId(newSessionId);
|
||||||
|
setShowProgress(true);
|
||||||
|
|
||||||
|
// Get auth token for server-side API calls
|
||||||
|
const token = localStorage.getItem('auth-token');
|
||||||
|
|
||||||
|
const response = await fetch('/scrape/bulk', {
|
||||||
|
method: 'POST',
|
||||||
|
headers: {
|
||||||
|
'Content-Type': 'application/json',
|
||||||
|
'Authorization': token ? `Bearer ${token}` : '',
|
||||||
|
},
|
||||||
|
body: JSON.stringify({ urls: urlList, combineIntoOne, sessionId: newSessionId }),
|
||||||
|
});
|
||||||
|
|
||||||
|
if (!response.ok) {
|
||||||
|
const errorData = await response.json();
|
||||||
|
throw new Error(errorData.error || 'Failed to start bulk import');
|
||||||
|
}
|
||||||
|
|
||||||
|
const startData = await response.json();
|
||||||
|
console.log('Bulk import started:', startData);
|
||||||
|
|
||||||
|
// The progress component will handle the rest via SSE
|
||||||
|
|
||||||
|
} catch (err) {
|
||||||
|
console.error('Bulk import error:', err);
|
||||||
|
setError(err instanceof Error ? err.message : 'Failed to import stories');
|
||||||
|
} finally {
|
||||||
|
setIsLoading(false);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleReset = () => {
|
||||||
|
setUrls('');
|
||||||
|
setCombineIntoOne(false);
|
||||||
|
setResults(null);
|
||||||
|
setError(null);
|
||||||
|
setSessionId(null);
|
||||||
|
setShowProgress(false);
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleProgressComplete = (data?: any) => {
|
||||||
|
// Progress component will handle this when the operation completes
|
||||||
|
setShowProgress(false);
|
||||||
|
setIsLoading(false);
|
||||||
|
|
||||||
|
// Handle completion data
|
||||||
|
if (data) {
|
||||||
|
if (data.combinedStory && combineIntoOne) {
|
||||||
|
// For combine mode, redirect to import page with the combined content
|
||||||
|
localStorage.setItem('pendingStory', JSON.stringify(data.combinedStory));
|
||||||
|
router.push('/add-story?from=bulk-combine');
|
||||||
|
return;
|
||||||
|
} else if (data.results && data.summary) {
|
||||||
|
// For individual mode, show the results
|
||||||
|
setResults({
|
||||||
|
results: data.results,
|
||||||
|
summary: data.summary
|
||||||
|
});
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Fallback: just hide progress and let user know it completed
|
||||||
|
console.log('Import completed successfully');
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleProgressError = (errorMessage: string) => {
|
||||||
|
setError(errorMessage);
|
||||||
|
setIsLoading(false);
|
||||||
|
setShowProgress(false);
|
||||||
|
};
|
||||||
|
|
||||||
|
const getStatusColor = (status: string) => {
|
||||||
|
switch (status) {
|
||||||
|
case 'imported': return 'text-green-700 bg-green-50 border-green-200';
|
||||||
|
case 'skipped': return 'text-yellow-700 bg-yellow-50 border-yellow-200';
|
||||||
|
case 'error': return 'text-red-700 bg-red-50 border-red-200';
|
||||||
|
default: return 'text-gray-700 bg-gray-50 border-gray-200';
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const getStatusIcon = (status: string) => {
|
||||||
|
switch (status) {
|
||||||
|
case 'imported': return '✓';
|
||||||
|
case 'skipped': return '⚠';
|
||||||
|
case 'error': return '✗';
|
||||||
|
default: return '';
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
return (
|
||||||
|
<ImportLayout
|
||||||
|
title="Bulk Import Stories"
|
||||||
|
description="Import multiple stories at once by providing a list of URLs"
|
||||||
|
>
|
||||||
|
|
||||||
|
{!results ? (
|
||||||
|
// Import Form
|
||||||
|
<form onSubmit={handleSubmit} className="space-y-6">
|
||||||
|
<div>
|
||||||
|
<label htmlFor="urls" className="block text-sm font-medium theme-header mb-2">
|
||||||
|
Story URLs
|
||||||
|
</label>
|
||||||
|
<p className="text-sm theme-text mb-3">
|
||||||
|
Enter one URL per line. Maximum 200 URLs per import.
|
||||||
|
</p>
|
||||||
|
<Textarea
|
||||||
|
id="urls"
|
||||||
|
value={urls}
|
||||||
|
onChange={(e) => setUrls(e.target.value)}
|
||||||
|
placeholder="https://example.com/story1
|
||||||
|
https://example.com/story2
|
||||||
|
https://example.com/story3"
|
||||||
|
rows={12}
|
||||||
|
disabled={isLoading}
|
||||||
|
/>
|
||||||
|
<p className="mt-2 text-sm theme-text">
|
||||||
|
URLs: {urls.split('\n').filter(url => url.trim().length > 0).length}
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="flex items-center">
|
||||||
|
<input
|
||||||
|
id="combine-into-one"
|
||||||
|
type="checkbox"
|
||||||
|
checked={combineIntoOne}
|
||||||
|
onChange={(e) => setCombineIntoOne(e.target.checked)}
|
||||||
|
className="h-4 w-4 theme-accent focus:ring-theme-accent theme-border rounded"
|
||||||
|
disabled={isLoading}
|
||||||
|
/>
|
||||||
|
<label htmlFor="combine-into-one" className="ml-2 block text-sm theme-text">
|
||||||
|
Combine all URL content into a single story
|
||||||
|
</label>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{combineIntoOne && (
|
||||||
|
<div className="bg-blue-50 dark:bg-blue-900/20 border border-blue-200 dark:border-blue-800 rounded-lg p-4">
|
||||||
|
<div className="text-sm text-blue-800 dark:text-blue-200">
|
||||||
|
<p className="font-medium mb-2">Combined Story Mode:</p>
|
||||||
|
<ul className="list-disc list-inside space-y-1 text-blue-700 dark:text-blue-300">
|
||||||
|
<li>All URLs will be scraped and their content combined into one story</li>
|
||||||
|
<li>Story title and author will be taken from the first URL</li>
|
||||||
|
<li>Import will fail if any URL has no content (title/author can be empty)</li>
|
||||||
|
<li>You'll be redirected to the story creation page to review and edit</li>
|
||||||
|
{urls.split('\n').filter(url => url.trim().length > 0).length > 50 && (
|
||||||
|
<li className="text-yellow-700 dark:text-yellow-300 font-medium">⚠️ Large imports (50+ URLs) may take several minutes and could be truncated if too large</li>
|
||||||
|
)}
|
||||||
|
</ul>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{error && (
|
||||||
|
<div className="bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-lg p-4">
|
||||||
|
<div className="flex">
|
||||||
|
<div className="ml-3">
|
||||||
|
<h3 className="text-sm font-medium text-red-800 dark:text-red-200">Error</h3>
|
||||||
|
<div className="mt-2 text-sm text-red-700 dark:text-red-300">
|
||||||
|
{error}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
<div className="flex gap-4">
|
||||||
|
<Button
|
||||||
|
type="submit"
|
||||||
|
disabled={isLoading || !urls.trim()}
|
||||||
|
loading={isLoading}
|
||||||
|
>
|
||||||
|
{isLoading ? 'Importing...' : 'Start Import'}
|
||||||
|
</Button>
|
||||||
|
|
||||||
|
<Button
|
||||||
|
type="button"
|
||||||
|
variant="secondary"
|
||||||
|
onClick={handleReset}
|
||||||
|
disabled={isLoading}
|
||||||
|
>
|
||||||
|
Clear
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Progress Component */}
|
||||||
|
{showProgress && sessionId && (
|
||||||
|
<BulkImportProgress
|
||||||
|
sessionId={sessionId}
|
||||||
|
onComplete={handleProgressComplete}
|
||||||
|
onError={handleProgressError}
|
||||||
|
combineMode={combineIntoOne}
|
||||||
|
/>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{/* Fallback loading indicator if progress isn't shown yet */}
|
||||||
|
{isLoading && !showProgress && (
|
||||||
|
<div className="bg-blue-50 dark:bg-blue-900/20 border border-blue-200 dark:border-blue-800 rounded-lg p-4">
|
||||||
|
<div className="flex items-center">
|
||||||
|
<div className="animate-spin rounded-full h-5 w-5 border-b-2 border-theme-accent mr-3"></div>
|
||||||
|
<div>
|
||||||
|
<p className="text-sm font-medium text-blue-800 dark:text-blue-200">Starting import...</p>
|
||||||
|
<p className="text-sm text-blue-600 dark:text-blue-300">
|
||||||
|
Preparing to process {urls.split('\n').filter(url => url.trim().length > 0).length} URLs.
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</form>
|
||||||
|
) : (
|
||||||
|
// Results
|
||||||
|
<div className="space-y-6">
|
||||||
|
{/* Summary */}
|
||||||
|
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||||
|
<h2 className="text-xl font-semibold theme-header mb-4">Import Summary</h2>
|
||||||
|
<div className="grid grid-cols-2 md:grid-cols-4 gap-4">
|
||||||
|
<div className="text-center">
|
||||||
|
<div className="text-2xl font-bold theme-header">{results.summary.total}</div>
|
||||||
|
<div className="text-sm theme-text">Total URLs</div>
|
||||||
|
</div>
|
||||||
|
<div className="text-center">
|
||||||
|
<div className="text-2xl font-bold text-green-600 dark:text-green-400">{results.summary.imported}</div>
|
||||||
|
<div className="text-sm theme-text">Imported</div>
|
||||||
|
</div>
|
||||||
|
<div className="text-center">
|
||||||
|
<div className="text-2xl font-bold text-yellow-600 dark:text-yellow-400">{results.summary.skipped}</div>
|
||||||
|
<div className="text-sm theme-text">Skipped</div>
|
||||||
|
</div>
|
||||||
|
<div className="text-center">
|
||||||
|
<div className="text-2xl font-bold text-red-600 dark:text-red-400">{results.summary.errors}</div>
|
||||||
|
<div className="text-sm theme-text">Errors</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Detailed Results */}
|
||||||
|
<div className="theme-card theme-shadow rounded-lg">
|
||||||
|
<div className="px-6 py-4 border-b theme-border">
|
||||||
|
<h3 className="text-lg font-medium theme-header">Detailed Results</h3>
|
||||||
|
</div>
|
||||||
|
<div className="divide-y theme-border">
|
||||||
|
{results.results.map((result, index) => (
|
||||||
|
<div key={index} className="p-6">
|
||||||
|
<div className="flex items-start justify-between">
|
||||||
|
<div className="flex-1 min-w-0">
|
||||||
|
<div className="flex items-center gap-2 mb-2">
|
||||||
|
<span className={`inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium border ${getStatusColor(result.status)}`}>
|
||||||
|
{getStatusIcon(result.status)} {result.status.charAt(0).toUpperCase() + result.status.slice(1)}
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<p className="text-sm theme-header font-medium truncate mb-1">
|
||||||
|
{result.url}
|
||||||
|
</p>
|
||||||
|
|
||||||
|
{result.title && result.author && (
|
||||||
|
<p className="text-sm theme-text mb-1">
|
||||||
|
"{result.title}" by {result.author}
|
||||||
|
</p>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{result.reason && (
|
||||||
|
<p className="text-sm theme-text">
|
||||||
|
{result.reason}
|
||||||
|
</p>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{result.error && (
|
||||||
|
<p className="text-sm text-red-600 dark:text-red-400">
|
||||||
|
Error: {result.error}
|
||||||
|
</p>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
))}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Actions */}
|
||||||
|
<div className="flex gap-4">
|
||||||
|
<Button onClick={handleReset}>
|
||||||
|
Import More URLs
|
||||||
|
</Button>
|
||||||
|
|
||||||
|
<Button
|
||||||
|
variant="secondary"
|
||||||
|
onClick={() => router.push('/library')}
|
||||||
|
>
|
||||||
|
View Stories
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</ImportLayout>
|
||||||
|
);
|
||||||
|
}
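The results list above calls `getStatusColor(result.status)` and `getStatusIcon(result.status)`, but those helpers are defined outside this excerpt. A hypothetical sketch of what they might look like for the three result statuses (`imported`, `skipped`, `error`); the class strings and icons are illustrative guesses, not the project's actual values:

```typescript
// Hypothetical sketch only: the real getStatusColor / getStatusIcon live
// elsewhere in the bulk-import page and may use different classes and icons.
type ImportStatus = 'imported' | 'skipped' | 'error';

function getStatusColor(status: ImportStatus): string {
  switch (status) {
    case 'imported':
      return 'bg-green-100 text-green-800 border-green-200';
    case 'skipped':
      return 'bg-yellow-100 text-yellow-800 border-yellow-200';
    default:
      return 'bg-red-100 text-red-800 border-red-200';
  }
}

function getStatusIcon(status: ImportStatus): string {
  switch (status) {
    case 'imported':
      return '✅';
    case 'skipped':
      return '⏭️';
    default:
      return '❌';
  }
}
```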
|
||||||
frontend/src/app/import/epub/page.tsx (new file, 409 lines)
@@ -0,0 +1,409 @@
'use client';
|
||||||
|
|
||||||
|
import { useState } from 'react';
|
||||||
|
import { useRouter } from 'next/navigation';
|
||||||
|
import { DocumentArrowUpIcon } from '@heroicons/react/24/outline';
|
||||||
|
import Button from '@/components/ui/Button';
|
||||||
|
import { Input } from '@/components/ui/Input';
|
||||||
|
import ImportLayout from '@/components/layout/ImportLayout';
|
||||||
|
|
||||||
|
interface EPUBImportResponse {
|
||||||
|
success: boolean;
|
||||||
|
message: string;
|
||||||
|
storyId?: string;
|
||||||
|
storyTitle?: string;
|
||||||
|
totalChapters?: number;
|
||||||
|
wordCount?: number;
|
||||||
|
readingPosition?: any;
|
||||||
|
warnings?: string[];
|
||||||
|
errors?: string[];
|
||||||
|
}
|
||||||
|
|
||||||
|
export default function EPUBImportPage() {
|
||||||
|
const router = useRouter();
|
||||||
|
const [selectedFile, setSelectedFile] = useState<File | null>(null);
|
||||||
|
const [isLoading, setIsLoading] = useState(false);
|
||||||
|
const [isValidating, setIsValidating] = useState(false);
|
||||||
|
const [validationResult, setValidationResult] = useState<any>(null);
|
||||||
|
const [importResult, setImportResult] = useState<EPUBImportResponse | null>(null);
|
||||||
|
const [error, setError] = useState<string | null>(null);
|
||||||
|
|
||||||
|
// Import options
|
||||||
|
const [authorName, setAuthorName] = useState<string>('');
|
||||||
|
const [seriesName, setSeriesName] = useState<string>('');
|
||||||
|
const [seriesVolume, setSeriesVolume] = useState<string>('');
|
||||||
|
const [tags, setTags] = useState<string>('');
|
||||||
|
const [preserveReadingPosition, setPreserveReadingPosition] = useState(true);
|
||||||
|
const [overwriteExisting, setOverwriteExisting] = useState(false);
|
||||||
|
const [createMissingAuthor, setCreateMissingAuthor] = useState(true);
|
||||||
|
const [createMissingSeries, setCreateMissingSeries] = useState(true);
|
||||||
|
|
||||||
|
const handleFileChange = async (e: React.ChangeEvent<HTMLInputElement>) => {
|
||||||
|
const file = e.target.files?.[0];
|
||||||
|
if (file) {
|
||||||
|
setSelectedFile(file);
|
||||||
|
setValidationResult(null);
|
||||||
|
setImportResult(null);
|
||||||
|
setError(null);
|
||||||
|
|
||||||
|
if (file.name.toLowerCase().endsWith('.epub')) {
|
||||||
|
await validateFile(file);
|
||||||
|
} else {
|
||||||
|
setError('Please select a valid EPUB file (.epub extension)');
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const validateFile = async (file: File) => {
|
||||||
|
setIsValidating(true);
|
||||||
|
try {
|
||||||
|
const token = localStorage.getItem('auth-token');
|
||||||
|
const formData = new FormData();
|
||||||
|
formData.append('file', file);
|
||||||
|
|
||||||
|
const response = await fetch('/api/stories/epub/validate', {
|
||||||
|
method: 'POST',
|
||||||
|
headers: {
|
||||||
|
'Authorization': token ? `Bearer ${token}` : '',
|
||||||
|
},
|
||||||
|
body: formData,
|
||||||
|
});
|
||||||
|
|
||||||
|
if (response.ok) {
|
||||||
|
const result = await response.json();
|
||||||
|
setValidationResult(result);
|
||||||
|
if (!result.valid) {
|
||||||
|
setError('EPUB file validation failed: ' + result.errors.join(', '));
|
||||||
|
}
|
||||||
|
} else if (response.status === 401 || response.status === 403) {
|
||||||
|
setError('Authentication required. Please log in.');
|
||||||
|
} else {
|
||||||
|
setError('Failed to validate EPUB file');
|
||||||
|
}
|
||||||
|
} catch (err) {
|
||||||
|
setError('Error validating EPUB file: ' + (err as Error).message);
|
||||||
|
} finally {
|
||||||
|
setIsValidating(false);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleSubmit = async (e: React.FormEvent) => {
|
||||||
|
e.preventDefault();
|
||||||
|
|
||||||
|
if (!selectedFile) {
|
||||||
|
setError('Please select an EPUB file');
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (validationResult && !validationResult.valid) {
|
||||||
|
setError('Cannot import invalid EPUB file');
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
setIsLoading(true);
|
||||||
|
setError(null);
|
||||||
|
|
||||||
|
try {
|
||||||
|
const token = localStorage.getItem('auth-token');
|
||||||
|
const formData = new FormData();
|
||||||
|
formData.append('file', selectedFile);
|
||||||
|
|
||||||
|
if (authorName) formData.append('authorName', authorName);
|
||||||
|
if (seriesName) formData.append('seriesName', seriesName);
|
||||||
|
if (seriesVolume) formData.append('seriesVolume', seriesVolume);
|
||||||
|
if (tags) {
|
||||||
|
const tagList = tags.split(',').map(t => t.trim()).filter(t => t.length > 0);
|
||||||
|
tagList.forEach(tag => formData.append('tags', tag));
|
||||||
|
}
|
||||||
|
|
||||||
|
formData.append('preserveReadingPosition', preserveReadingPosition.toString());
|
||||||
|
formData.append('overwriteExisting', overwriteExisting.toString());
|
||||||
|
formData.append('createMissingAuthor', createMissingAuthor.toString());
|
||||||
|
formData.append('createMissingSeries', createMissingSeries.toString());
|
||||||
|
|
||||||
|
const response = await fetch('/api/stories/epub/import', {
|
||||||
|
method: 'POST',
|
||||||
|
headers: {
|
||||||
|
'Authorization': token ? `Bearer ${token}` : '',
|
||||||
|
},
|
||||||
|
body: formData,
|
||||||
|
});
|
||||||
|
|
||||||
|
const result = await response.json();
|
||||||
|
|
||||||
|
if (response.ok && result.success) {
|
||||||
|
setImportResult(result);
|
||||||
|
} else if (response.status === 401 || response.status === 403) {
|
||||||
|
setError('Authentication required. Please log in.');
|
||||||
|
} else {
|
||||||
|
setError(result.message || 'Failed to import EPUB');
|
||||||
|
}
|
||||||
|
} catch (err) {
|
||||||
|
setError('Error importing EPUB: ' + (err as Error).message);
|
||||||
|
} finally {
|
||||||
|
setIsLoading(false);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const resetForm = () => {
|
||||||
|
setSelectedFile(null);
|
||||||
|
setValidationResult(null);
|
||||||
|
setImportResult(null);
|
||||||
|
setError(null);
|
||||||
|
setAuthorName('');
|
||||||
|
setSeriesName('');
|
||||||
|
setSeriesVolume('');
|
||||||
|
setTags('');
|
||||||
|
};
|
||||||
|
|
||||||
|
if (importResult?.success) {
|
||||||
|
return (
|
||||||
|
<ImportLayout
|
||||||
|
title="EPUB Import Successful"
|
||||||
|
description="Your EPUB has been successfully imported into StoryCove"
|
||||||
|
>
|
||||||
|
<div className="space-y-6">
|
||||||
|
<div className="bg-green-50 dark:bg-green-900/20 border border-green-200 dark:border-green-800 rounded-lg p-6">
|
||||||
|
<h2 className="text-xl font-semibold text-green-600 dark:text-green-400 mb-2">Import Completed</h2>
|
||||||
|
<p className="theme-text">
|
||||||
|
Your EPUB has been successfully imported into StoryCove.
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||||
|
<div className="space-y-4">
|
||||||
|
<div>
|
||||||
|
<span className="font-semibold theme-header">Story Title:</span>
|
||||||
|
<p className="theme-text">{importResult.storyTitle}</p>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{importResult.wordCount && (
|
||||||
|
<div>
|
||||||
|
<span className="font-semibold theme-header">Word Count:</span>
|
||||||
|
<p className="theme-text">{importResult.wordCount.toLocaleString()} words</p>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{importResult.totalChapters && (
|
||||||
|
<div>
|
||||||
|
<span className="font-semibold theme-header">Chapters:</span>
|
||||||
|
<p className="theme-text">{importResult.totalChapters}</p>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{importResult.warnings && importResult.warnings.length > 0 && (
|
||||||
|
<div className="bg-yellow-50 dark:bg-yellow-900/20 border border-yellow-200 dark:border-yellow-800 rounded-lg p-4">
|
||||||
|
<strong className="text-yellow-800 dark:text-yellow-200">Warnings:</strong>
|
||||||
|
<ul className="list-disc list-inside mt-2 text-yellow-700 dark:text-yellow-300">
|
||||||
|
{importResult.warnings.map((warning, index) => (
|
||||||
|
<li key={index}>{warning}</li>
|
||||||
|
))}
|
||||||
|
</ul>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
<div className="flex gap-4 mt-6">
|
||||||
|
<Button
|
||||||
|
onClick={() => router.push(`/stories/${importResult.storyId}`)}
|
||||||
|
>
|
||||||
|
View Story
|
||||||
|
</Button>
|
||||||
|
<Button
|
||||||
|
onClick={resetForm}
|
||||||
|
variant="secondary"
|
||||||
|
>
|
||||||
|
Import Another EPUB
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</ImportLayout>
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
return (
|
||||||
|
<ImportLayout
|
||||||
|
title="Import EPUB"
|
||||||
|
description="Upload an EPUB file to import it as a story into your library"
|
||||||
|
>
|
||||||
|
{error && (
|
||||||
|
<div className="bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-lg p-4 mb-6">
|
||||||
|
<p className="text-red-800 dark:text-red-200">{error}</p>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
<form onSubmit={handleSubmit} className="space-y-6">
|
||||||
|
{/* File Upload */}
|
||||||
|
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||||
|
<div className="mb-4">
|
||||||
|
<h3 className="text-lg font-semibold theme-header mb-2">Select EPUB File</h3>
|
||||||
|
<p className="theme-text">
|
||||||
|
Choose an EPUB file from your device to import.
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
<div className="space-y-4">
|
||||||
|
<div>
|
||||||
|
<label htmlFor="epub-file" className="block text-sm font-medium theme-header mb-1">EPUB File</label>
|
||||||
|
<Input
|
||||||
|
id="epub-file"
|
||||||
|
type="file"
|
||||||
|
accept=".epub,application/epub+zip"
|
||||||
|
onChange={handleFileChange}
|
||||||
|
disabled={isLoading || isValidating}
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{selectedFile && (
|
||||||
|
<div className="flex items-center gap-2">
|
||||||
|
<DocumentArrowUpIcon className="h-5 w-5 theme-text" />
|
||||||
|
<span className="text-sm theme-text">
|
||||||
|
{selectedFile.name} ({(selectedFile.size / 1024 / 1024).toFixed(2)} MB)
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{isValidating && (
|
||||||
|
<div className="text-sm theme-accent">
|
||||||
|
Validating EPUB file...
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{validationResult && (
|
||||||
|
<div className="text-sm">
|
||||||
|
{validationResult.valid ? (
|
||||||
|
<span className="inline-flex items-center px-2 py-1 rounded text-xs font-medium bg-green-100 dark:bg-green-900/20 text-green-800 dark:text-green-200">
|
||||||
|
Valid EPUB
|
||||||
|
</span>
|
||||||
|
) : (
|
||||||
|
<span className="inline-flex items-center px-2 py-1 rounded text-xs font-medium bg-red-100 dark:bg-red-900/20 text-red-800 dark:text-red-200">
|
||||||
|
Invalid EPUB
|
||||||
|
</span>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Import Options */}
|
||||||
|
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||||
|
<div className="mb-4">
|
||||||
|
<h3 className="text-lg font-semibold theme-header mb-2">Import Options</h3>
|
||||||
|
<p className="theme-text">
|
||||||
|
Configure how the EPUB should be imported.
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
<div className="space-y-4">
|
||||||
|
<div>
|
||||||
|
<label htmlFor="author-name" className="block text-sm font-medium theme-header mb-1">Author Name (Override)</label>
|
||||||
|
<Input
|
||||||
|
id="author-name"
|
||||||
|
value={authorName}
|
||||||
|
onChange={(e) => setAuthorName(e.target.value)}
|
||||||
|
placeholder="Leave empty to use EPUB metadata"
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div>
|
||||||
|
<label htmlFor="series-name" className="block text-sm font-medium theme-header mb-1">Series Name</label>
|
||||||
|
<Input
|
||||||
|
id="series-name"
|
||||||
|
value={seriesName}
|
||||||
|
onChange={(e) => setSeriesName(e.target.value)}
|
||||||
|
placeholder="Optional: Add to a series"
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{seriesName && (
|
||||||
|
<div>
|
||||||
|
<label htmlFor="series-volume" className="block text-sm font-medium theme-header mb-1">Series Volume</label>
|
||||||
|
<Input
|
||||||
|
id="series-volume"
|
||||||
|
type="number"
|
||||||
|
value={seriesVolume}
|
||||||
|
onChange={(e) => setSeriesVolume(e.target.value)}
|
||||||
|
placeholder="Volume number in series"
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
<div>
|
||||||
|
<label htmlFor="tags" className="block text-sm font-medium theme-header mb-1">Tags</label>
|
||||||
|
<Input
|
||||||
|
id="tags"
|
||||||
|
value={tags}
|
||||||
|
onChange={(e) => setTags(e.target.value)}
|
||||||
|
placeholder="Comma-separated tags (e.g., fantasy, adventure, romance)"
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="space-y-3">
|
||||||
|
<div className="flex items-center">
|
||||||
|
<input
|
||||||
|
type="checkbox"
|
||||||
|
id="preserve-reading-position"
|
||||||
|
checked={preserveReadingPosition}
|
||||||
|
onChange={(e) => setPreserveReadingPosition(e.target.checked)}
|
||||||
|
className="mr-2"
|
||||||
|
/>
|
||||||
|
<label htmlFor="preserve-reading-position" className="text-sm theme-text">
|
||||||
|
Preserve reading position from EPUB metadata
|
||||||
|
</label>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="flex items-center">
|
||||||
|
<input
|
||||||
|
type="checkbox"
|
||||||
|
id="create-missing-author"
|
||||||
|
checked={createMissingAuthor}
|
||||||
|
onChange={(e) => setCreateMissingAuthor(e.target.checked)}
|
||||||
|
className="mr-2"
|
||||||
|
/>
|
||||||
|
<label htmlFor="create-missing-author" className="text-sm theme-text">
|
||||||
|
Create author if not found
|
||||||
|
</label>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="flex items-center">
|
||||||
|
<input
|
||||||
|
type="checkbox"
|
||||||
|
id="create-missing-series"
|
||||||
|
checked={createMissingSeries}
|
||||||
|
onChange={(e) => setCreateMissingSeries(e.target.checked)}
|
||||||
|
className="mr-2"
|
||||||
|
/>
|
||||||
|
<label htmlFor="create-missing-series" className="text-sm theme-text">
|
||||||
|
Create series if not found
|
||||||
|
</label>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="flex items-center">
|
||||||
|
<input
|
||||||
|
type="checkbox"
|
||||||
|
id="overwrite-existing"
|
||||||
|
checked={overwriteExisting}
|
||||||
|
onChange={(e) => setOverwriteExisting(e.target.checked)}
|
||||||
|
className="mr-2"
|
||||||
|
/>
|
||||||
|
<label htmlFor="overwrite-existing" className="text-sm theme-text">
|
||||||
|
Overwrite existing story with same title and author
|
||||||
|
</label>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Submit Button */}
|
||||||
|
<div className="flex justify-end">
|
||||||
|
<Button
|
||||||
|
type="submit"
|
||||||
|
disabled={!selectedFile || isLoading || isValidating || (validationResult && !validationResult.valid)}
|
||||||
|
loading={isLoading}
|
||||||
|
>
|
||||||
|
{isLoading ? 'Importing...' : 'Import EPUB'}
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
</form>
|
||||||
|
</ImportLayout>
|
||||||
|
);
|
||||||
|
}
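The page above drives `/api/stories/epub/validate` and `/api/stories/epub/import` through form state. The same two-step flow can be exercised headlessly; a minimal sketch, using the endpoint paths and multipart field names shown in the component (token handling is an assumption for illustration):

```typescript
// Sketch of the validate-then-import flow used by the EPUB import page above.
async function importEpub(file: File, token: string) {
  const headers = { Authorization: `Bearer ${token}` };

  // Step 1: validate the EPUB before attempting the import.
  const validateForm = new FormData();
  validateForm.append('file', file);
  const validateRes = await fetch('/api/stories/epub/validate', {
    method: 'POST',
    headers,
    body: validateForm,
  });
  const validation = await validateRes.json();
  if (!validateRes.ok || !validation.valid) {
    throw new Error(`Validation failed: ${(validation.errors ?? []).join(', ')}`);
  }

  // Step 2: import with the same options the UI exposes.
  const importForm = new FormData();
  importForm.append('file', file);
  importForm.append('preserveReadingPosition', 'true');
  importForm.append('createMissingAuthor', 'true');
  importForm.append('createMissingSeries', 'true');
  importForm.append('overwriteExisting', 'false');
  const importRes = await fetch('/api/stories/epub/import', {
    method: 'POST',
    headers,
    body: importForm,
  });
  return importRes.json(); // shape matches EPUBImportResponse above
}
```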
|
||||||
frontend/src/app/import/page.tsx (new file, 113 lines)
@@ -0,0 +1,113 @@
'use client';
|
||||||
|
|
||||||
|
import { useState } from 'react';
|
||||||
|
import { useRouter } from 'next/navigation';
|
||||||
|
import ImportLayout from '../../components/layout/ImportLayout';
|
||||||
|
import { Input } from '../../components/ui/Input';
|
||||||
|
import Button from '../../components/ui/Button';
|
||||||
|
|
||||||
|
export default function ImportFromUrlPage() {
|
||||||
|
const [importUrl, setImportUrl] = useState('');
|
||||||
|
const [scraping, setScraping] = useState(false);
|
||||||
|
const [errors, setErrors] = useState<Record<string, string>>({});
|
||||||
|
|
||||||
|
const router = useRouter();
|
||||||
|
|
||||||
|
const handleImportFromUrl = async () => {
|
||||||
|
if (!importUrl.trim()) {
|
||||||
|
setErrors({ importUrl: 'URL is required' });
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
setScraping(true);
|
||||||
|
setErrors({});
|
||||||
|
|
||||||
|
try {
|
||||||
|
const response = await fetch('/scrape/story', {
|
||||||
|
method: 'POST',
|
||||||
|
headers: {
|
||||||
|
'Content-Type': 'application/json',
|
||||||
|
},
|
||||||
|
body: JSON.stringify({ url: importUrl }),
|
||||||
|
});
|
||||||
|
|
||||||
|
if (!response.ok) {
|
||||||
|
const errorData = await response.json();
|
||||||
|
throw new Error(errorData.error || 'Failed to scrape story');
|
||||||
|
}
|
||||||
|
|
||||||
|
const scrapedStory = await response.json();
|
||||||
|
|
||||||
|
// Redirect to add-story page with pre-filled data
|
||||||
|
const queryParams = new URLSearchParams({
|
||||||
|
from: 'url-import',
|
||||||
|
title: scrapedStory.title || '',
|
||||||
|
summary: scrapedStory.summary || '',
|
||||||
|
author: scrapedStory.author || '',
|
||||||
|
sourceUrl: scrapedStory.sourceUrl || importUrl,
|
||||||
|
tags: JSON.stringify(scrapedStory.tags || []),
|
||||||
|
content: scrapedStory.content || ''
|
||||||
|
});
|
||||||
|
|
||||||
|
router.push(`/add-story?${queryParams.toString()}`);
|
||||||
|
} catch (error: any) {
|
||||||
|
console.error('Failed to import story:', error);
|
||||||
|
setErrors({ importUrl: error.message });
|
||||||
|
} finally {
|
||||||
|
setScraping(false);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
return (
|
||||||
|
<ImportLayout
|
||||||
|
title="Import Story from URL"
|
||||||
|
description="Import a single story from a website"
|
||||||
|
>
|
||||||
|
<div className="space-y-6">
|
||||||
|
<div className="bg-gray-50 dark:bg-gray-800/50 rounded-lg p-6">
|
||||||
|
<h3 className="text-lg font-medium theme-header mb-4">Import Story from URL</h3>
|
||||||
|
<p className="theme-text text-sm mb-4">
|
||||||
|
Enter a URL from a supported story site to automatically extract the story content, title, author, and other metadata. After importing, you'll be able to review and edit the data before saving.
|
||||||
|
</p>
|
||||||
|
|
||||||
|
<div className="space-y-4">
|
||||||
|
<Input
|
||||||
|
label="Story URL"
|
||||||
|
type="url"
|
||||||
|
value={importUrl}
|
||||||
|
onChange={(e) => setImportUrl(e.target.value)}
|
||||||
|
placeholder="https://example.com/story-url"
|
||||||
|
error={errors.importUrl}
|
||||||
|
disabled={scraping}
|
||||||
|
/>
|
||||||
|
|
||||||
|
<div className="flex gap-3">
|
||||||
|
<Button
|
||||||
|
type="button"
|
||||||
|
onClick={handleImportFromUrl}
|
||||||
|
loading={scraping}
|
||||||
|
disabled={!importUrl.trim() || scraping}
|
||||||
|
>
|
||||||
|
{scraping ? 'Importing...' : 'Import Story'}
|
||||||
|
</Button>
|
||||||
|
|
||||||
|
<Button
|
||||||
|
type="button"
|
||||||
|
variant="ghost"
|
||||||
|
href="/add-story"
|
||||||
|
disabled={scraping}
|
||||||
|
>
|
||||||
|
Enter Manually Instead
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="text-xs theme-text">
|
||||||
|
<p className="font-medium mb-1">Supported Sites:</p>
|
||||||
|
<p>Archive of Our Own, DeviantArt, FanFiction.Net, Literotica, Royal Road, Wattpad, and more</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</ImportLayout>
|
||||||
|
);
|
||||||
|
}
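The handler above forwards the scraped fields to `/add-story` as query parameters. The receiving page is not part of this diff; a hedged sketch of how it might read that handoff back out with `useSearchParams` (field names match the `queryParams` built above, everything else is illustrative):

```typescript
import { useSearchParams } from 'next/navigation';

// Illustrative only: shows how /add-story could consume the query parameters
// built in handleImportFromUrl above. Must run inside a client component.
export function useScrapedStoryPrefill() {
  const params = useSearchParams();
  if (params.get('from') !== 'url-import') return null;

  return {
    title: params.get('title') ?? '',
    summary: params.get('summary') ?? '',
    author: params.get('author') ?? '',
    sourceUrl: params.get('sourceUrl') ?? '',
    tags: JSON.parse(params.get('tags') ?? '[]') as string[],
    content: params.get('content') ?? '',
  };
}
```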
|
||||||
@@ -1,122 +1,240 @@
|
|||||||
'use client';
|
'use client';
|
||||||
|
|
||||||
import { useState, useEffect } from 'react';
|
import { useState, useEffect } from 'react';
|
||||||
import { searchApi, tagApi } from '../../lib/api';
|
import { useRouter, useSearchParams } from 'next/navigation';
|
||||||
import { Story, Tag } from '../../types/api';
|
import { searchApi, storyApi, tagApi } from '../../lib/api';
|
||||||
|
import { Story, Tag, FacetCount, AdvancedFilters } from '../../types/api';
|
||||||
import AppLayout from '../../components/layout/AppLayout';
|
import AppLayout from '../../components/layout/AppLayout';
|
||||||
import { Input } from '../../components/ui/Input';
|
import { Input } from '../../components/ui/Input';
|
||||||
import Button from '../../components/ui/Button';
|
import Button from '../../components/ui/Button';
|
||||||
import StoryMultiSelect from '../../components/stories/StoryMultiSelect';
|
import StoryMultiSelect from '../../components/stories/StoryMultiSelect';
|
||||||
import TagFilter from '../../components/stories/TagFilter';
|
import TagFilter from '../../components/stories/TagFilter';
|
||||||
import LoadingSpinner from '../../components/ui/LoadingSpinner';
|
import LoadingSpinner from '../../components/ui/LoadingSpinner';
|
||||||
|
import SidebarLayout from '../../components/library/SidebarLayout';
|
||||||
|
import ToolbarLayout from '../../components/library/ToolbarLayout';
|
||||||
|
import MinimalLayout from '../../components/library/MinimalLayout';
|
||||||
|
import { useLibraryLayout } from '../../hooks/useLibraryLayout';
|
||||||
|
|
||||||
type ViewMode = 'grid' | 'list';
|
type ViewMode = 'grid' | 'list';
|
||||||
type SortOption = 'createdAt' | 'title' | 'authorName' | 'rating';
|
type SortOption = 'createdAt' | 'title' | 'authorName' | 'rating' | 'wordCount' | 'lastRead';
|
||||||
|
|
||||||
export default function LibraryPage() {
|
export default function LibraryPage() {
|
||||||
|
const router = useRouter();
|
||||||
|
const searchParams = useSearchParams();
|
||||||
|
const { layout } = useLibraryLayout();
|
||||||
const [stories, setStories] = useState<Story[]>([]);
|
const [stories, setStories] = useState<Story[]>([]);
|
||||||
const [tags, setTags] = useState<Tag[]>([]);
|
const [tags, setTags] = useState<Tag[]>([]);
|
||||||
const [loading, setLoading] = useState(false);
|
const [loading, setLoading] = useState(false);
|
||||||
|
const [searchLoading, setSearchLoading] = useState(false);
|
||||||
|
const [randomLoading, setRandomLoading] = useState(false);
|
||||||
const [searchQuery, setSearchQuery] = useState('');
|
const [searchQuery, setSearchQuery] = useState('');
|
||||||
const [selectedTags, setSelectedTags] = useState<string[]>([]);
|
const [selectedTags, setSelectedTags] = useState<string[]>([]);
|
||||||
const [viewMode, setViewMode] = useState<ViewMode>('list');
|
const [viewMode, setViewMode] = useState<ViewMode>('list');
|
||||||
const [sortOption, setSortOption] = useState<SortOption>('createdAt');
|
const [sortOption, setSortOption] = useState<SortOption>('lastRead');
|
||||||
const [sortDirection, setSortDirection] = useState<'asc' | 'desc'>('desc');
|
const [sortDirection, setSortDirection] = useState<'asc' | 'desc'>('desc');
|
||||||
const [page, setPage] = useState(0);
|
const [page, setPage] = useState(0);
|
||||||
const [totalPages, setTotalPages] = useState(1);
|
const [totalPages, setTotalPages] = useState(1);
|
||||||
const [totalElements, setTotalElements] = useState(0);
|
const [totalElements, setTotalElements] = useState(0);
|
||||||
const [refreshTrigger, setRefreshTrigger] = useState(0);
|
const [refreshTrigger, setRefreshTrigger] = useState(0);
|
||||||
|
const [urlParamsProcessed, setUrlParamsProcessed] = useState(false);
|
||||||
|
const [advancedFilters, setAdvancedFilters] = useState<AdvancedFilters>({});
|
||||||
|
|
||||||
|
// Initialize filters from URL parameters
|
||||||
// Load tags for filtering
|
|
||||||
useEffect(() => {
|
useEffect(() => {
|
||||||
const loadTags = async () => {
|
const tagsParam = searchParams.get('tags');
|
||||||
|
if (tagsParam) {
|
||||||
|
console.log('URL tag filter detected:', tagsParam);
|
||||||
|
// Use functional updates to ensure all state changes happen together
|
||||||
|
setSelectedTags([tagsParam]);
|
||||||
|
setPage(0); // Reset to first page when applying URL filter
|
||||||
|
}
|
||||||
|
setUrlParamsProcessed(true);
|
||||||
|
}, [searchParams]);
|
||||||
|
|
||||||
|
// Convert facet counts to Tag objects for the UI, enriched with full tag data
|
||||||
|
const [fullTags, setFullTags] = useState<Tag[]>([]);
|
||||||
|
|
||||||
|
// Fetch full tag data for enrichment
|
||||||
|
useEffect(() => {
|
||||||
|
const fetchFullTags = async () => {
|
||||||
try {
|
try {
|
||||||
const tagsResult = await tagApi.getTags({ page: 0, size: 1000 });
|
const result = await tagApi.getTags({ size: 1000 }); // Get all tags
|
||||||
setTags(tagsResult?.content || []);
|
setFullTags(result.content || []);
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
console.error('Failed to load tags:', error);
|
console.error('Failed to fetch full tag data:', error);
|
||||||
|
setFullTags([]);
|
||||||
}
|
}
|
||||||
};
|
};
|
||||||
|
|
||||||
loadTags();
|
fetchFullTags();
|
||||||
}, []);
|
}, []);
|
||||||
|
|
||||||
|
const convertFacetsToTags = (facets?: Record<string, FacetCount[]>): Tag[] => {
|
||||||
|
if (!facets || !facets.tagNames) {
|
||||||
|
return [];
|
||||||
|
}
|
||||||
|
|
||||||
|
return facets.tagNames.map(facet => {
|
||||||
|
// Find the full tag data by name
|
||||||
|
const fullTag = fullTags.find(tag => tag.name.toLowerCase() === facet.value.toLowerCase());
|
||||||
|
|
||||||
|
return {
|
||||||
|
id: fullTag?.id || facet.value, // Use actual ID if available, fallback to name
|
||||||
|
name: facet.value,
|
||||||
|
storyCount: facet.count,
|
||||||
|
// Include color and other metadata from the full tag data
|
||||||
|
color: fullTag?.color,
|
||||||
|
description: fullTag?.description,
|
||||||
|
aliasCount: fullTag?.aliasCount,
|
||||||
|
createdAt: fullTag?.createdAt,
|
||||||
|
aliases: fullTag?.aliases
|
||||||
|
};
|
||||||
|
});
|
||||||
|
};
|
||||||
|
|
||||||
|
// Enrich existing tags when fullTags are loaded
|
||||||
|
useEffect(() => {
|
||||||
|
if (fullTags.length > 0) {
|
||||||
|
// Use functional update to get the current tags state
|
||||||
|
setTags(currentTags => {
|
||||||
|
if (currentTags.length > 0) {
|
||||||
|
// Check if tags already have color data to avoid infinite loops
|
||||||
|
const hasColors = currentTags.some(tag => tag.color);
|
||||||
|
if (!hasColors) {
|
||||||
|
// Re-enrich existing tags with color data
|
||||||
|
return currentTags.map(tag => {
|
||||||
|
const fullTag = fullTags.find(ft => ft.name.toLowerCase() === tag.name.toLowerCase());
|
||||||
|
return {
|
||||||
|
...tag,
|
||||||
|
color: fullTag?.color,
|
||||||
|
description: fullTag?.description,
|
||||||
|
aliasCount: fullTag?.aliasCount,
|
||||||
|
createdAt: fullTag?.createdAt,
|
||||||
|
aliases: fullTag?.aliases,
|
||||||
|
id: fullTag?.id || tag.id
|
||||||
|
};
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return currentTags; // Return unchanged if no enrichment needed
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}, [fullTags]); // Only run when fullTags change
|
||||||
|
|
||||||
// Debounce search to avoid too many API calls
|
// Debounce search to avoid too many API calls
|
||||||
useEffect(() => {
|
useEffect(() => {
|
||||||
|
// Don't run search until URL parameters have been processed
|
||||||
|
if (!urlParamsProcessed) return;
|
||||||
|
|
||||||
const debounceTimer = setTimeout(() => {
|
const debounceTimer = setTimeout(() => {
|
||||||
const performSearch = async () => {
|
const performSearch = async () => {
|
||||||
try {
|
try {
|
||||||
|
// Use searchLoading for background search, loading only for initial load
|
||||||
|
const isInitialLoad = stories.length === 0 && !searchQuery;
|
||||||
|
if (isInitialLoad) {
|
||||||
setLoading(true);
|
setLoading(true);
|
||||||
|
} else {
|
||||||
|
setSearchLoading(true);
|
||||||
|
}
|
||||||
|
|
||||||
// Always use search API for consistency - use '*' for match-all when no query
|
// Always use search API for consistency - use '*' for match-all when no query
|
||||||
const result = await searchApi.search({
|
const apiParams = {
|
||||||
query: searchQuery.trim() || '*',
|
query: searchQuery.trim() || '*',
|
||||||
page: page, // Use 0-based pagination consistently
|
page: page, // Use 0-based pagination consistently
|
||||||
size: 20,
|
size: 20,
|
||||||
tags: selectedTags.length > 0 ? selectedTags : undefined,
|
tags: selectedTags.length > 0 ? selectedTags : undefined,
|
||||||
sortBy: sortOption,
|
sortBy: sortOption,
|
||||||
sortDir: sortDirection,
|
sortDir: sortDirection,
|
||||||
});
|
facetBy: ['tagNames'], // Request tag facets for the filter UI
|
||||||
|
// Advanced filters
|
||||||
|
...advancedFilters
|
||||||
|
};
|
||||||
|
|
||||||
setStories(result?.results || []);
|
console.log('Performing search with params:', apiParams);
|
||||||
|
const result = await searchApi.search(apiParams);
|
||||||
|
|
||||||
|
const currentStories = result?.results || [];
|
||||||
|
setStories(currentStories);
|
||||||
setTotalPages(Math.ceil((result?.totalHits || 0) / 20));
|
setTotalPages(Math.ceil((result?.totalHits || 0) / 20));
|
||||||
setTotalElements(result?.totalHits || 0);
|
setTotalElements(result?.totalHits || 0);
|
||||||
|
|
||||||
|
// Update tags from facets - these represent all matching stories, not just current page
|
||||||
|
const resultTags = convertFacetsToTags(result?.facets);
|
||||||
|
setTags(resultTags);
|
||||||
|
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
console.error('Failed to load stories:', error);
|
console.error('Failed to load stories:', error);
|
||||||
setStories([]);
|
setStories([]);
|
||||||
|
setTags([]);
|
||||||
} finally {
|
} finally {
|
||||||
setLoading(false);
|
setLoading(false);
|
||||||
|
setSearchLoading(false);
|
||||||
}
|
}
|
||||||
};
|
};
|
||||||
|
|
||||||
performSearch();
|
performSearch();
|
||||||
}, searchQuery ? 300 : 0); // Debounce search, but not other changes
|
}, searchQuery ? 500 : 0); // Debounce search queries, but load immediately for filters/pagination
|
||||||
|
|
||||||
return () => clearTimeout(debounceTimer);
|
return () => clearTimeout(debounceTimer);
|
||||||
}, [searchQuery, selectedTags, page, sortOption, sortDirection, refreshTrigger]);
|
}, [searchQuery, selectedTags, sortOption, sortDirection, page, refreshTrigger, urlParamsProcessed, advancedFilters]);
|
||||||
|
|
||||||
// Reset page when search or filters change
|
|
||||||
const resetPage = () => {
|
|
||||||
if (page !== 0) {
|
|
||||||
setPage(0);
|
|
||||||
}
|
|
||||||
};
|
|
||||||
|
|
||||||
const handleTagToggle = (tagName: string) => {
|
|
||||||
setSelectedTags(prev => {
|
|
||||||
const newTags = prev.includes(tagName)
|
|
||||||
? prev.filter(t => t !== tagName)
|
|
||||||
: [...prev, tagName];
|
|
||||||
resetPage();
|
|
||||||
return newTags;
|
|
||||||
});
|
|
||||||
};
|
|
||||||
|
|
||||||
const handleSearchChange = (e: React.ChangeEvent<HTMLInputElement>) => {
|
const handleSearchChange = (e: React.ChangeEvent<HTMLInputElement>) => {
|
||||||
setSearchQuery(e.target.value);
|
setSearchQuery(e.target.value);
|
||||||
resetPage();
|
setPage(0);
|
||||||
};
|
};
|
||||||
|
|
||||||
const handleSortChange = (newSortOption: SortOption) => {
|
const handleStoryUpdate = () => {
|
||||||
if (newSortOption === sortOption) {
|
setRefreshTrigger(prev => prev + 1);
|
||||||
// Toggle direction if same option
|
};
|
||||||
setSortDirection(prev => prev === 'asc' ? 'desc' : 'asc');
|
|
||||||
|
const handleRandomStory = async () => {
|
||||||
|
if (totalElements === 0) return;
|
||||||
|
|
||||||
|
try {
|
||||||
|
setRandomLoading(true);
|
||||||
|
const randomStory = await storyApi.getRandomStory({
|
||||||
|
searchQuery: searchQuery || undefined,
|
||||||
|
tags: selectedTags.length > 0 ? selectedTags : undefined,
|
||||||
|
...advancedFilters
|
||||||
|
});
|
||||||
|
if (randomStory) {
|
||||||
|
router.push(`/stories/${randomStory.id}`);
|
||||||
} else {
|
} else {
|
||||||
setSortOption(newSortOption);
|
alert('No stories available. Please add some stories first.');
|
||||||
setSortDirection('desc'); // Default to desc for new sort option
|
}
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Failed to get random story:', error);
|
||||||
|
alert('Failed to get a random story. Please try again.');
|
||||||
|
} finally {
|
||||||
|
setRandomLoading(false);
|
||||||
}
|
}
|
||||||
resetPage();
|
|
||||||
};
|
};
|
||||||
|
|
||||||
const clearFilters = () => {
|
const clearFilters = () => {
|
||||||
setSearchQuery('');
|
setSearchQuery('');
|
||||||
setSelectedTags([]);
|
setSelectedTags([]);
|
||||||
resetPage();
|
setAdvancedFilters({});
|
||||||
|
setPage(0);
|
||||||
|
setRefreshTrigger(prev => prev + 1);
|
||||||
};
|
};
|
||||||
|
|
||||||
const handleStoryUpdate = () => {
|
const handleTagToggle = (tagName: string) => {
|
||||||
// Trigger reload by incrementing refresh trigger
|
setSelectedTags(prev =>
|
||||||
|
prev.includes(tagName)
|
||||||
|
? prev.filter(t => t !== tagName)
|
||||||
|
: [...prev, tagName]
|
||||||
|
);
|
||||||
|
setPage(0);
|
||||||
|
setRefreshTrigger(prev => prev + 1);
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleSortDirectionToggle = () => {
|
||||||
|
setSortDirection(prev => prev === 'asc' ? 'desc' : 'asc');
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleAdvancedFiltersChange = (filters: AdvancedFilters) => {
|
||||||
|
setAdvancedFilters(filters);
|
||||||
|
setPage(0);
|
||||||
setRefreshTrigger(prev => prev + 1);
|
setRefreshTrigger(prev => prev + 1);
|
||||||
};
|
};
|
||||||
|
|
||||||
@@ -130,108 +248,41 @@ export default function LibraryPage() {
|
|||||||
);
|
);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
const handleSortChange = (option: string) => {
|
||||||
|
setSortOption(option as SortOption);
|
||||||
|
};
|
||||||
|
|
||||||
|
const layoutProps = {
|
||||||
|
stories,
|
||||||
|
tags,
|
||||||
|
totalElements,
|
||||||
|
searchQuery,
|
||||||
|
selectedTags,
|
||||||
|
viewMode,
|
||||||
|
sortOption,
|
||||||
|
sortDirection,
|
||||||
|
advancedFilters,
|
||||||
|
onSearchChange: handleSearchChange,
|
||||||
|
onTagToggle: handleTagToggle,
|
||||||
|
onViewModeChange: setViewMode,
|
||||||
|
onSortChange: handleSortChange,
|
||||||
|
onSortDirectionToggle: handleSortDirectionToggle,
|
||||||
|
onAdvancedFiltersChange: handleAdvancedFiltersChange,
|
||||||
|
onRandomStory: handleRandomStory,
|
||||||
|
onClearFilters: clearFilters,
|
||||||
|
};
|
||||||
|
|
||||||
|
const renderContent = () => {
|
||||||
|
if (stories.length === 0 && !loading) {
|
||||||
return (
|
return (
|
||||||
<AppLayout>
|
<div className="text-center py-12 theme-card theme-shadow rounded-lg">
|
||||||
<div className="space-y-6">
|
<p className="theme-text text-lg mb-4">
|
||||||
{/* Header */}
|
{searchQuery || selectedTags.length > 0 || Object.values(advancedFilters).some(v => v !== undefined && v !== '' && v !== 'all' && v !== false)
|
||||||
<div className="flex flex-col sm:flex-row sm:items-center sm:justify-between gap-4">
|
? 'No stories match your search criteria.'
|
||||||
<div>
|
: 'Your library is empty.'
|
||||||
<h1 className="text-3xl font-bold theme-header">Your Story Library</h1>
|
|
||||||
<p className="theme-text mt-1">
|
|
||||||
{totalElements} {totalElements === 1 ? 'story' : 'stories'}
|
|
||||||
{searchQuery || selectedTags.length > 0 ? ` found` : ` total`}
|
|
||||||
</p>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
<Button href="/add-story">
|
|
||||||
Add New Story
|
|
||||||
</Button>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
{/* Search and Filters */}
|
|
||||||
<div className="space-y-4">
|
|
||||||
{/* Search Bar */}
|
|
||||||
<div className="flex flex-col sm:flex-row gap-4">
|
|
||||||
<div className="flex-1">
|
|
||||||
<Input
|
|
||||||
type="search"
|
|
||||||
placeholder="Search by title, author, or tags..."
|
|
||||||
value={searchQuery}
|
|
||||||
onChange={handleSearchChange}
|
|
||||||
className="w-full"
|
|
||||||
/>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
{/* View Mode Toggle */}
|
|
||||||
<div className="flex items-center gap-2">
|
|
||||||
<button
|
|
||||||
onClick={() => setViewMode('grid')}
|
|
||||||
className={`p-2 rounded-lg transition-colors ${
|
|
||||||
viewMode === 'grid'
|
|
||||||
? 'theme-accent-bg text-white'
|
|
||||||
: 'theme-card theme-text hover:bg-opacity-80'
|
|
||||||
}`}
|
|
||||||
aria-label="Grid view"
|
|
||||||
>
|
|
||||||
⊞
|
|
||||||
</button>
|
|
||||||
<button
|
|
||||||
onClick={() => setViewMode('list')}
|
|
||||||
className={`p-2 rounded-lg transition-colors ${
|
|
||||||
viewMode === 'list'
|
|
||||||
? 'theme-accent-bg text-white'
|
|
||||||
: 'theme-card theme-text hover:bg-opacity-80'
|
|
||||||
}`}
|
|
||||||
aria-label="List view"
|
|
||||||
>
|
|
||||||
☰
|
|
||||||
</button>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
{/* Sort and Tag Filters */}
|
|
||||||
<div className="flex flex-col sm:flex-row gap-4">
|
|
||||||
{/* Sort Options */}
|
|
||||||
<div className="flex items-center gap-2">
|
|
||||||
<label className="theme-text font-medium text-sm">Sort by:</label>
|
|
||||||
<select
|
|
||||||
value={sortOption}
|
|
||||||
onChange={(e) => handleSortChange(e.target.value as SortOption)}
|
|
||||||
className="px-3 py-1 rounded-lg theme-card theme-text theme-border border focus:outline-none focus:ring-2 focus:ring-theme-accent"
|
|
||||||
>
|
|
||||||
<option value="createdAt">Date Added</option>
|
|
||||||
<option value="title">Title</option>
|
|
||||||
<option value="authorName">Author</option>
|
|
||||||
<option value="rating">Rating</option>
|
|
||||||
</select>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
{/* Clear Filters */}
|
|
||||||
{(searchQuery || selectedTags.length > 0) && (
|
|
||||||
<Button variant="ghost" size="sm" onClick={clearFilters}>
|
|
||||||
Clear Filters
|
|
||||||
</Button>
|
|
||||||
)}
|
|
||||||
</div>
|
|
||||||
|
|
||||||
{/* Tag Filter */}
|
|
||||||
<TagFilter
|
|
||||||
tags={tags}
|
|
||||||
selectedTags={selectedTags}
|
|
||||||
onTagToggle={handleTagToggle}
|
|
||||||
/>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
{/* Stories Display */}
|
|
||||||
{stories.length === 0 && !loading ? (
|
|
||||||
<div className="text-center py-20">
|
|
||||||
<div className="theme-text text-lg mb-4">
|
|
||||||
{searchQuery || selectedTags.length > 0
|
|
||||||
? 'No stories match your filters'
|
|
||||||
: 'No stories in your library yet'
|
|
||||||
}
|
}
|
||||||
</div>
|
</p>
|
||||||
{searchQuery || selectedTags.length > 0 ? (
|
{searchQuery || selectedTags.length > 0 || Object.values(advancedFilters).some(v => v !== undefined && v !== '' && v !== 'all' && v !== false) ? (
|
||||||
<Button variant="ghost" onClick={clearFilters}>
|
<Button variant="ghost" onClick={clearFilters}>
|
||||||
Clear Filters
|
Clear Filters
|
||||||
</Button>
|
</Button>
|
||||||
@@ -241,14 +292,17 @@ export default function LibraryPage() {
|
|||||||
</Button>
|
</Button>
|
||||||
)}
|
)}
|
||||||
</div>
|
</div>
|
||||||
) : (
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
return (
|
||||||
|
<>
|
||||||
<StoryMultiSelect
|
<StoryMultiSelect
|
||||||
stories={stories}
|
stories={stories}
|
||||||
viewMode={viewMode}
|
viewMode={viewMode}
|
||||||
onUpdate={handleStoryUpdate}
|
onUpdate={handleStoryUpdate}
|
||||||
allowMultiSelect={true}
|
allowMultiSelect={true}
|
||||||
/>
|
/>
|
||||||
)}
|
|
||||||
|
|
||||||
{/* Pagination */}
|
{/* Pagination */}
|
||||||
{totalPages > 1 && (
|
{totalPages > 1 && (
|
||||||
@@ -274,7 +328,19 @@ export default function LibraryPage() {
|
|||||||
</Button>
|
</Button>
|
||||||
</div>
|
</div>
|
||||||
)}
|
)}
|
||||||
</div>
|
</>
|
||||||
|
);
|
||||||
|
};
|
||||||
|
|
||||||
|
const LayoutComponent = layout === 'sidebar' ? SidebarLayout :
|
||||||
|
layout === 'toolbar' ? ToolbarLayout :
|
||||||
|
MinimalLayout;
|
||||||
|
|
||||||
|
return (
|
||||||
|
<AppLayout>
|
||||||
|
<LayoutComponent {...layoutProps}>
|
||||||
|
{renderContent()}
|
||||||
|
</LayoutComponent>
|
||||||
</AppLayout>
|
</AppLayout>
|
||||||
);
|
);
|
||||||
}
|
}
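The rewritten library page picks `SidebarLayout`, `ToolbarLayout`, or `MinimalLayout` based on `useLibraryLayout()`, whose implementation is outside this diff. A hypothetical sketch of such a hook, assuming the preference is persisted to `localStorage` (the storage key and default are guesses, not the project's actual code):

```typescript
import { useEffect, useState } from 'react';

// Hypothetical implementation of the useLibraryLayout hook imported by the
// library page; the real hook may store or expose things differently.
export type LibraryLayout = 'sidebar' | 'toolbar' | 'minimal';

const STORAGE_KEY = 'library-layout'; // assumed key

export function useLibraryLayout() {
  const [layout, setLayoutState] = useState<LibraryLayout>('sidebar');

  // Load the saved preference on mount (client-side only).
  useEffect(() => {
    const saved = localStorage.getItem(STORAGE_KEY);
    if (saved === 'sidebar' || saved === 'toolbar' || saved === 'minimal') {
      setLayoutState(saved);
    }
  }, []);

  const setLayout = (next: LibraryLayout) => {
    setLayoutState(next);
    localStorage.setItem(STORAGE_KEY, next);
  };

  return { layout, setLayout };
}
```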
|
||||||
frontend/src/app/scrape/author/route.ts (new file, 72 lines)
@@ -0,0 +1,72 @@
import { NextRequest, NextResponse } from 'next/server';
|
||||||
|
|
||||||
|
export async function POST(request: NextRequest) {
|
||||||
|
try {
|
||||||
|
const body = await request.json();
|
||||||
|
const { url } = body;
|
||||||
|
|
||||||
|
if (!url || typeof url !== 'string') {
|
||||||
|
return NextResponse.json(
|
||||||
|
{ error: 'URL is required and must be a string' },
|
||||||
|
{ status: 400 }
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Dynamic import to prevent client-side bundling
|
||||||
|
const { StoryScraper } = await import('@/lib/scraper/scraper');
|
||||||
|
|
||||||
|
const scraper = new StoryScraper();
|
||||||
|
const stories = await scraper.scrapeAuthorPage(url);
|
||||||
|
|
||||||
|
return NextResponse.json({ stories });
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Author page scraping error:', error);
|
||||||
|
|
||||||
|
// Check if it's a ScraperError without importing at module level
|
||||||
|
if (error && typeof error === 'object' && error.constructor.name === 'ScraperError') {
|
||||||
|
return NextResponse.json(
|
||||||
|
{
|
||||||
|
error: (error as any).message,
|
||||||
|
url: (error as any).url
|
||||||
|
},
|
||||||
|
{ status: 400 }
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (error instanceof Error) {
|
||||||
|
// Handle specific error types
|
||||||
|
if (error.message.includes('Invalid URL')) {
|
||||||
|
return NextResponse.json(
|
||||||
|
{ error: 'Invalid URL provided' },
|
||||||
|
{ status: 400 }
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (error.message.includes('not supported')) {
|
||||||
|
return NextResponse.json(
|
||||||
|
{ error: 'Author page scraping is not supported for this website' },
|
||||||
|
{ status: 400 }
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (error.message.includes('HTTP 404')) {
|
||||||
|
return NextResponse.json(
|
||||||
|
{ error: 'Author page not found at the provided URL' },
|
||||||
|
{ status: 404 }
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (error.message.includes('timeout')) {
|
||||||
|
return NextResponse.json(
|
||||||
|
{ error: 'Request timed out while fetching content' },
|
||||||
|
{ status: 408 }
|
||||||
|
);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return NextResponse.json(
|
||||||
|
{ error: 'Failed to scrape author page. Please try again.' },
|
||||||
|
{ status: 500 }
|
||||||
|
);
|
||||||
|
}
|
||||||
|
}
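A short usage sketch for the route above: posting an author page URL from the browser and surfacing the structured error responses it returns (the shape of the `stories` payload comes from `scrapeAuthorPage`, which is not shown in this diff):

```typescript
// Client-side call into the /scrape/author route defined above.
async function fetchAuthorStories(url: string) {
  const res = await fetch('/scrape/author', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ url }),
  });

  const data = await res.json();
  if (!res.ok) {
    // The route maps scraper failures to 400/404/408/500 with an `error` field.
    throw new Error(data.error ?? `Author scrape failed (HTTP ${res.status})`);
  }
  return data.stories; // scraped story entries for the author page
}
```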
|
||||||
frontend/src/app/scrape/bulk/progress/route.ts (new file, 93 lines)
@@ -0,0 +1,93 @@
import { NextRequest } from 'next/server';
|
||||||
|
|
||||||
|
// Configure route timeout for long-running progress streams
|
||||||
|
export const maxDuration = 900; // 15 minutes (900 seconds)
|
||||||
|
|
||||||
|
interface ProgressUpdate {
|
||||||
|
type: 'progress' | 'completed' | 'error';
|
||||||
|
current: number;
|
||||||
|
total: number;
|
||||||
|
message: string;
|
||||||
|
url?: string;
|
||||||
|
title?: string;
|
||||||
|
author?: string;
|
||||||
|
wordCount?: number;
|
||||||
|
totalWordCount?: number;
|
||||||
|
error?: string;
|
||||||
|
combinedStory?: any;
|
||||||
|
results?: any[];
|
||||||
|
summary?: any;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Global progress storage (in production, use Redis or database)
|
||||||
|
const progressStore = new Map<string, ProgressUpdate[]>();
|
||||||
|
|
||||||
|
export async function GET(request: NextRequest) {
|
||||||
|
const searchParams = request.nextUrl.searchParams;
|
||||||
|
const sessionId = searchParams.get('sessionId');
|
||||||
|
|
||||||
|
if (!sessionId) {
|
||||||
|
return new Response('Session ID required', { status: 400 });
|
||||||
|
}
|
||||||
|
|
||||||
|
// Set up Server-Sent Events
|
||||||
|
const stream = new ReadableStream({
|
||||||
|
start(controller) {
|
||||||
|
const encoder = new TextEncoder();
|
||||||
|
|
||||||
|
// Send initial connection message
|
||||||
|
const data = `data: ${JSON.stringify({ type: 'connected', sessionId })}\n\n`;
|
||||||
|
controller.enqueue(encoder.encode(data));
|
||||||
|
|
||||||
|
// Check for progress updates every 500ms
|
||||||
|
const interval = setInterval(() => {
|
||||||
|
const updates = progressStore.get(sessionId);
|
||||||
|
if (updates && updates.length > 0) {
|
||||||
|
// Send all pending updates
|
||||||
|
updates.forEach(update => {
|
||||||
|
const data = `data: ${JSON.stringify(update)}\n\n`;
|
||||||
|
controller.enqueue(encoder.encode(data));
|
||||||
|
});
|
||||||
|
|
||||||
|
// Clear sent updates
|
||||||
|
progressStore.delete(sessionId);
|
||||||
|
|
||||||
|
// If this was a completion or error, close the stream
|
||||||
|
const lastUpdate = updates[updates.length - 1];
|
||||||
|
if (lastUpdate.type === 'completed' || lastUpdate.type === 'error') {
|
||||||
|
clearInterval(interval);
|
||||||
|
controller.close();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}, 500);
|
||||||
|
|
||||||
|
// Cleanup after timeout
|
||||||
|
setTimeout(() => {
|
||||||
|
clearInterval(interval);
|
||||||
|
progressStore.delete(sessionId);
|
||||||
|
controller.close();
|
||||||
|
}, 900000); // 15 minutes
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
return new Response(stream, {
|
||||||
|
headers: {
|
||||||
|
'Content-Type': 'text/event-stream',
|
||||||
|
'Cache-Control': 'no-cache',
|
||||||
|
'Connection': 'keep-alive',
|
||||||
|
'Access-Control-Allow-Origin': '*',
|
||||||
|
'Access-Control-Allow-Headers': 'Cache-Control',
|
||||||
|
},
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
// Helper function for other routes to send progress updates
|
||||||
|
export function sendProgressUpdate(sessionId: string, update: ProgressUpdate) {
|
||||||
|
if (!progressStore.has(sessionId)) {
|
||||||
|
progressStore.set(sessionId, []);
|
||||||
|
}
|
||||||
|
progressStore.get(sessionId)!.push(update);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Export the helper for other modules to use
|
||||||
|
export { progressStore };
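The GET handler above streams `ProgressUpdate` objects as Server-Sent Events and closes the stream after a `completed` or `error` update. A minimal browser-side consumer matching that behavior (the callback wiring is illustrative; the `BulkImportProgress` component presumably does something similar):

```typescript
// Consumes the SSE stream from /scrape/bulk/progress?sessionId=... defined above.
function watchBulkImportProgress(
  sessionId: string,
  onUpdate: (update: any) => void, // ProgressUpdate as defined in the route
): () => void {
  const source = new EventSource(
    `/scrape/bulk/progress?sessionId=${encodeURIComponent(sessionId)}`,
  );

  source.onmessage = (event) => {
    const update = JSON.parse(event.data);
    onUpdate(update); // includes the initial { type: 'connected' } message
    // The server closes the stream after these, so release the connection too.
    if (update.type === 'completed' || update.type === 'error') {
      source.close();
    }
  };

  source.onerror = () => source.close();

  // Return a cleanup function suitable for a React effect.
  return () => source.close();
}
```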
|
||||||
frontend/src/app/scrape/bulk/route.ts (new file, 564 lines)
@@ -0,0 +1,564 @@
import { NextRequest, NextResponse } from 'next/server';
|
||||||
|
|
||||||
|
// Configure route timeout for long-running scraping operations
|
||||||
|
export const maxDuration = 900; // 15 minutes (900 seconds)
|
||||||
|
|
||||||
|
// Import progress tracking helper
|
||||||
|
async function sendProgressUpdate(sessionId: string, update: any) {
|
||||||
|
try {
|
||||||
|
// Dynamic import to avoid circular dependency
|
||||||
|
const { sendProgressUpdate: sendUpdate } = await import('./progress/route');
|
||||||
|
sendUpdate(sessionId, update);
|
||||||
|
} catch (error) {
|
||||||
|
console.warn('Failed to send progress update:', error);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
interface BulkImportRequest {
|
||||||
|
urls: string[];
|
||||||
|
combineIntoOne?: boolean;
|
||||||
|
sessionId?: string; // For progress tracking
|
||||||
|
}
|
||||||
|
|
||||||
|
interface ImportResult {
|
||||||
|
url: string;
|
||||||
|
status: 'imported' | 'skipped' | 'error';
|
||||||
|
reason?: string;
|
||||||
|
title?: string;
|
||||||
|
author?: string;
|
||||||
|
error?: string;
|
||||||
|
storyId?: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface BulkImportResponse {
|
||||||
|
results: ImportResult[];
|
||||||
|
summary: {
|
||||||
|
total: number;
|
||||||
|
imported: number;
|
||||||
|
skipped: number;
|
||||||
|
errors: number;
|
||||||
|
};
|
||||||
|
combinedStory?: {
|
||||||
|
title: string;
|
||||||
|
author: string;
|
||||||
|
content: string;
|
||||||
|
summary?: string;
|
||||||
|
sourceUrl: string;
|
||||||
|
tags?: string[];
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
// Background processing function for combined mode
|
||||||
|
async function processCombinedMode(
|
||||||
|
urls: string[],
|
||||||
|
sessionId: string,
|
||||||
|
authorization: string,
|
||||||
|
scraper: any
|
||||||
|
) {
|
||||||
|
const results: ImportResult[] = [];
|
||||||
|
let importedCount = 0;
|
||||||
|
let errorCount = 0;
|
||||||
|
|
||||||
|
const combinedContent: string[] = [];
|
||||||
|
let baseTitle = '';
|
||||||
|
let baseAuthor = '';
|
||||||
|
let baseSummary = '';
|
||||||
|
let baseSourceUrl = '';
|
||||||
|
const combinedTags = new Set<string>();
|
||||||
|
let totalWordCount = 0;
|
||||||
|
|
||||||
|
// Send initial progress update
|
||||||
|
await sendProgressUpdate(sessionId, {
|
||||||
|
type: 'progress',
|
||||||
|
current: 0,
|
||||||
|
total: urls.length,
|
||||||
|
message: `Starting to scrape ${urls.length} URLs for combining...`,
|
||||||
|
totalWordCount: 0
|
||||||
|
});
|
||||||
|
|
||||||
|
for (let i = 0; i < urls.length; i++) {
|
||||||
|
const url = urls[i];
|
||||||
|
console.log(`Scraping URL ${i + 1}/${urls.length} for combine: ${url}`);
|
||||||
|
|
||||||
|
// Send progress update
|
||||||
|
await sendProgressUpdate(sessionId, {
|
||||||
|
type: 'progress',
|
||||||
|
current: i,
|
||||||
|
total: urls.length,
|
||||||
|
message: `Scraping URL ${i + 1} of ${urls.length}...`,
|
||||||
|
url: url,
|
||||||
|
totalWordCount
|
||||||
|
});
|
||||||
|
|
||||||
|
try {
|
||||||
|
const trimmedUrl = url.trim();
|
||||||
|
if (!trimmedUrl) {
|
||||||
|
results.push({
|
||||||
|
url: url || 'Empty URL',
|
||||||
|
status: 'error',
|
||||||
|
error: 'Empty URL in combined mode'
|
||||||
|
});
|
||||||
|
errorCount++;
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
const scrapedStory = await scraper.scrapeStory(trimmedUrl);
|
||||||
|
|
||||||
|
// Check if we got content - this is required for combined mode
|
||||||
|
if (!scrapedStory.content || scrapedStory.content.trim() === '') {
|
||||||
|
results.push({
|
||||||
|
url: trimmedUrl,
|
||||||
|
status: 'error',
|
||||||
|
error: 'No content found - required for combined mode'
|
||||||
|
});
|
||||||
|
errorCount++;
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Use first URL for base metadata (title can be empty for combined mode)
|
||||||
|
if (i === 0) {
|
||||||
|
baseTitle = scrapedStory.title || 'Combined Story';
|
||||||
|
baseAuthor = scrapedStory.author || 'Unknown Author';
|
||||||
|
baseSummary = scrapedStory.summary || '';
|
||||||
|
baseSourceUrl = trimmedUrl;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Add content with URL separator
|
||||||
|
combinedContent.push(`<!-- Content from: ${trimmedUrl} -->`);
|
||||||
|
if (scrapedStory.title && i > 0) {
|
||||||
|
combinedContent.push(`<h2>${scrapedStory.title}</h2>`);
|
||||||
|
}
|
||||||
|
combinedContent.push(scrapedStory.content);
|
||||||
|
combinedContent.push('<hr/>'); // Visual separator between parts
|
||||||
|
|
||||||
|
// Calculate word count for this story
|
||||||
|
const textContent = scrapedStory.content.replace(/<[^>]*>/g, ''); // Strip HTML
|
||||||
|
const wordCount = textContent.split(/\s+/).filter((word: string) => word.length > 0).length;
|
||||||
|
totalWordCount += wordCount;
|
||||||
|
|
||||||
|
// Collect tags from all stories
|
||||||
|
if (scrapedStory.tags) {
|
||||||
|
scrapedStory.tags.forEach((tag: string) => combinedTags.add(tag));
|
||||||
|
}
|
||||||
|
|
||||||
|
results.push({
|
||||||
|
url: trimmedUrl,
|
||||||
|
status: 'imported',
|
||||||
|
title: scrapedStory.title,
|
||||||
|
author: scrapedStory.author
|
||||||
|
});
|
||||||
|
importedCount++;
|
||||||
|
|
||||||
|
// Send progress update with word count
|
||||||
|
await sendProgressUpdate(sessionId, {
|
||||||
|
type: 'progress',
|
||||||
|
current: i + 1,
|
||||||
|
total: urls.length,
|
||||||
|
message: `Scraped "${scrapedStory.title}" (${wordCount.toLocaleString()} words)`,
|
||||||
|
url: trimmedUrl,
|
||||||
|
title: scrapedStory.title,
|
||||||
|
author: scrapedStory.author,
|
||||||
|
wordCount: wordCount,
|
||||||
|
totalWordCount: totalWordCount
|
||||||
|
});
|
||||||
|
|
||||||
|
} catch (error) {
|
||||||
|
console.error(`Error processing URL ${url} in combined mode:`, error);
|
||||||
|
results.push({
|
||||||
|
url: url,
|
||||||
|
status: 'error',
|
||||||
|
error: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
errorCount++;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// If we have any errors, fail the entire combined operation
|
||||||
|
if (errorCount > 0) {
|
||||||
|
await sendProgressUpdate(sessionId, {
|
||||||
|
type: 'error',
|
||||||
|
current: urls.length,
|
||||||
|
total: urls.length,
|
||||||
|
message: 'Combined mode failed: some URLs could not be processed',
|
||||||
|
error: `${errorCount} URLs failed to process`
|
||||||
|
});
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check content size to prevent response size issues
|
||||||
|
const combinedContentString = combinedContent.join('\n');
|
||||||
|
const contentSizeInMB = new Blob([combinedContentString]).size / (1024 * 1024);
|
||||||
|
|
||||||
|
console.log(`Combined content size: ${contentSizeInMB.toFixed(2)} MB`);
|
||||||
|
console.log(`Combined content character length: ${combinedContentString.length}`);
|
||||||
|
console.log(`Combined content parts count: ${combinedContent.length}`);
|
||||||
|
|
||||||
|
// Return the combined story data via progress update
|
||||||
|
const combinedStory = {
|
||||||
|
title: baseTitle,
|
||||||
|
author: baseAuthor,
|
||||||
|
content: contentSizeInMB > 10 ?
|
||||||
|
combinedContentString.substring(0, Math.floor(combinedContentString.length * (10 / contentSizeInMB))) + '\n\n<!-- Content truncated due to size limit -->' :
|
||||||
|
combinedContentString,
|
||||||
|
summary: contentSizeInMB > 10 ? baseSummary + ' (Content truncated due to size limit)' : baseSummary,
|
||||||
|
sourceUrl: baseSourceUrl,
|
||||||
|
tags: Array.from(combinedTags)
|
||||||
|
};
|
||||||
|
|
||||||
|
// Send completion notification for combine mode
|
||||||
|
await sendProgressUpdate(sessionId, {
|
||||||
|
type: 'completed',
|
||||||
|
current: urls.length,
|
||||||
|
total: urls.length,
|
||||||
|
message: `Combined scraping completed: ${totalWordCount.toLocaleString()} words from ${importedCount} stories`,
|
||||||
|
totalWordCount: totalWordCount,
|
||||||
|
combinedStory: combinedStory
|
||||||
|
});
|
||||||
|
|
||||||
|
console.log(`Combined scraping completed: ${importedCount} URLs combined into one story`);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Background processing function for individual mode
async function processIndividualMode(
  urls: string[],
  sessionId: string,
  authorization: string,
  scraper: any
) {
  const results: ImportResult[] = [];
  let importedCount = 0;
  let skippedCount = 0;
  let errorCount = 0;

  await sendProgressUpdate(sessionId, {
    type: 'progress',
    current: 0,
    total: urls.length,
    message: `Starting to import ${urls.length} URLs individually...`
  });

  for (let i = 0; i < urls.length; i++) {
    const url = urls[i];
    console.log(`Processing URL ${i + 1}/${urls.length}: ${url}`);

    await sendProgressUpdate(sessionId, {
      type: 'progress',
      current: i,
      total: urls.length,
      message: `Processing URL ${i + 1} of ${urls.length}...`,
      url: url
    });

    try {
      // Validate URL format
      if (!url || typeof url !== 'string' || url.trim() === '') {
        results.push({
          url: url || 'Empty URL',
          status: 'error',
          error: 'Invalid URL format'
        });
        errorCount++;
        continue;
      }

      const trimmedUrl = url.trim();

      // Scrape the story
      const scrapedStory = await scraper.scrapeStory(trimmedUrl);

      // Validate required fields
      if (!scrapedStory.title || !scrapedStory.author || !scrapedStory.content) {
        const missingFields = [];
        if (!scrapedStory.title) missingFields.push('title');
        if (!scrapedStory.author) missingFields.push('author');
        if (!scrapedStory.content) missingFields.push('content');

        results.push({
          url: trimmedUrl,
          status: 'skipped',
          reason: `Missing required fields: ${missingFields.join(', ')}`,
          title: scrapedStory.title,
          author: scrapedStory.author
        });
        skippedCount++;
        continue;
      }

      // Check for duplicates using query parameters
      try {
        const duplicateCheckUrl = `http://backend:8080/api/stories/check-duplicate`;
        const params = new URLSearchParams({
          title: scrapedStory.title,
          authorName: scrapedStory.author
        });

        const duplicateCheckResponse = await fetch(`${duplicateCheckUrl}?${params.toString()}`, {
          method: 'GET',
          headers: {
            'Authorization': authorization,
            'Content-Type': 'application/json',
          },
        });

        if (duplicateCheckResponse.ok) {
          const duplicateResult = await duplicateCheckResponse.json();
          if (duplicateResult.hasDuplicates) {
            results.push({
              url: trimmedUrl,
              status: 'skipped',
              reason: `Duplicate story found (${duplicateResult.count} existing)`,
              title: scrapedStory.title,
              author: scrapedStory.author
            });
            skippedCount++;
            continue;
          }
        }
      } catch (error) {
        console.warn('Duplicate check failed:', error);
        // Continue with import if duplicate check fails
      }

      // Create the story
      try {
        const storyData = {
          title: scrapedStory.title,
          summary: scrapedStory.summary || undefined,
          contentHtml: scrapedStory.content,
          sourceUrl: scrapedStory.sourceUrl || trimmedUrl,
          authorName: scrapedStory.author,
          tagNames: scrapedStory.tags && scrapedStory.tags.length > 0 ? scrapedStory.tags : undefined,
        };

        const createUrl = `http://backend:8080/api/stories`;
        const createResponse = await fetch(createUrl, {
          method: 'POST',
          headers: {
            'Authorization': authorization,
            'Content-Type': 'application/json',
          },
          body: JSON.stringify(storyData),
        });

        if (!createResponse.ok) {
          const errorData = await createResponse.json();
          throw new Error(errorData.message || 'Failed to create story');
        }

        const createdStory = await createResponse.json();

        results.push({
          url: trimmedUrl,
          status: 'imported',
          title: scrapedStory.title,
          author: scrapedStory.author,
          storyId: createdStory.id
        });
        importedCount++;

        console.log(`Successfully imported: ${scrapedStory.title} by ${scrapedStory.author} (ID: ${createdStory.id})`);

        // Send progress update for successful import
        await sendProgressUpdate(sessionId, {
          type: 'progress',
          current: i + 1,
          total: urls.length,
          message: `Imported "${scrapedStory.title}" by ${scrapedStory.author}`,
          url: trimmedUrl,
          title: scrapedStory.title,
          author: scrapedStory.author
        });

      } catch (error) {
        console.error(`Failed to create story for ${trimmedUrl}:`, error);

        let errorMessage = 'Failed to create story';
        if (error instanceof Error) {
          errorMessage = error.message;
        }

        results.push({
          url: trimmedUrl,
          status: 'error',
          error: errorMessage,
          title: scrapedStory.title,
          author: scrapedStory.author
        });
        errorCount++;
      }

    } catch (error) {
      console.error(`Error processing URL ${url}:`, error);

      let errorMessage = 'Unknown error';
      if (error instanceof Error) {
        errorMessage = error.message;
      }

      results.push({
        url: url,
        status: 'error',
        error: errorMessage
      });
      errorCount++;
    }
  }

  // Send completion notification
  await sendProgressUpdate(sessionId, {
    type: 'completed',
    current: urls.length,
    total: urls.length,
    message: `Bulk import completed: ${importedCount} imported, ${skippedCount} skipped, ${errorCount} errors`,
    results: results,
    summary: {
      total: urls.length,
      imported: importedCount,
      skipped: skippedCount,
      errors: errorCount
    }
  });

  console.log(`Bulk import completed: ${importedCount} imported, ${skippedCount} skipped, ${errorCount} errors`);

  // Trigger Typesense reindex if any stories were imported
  if (importedCount > 0) {
    try {
      console.log('Triggering Typesense reindex after bulk import...');
      const reindexUrl = `http://backend:8080/api/stories/reindex-typesense`;
      const reindexResponse = await fetch(reindexUrl, {
        method: 'POST',
        headers: {
          'Authorization': authorization,
          'Content-Type': 'application/json',
        },
      });

      if (reindexResponse.ok) {
        const reindexResult = await reindexResponse.json();
        console.log('Typesense reindex completed:', reindexResult);
      } else {
        console.warn('Typesense reindex failed:', reindexResponse.status);
      }
    } catch (error) {
      console.warn('Failed to trigger Typesense reindex:', error);
      // Don't fail the whole request if reindex fails
    }
  }
}

// Background processing function
async function processBulkImport(
  urls: string[],
  combineIntoOne: boolean,
  sessionId: string,
  authorization: string
) {
  try {
    // Dynamic imports to prevent client-side bundling
    const { StoryScraper } = await import('@/lib/scraper/scraper');

    const scraper = new StoryScraper();

    console.log(`Starting bulk scraping for ${urls.length} URLs${combineIntoOne ? ' (combine mode)' : ''}`);
    console.log(`Session ID: ${sessionId}`);

    // Quick test to verify backend connectivity
    try {
      console.log(`Testing backend connectivity at: http://backend:8080/api/stories/check-duplicate`);
      const testResponse = await fetch(`http://backend:8080/api/stories/check-duplicate?title=test&authorName=test`, {
        method: 'GET',
        headers: {
          'Authorization': authorization,
          'Content-Type': 'application/json',
        },
      });
      console.log(`Backend test response status: ${testResponse.status}`);
    } catch (error) {
      console.error(`Backend connectivity test failed:`, error);
    }

    // Handle combined mode
    if (combineIntoOne) {
      await processCombinedMode(urls, sessionId, authorization, scraper);
    } else {
      // Normal individual processing mode
      await processIndividualMode(urls, sessionId, authorization, scraper);
    }

  } catch (error) {
    console.error('Background bulk import error:', error);
    await sendProgressUpdate(sessionId, {
      type: 'error',
      current: 0,
      total: urls.length,
      message: 'Bulk import failed due to an error',
      error: error instanceof Error ? error.message : 'Unknown error'
    });
  }
}

export async function POST(request: NextRequest) {
  try {
    // Check for authentication
    const authorization = request.headers.get('authorization');
    if (!authorization) {
      return NextResponse.json(
        { error: 'Authentication required for bulk import' },
        { status: 401 }
      );
    }

    const body = await request.json();
    const { urls, combineIntoOne = false, sessionId } = body as BulkImportRequest;

    if (!urls || !Array.isArray(urls) || urls.length === 0) {
      return NextResponse.json(
        { error: 'URLs array is required and must not be empty' },
        { status: 400 }
      );
    }

    if (urls.length > 200) {
      return NextResponse.json(
        { error: 'Maximum 200 URLs allowed per bulk import' },
        { status: 400 }
      );
    }

    if (!sessionId) {
      return NextResponse.json(
        { error: 'Session ID is required for progress tracking' },
        { status: 400 }
      );
    }

    // Start the background processing
    processBulkImport(urls, combineIntoOne, sessionId, authorization).catch(error => {
      console.error('Failed to start background processing:', error);
    });

    // Return immediately with session info
    return NextResponse.json({
      message: 'Bulk import started',
      sessionId: sessionId,
      totalUrls: urls.length,
      combineMode: combineIntoOne
    });

  } catch (error) {
    console.error('Bulk import initialization error:', error);

    if (error instanceof Error) {
      return NextResponse.json(
        { error: `Bulk import failed to start: ${error.message}` },
        { status: 500 }
      );
    }

    return NextResponse.json(
      { error: 'Bulk import failed to start due to an unknown error' },
      { status: 500 }
    );
  }
}
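A minimal client-side sketch of starting a bulk import against this route, for orientation only. The route's URL path is not visible in this diff (assumed here to be `/scrape/bulk-import`, following the `frontend/src/app/scrape/...` layout), the `Authorization` value is whatever the backend expects, and the transport behind `sendProgressUpdate` (SSE, WebSocket, or polling) is defined elsewhere, so progress handling is only indicated in a comment. The request body fields (`urls`, `combineIntoOne`, `sessionId`) and the acknowledgement shape come directly from the `POST` handler above.

```typescript
// Hypothetical helper: the '/scrape/bulk-import' path and the auth scheme are assumptions;
// only the request/response shapes are taken from the route handler above.
async function startBulkImport(urls: string[], authorization: string, combineIntoOne = false) {
  // The server uses this ID to key progress updates for the background job.
  const sessionId = crypto.randomUUID();

  const response = await fetch('/scrape/bulk-import', {
    method: 'POST',
    headers: {
      'Authorization': authorization,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ urls, combineIntoOne, sessionId }),
  });

  if (!response.ok) {
    const { error } = await response.json();
    throw new Error(error ?? `Bulk import failed to start (HTTP ${response.status})`);
  }

  // Immediate acknowledgement; actual results arrive later as 'progress',
  // 'completed', or 'error' updates keyed by sessionId (mechanism not shown in this diff).
  const ack: {
    message: string;
    sessionId: string;
    totalUrls: number;
    combineMode: boolean;
  } = await response.json();

  return ack;
}
```

Because the handler returns before any scraping happens, the caller has to keep `sessionId` and consume the `progress`/`completed`/`error` updates emitted via `sendProgressUpdate` to learn the final per-URL results.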
85
frontend/src/app/scrape/story/route.ts
Normal file
@@ -0,0 +1,85 @@
import { NextRequest, NextResponse } from 'next/server';

export async function POST(request: NextRequest) {
  try {
    const body = await request.json();
    const { url } = body;

    if (!url || typeof url !== 'string') {
      return NextResponse.json(
        { error: 'URL is required and must be a string' },
        { status: 400 }
      );
    }

    // Dynamic import to prevent client-side bundling
    const { StoryScraper } = await import('@/lib/scraper/scraper');
    const { ScraperError } = await import('@/lib/scraper/types');

    const scraper = new StoryScraper();
    const story = await scraper.scrapeStory(url);

    // Debug logging
    console.log('Scraped story data:', {
      url: url,
      title: story.title,
      author: story.author,
      summary: story.summary,
      contentLength: story.content?.length || 0,
      contentPreview: story.content?.substring(0, 200) + '...',
      tags: story.tags,
      coverImage: story.coverImage
    });

    return NextResponse.json(story);
  } catch (error) {
    console.error('Story scraping error:', error);

    // Check if it's a ScraperError without importing at module level
    if (error && typeof error === 'object' && error.constructor.name === 'ScraperError') {
      return NextResponse.json(
        {
          error: (error as any).message,
          url: (error as any).url
        },
        { status: 400 }
      );
    }

    if (error instanceof Error) {
      // Handle specific error types
      if (error.message.includes('Invalid URL')) {
        return NextResponse.json(
          { error: 'Invalid URL provided' },
          { status: 400 }
        );
      }

      if (error.message.includes('Unsupported site')) {
        return NextResponse.json(
          { error: 'This website is not supported for scraping' },
          { status: 400 }
        );
      }

      if (error.message.includes('HTTP 404')) {
        return NextResponse.json(
          { error: 'Story not found at the provided URL' },
          { status: 404 }
        );
      }

      if (error.message.includes('timeout')) {
        return NextResponse.json(
          { error: 'Request timed out while fetching content' },
          { status: 408 }
        );
      }
    }

    return NextResponse.json(
      { error: 'Failed to scrape story. Please try again.' },
      { status: 500 }
    );
  }
}
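Under the App Router, this file should be served as POST `/scrape/story`. A minimal sketch of how a caller might use it, assuming only what the handler above returns: the scraped story object on success, or `{ error }` with a 400/404/408/500 status on failure. The exact shape of the story object comes from `StoryScraper`, which is not part of this diff, so the fields below are inferred from the debug logging and may not be exhaustive.

```typescript
// Minimal sketch; ScrapedStory is a hypothetical interface inferred from the
// route's debug logging above, not the actual type exported by the scraper.
interface ScrapedStory {
  title: string;
  author: string;
  summary?: string;
  content: string;
  tags?: string[];
  coverImage?: string;
  sourceUrl?: string;
}

async function scrapeStory(url: string): Promise<ScrapedStory> {
  const response = await fetch('/scrape/story', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ url }),
  });

  if (!response.ok) {
    // 400: invalid URL or unsupported site, 404: story not found,
    // 408: fetch timeout, 500: unexpected scraper failure
    const { error } = await response.json();
    throw new Error(error ?? `Scraping failed with status ${response.status}`);
  }

  return response.json();
}
```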
Some files were not shown because too many files have changed in this diff.