Various improvements & Epub support
This commit is contained in:
459
EPUB_IMPORT_EXPORT_SPECIFICATION.md
Normal file
459
EPUB_IMPORT_EXPORT_SPECIFICATION.md
Normal file
@@ -0,0 +1,459 @@
|
|||||||
|
# EPUB Import/Export Specification
|
||||||
|
|
||||||
|
## 🎉 Phase 1 Implementation Complete
|
||||||
|
|
||||||
|
**Status**: Phase 1 fully implemented and operational as of August 2025
|
||||||
|
|
||||||
|
**Key Achievements**:
|
||||||
|
- ✅ Complete EPUB import functionality with validation and error handling
|
||||||
|
- ✅ Single story EPUB export with XML validation fixes
|
||||||
|
- ✅ Reading position preservation using EPUB CFI standards
|
||||||
|
- ✅ Full frontend UI integration with navigation and authentication
|
||||||
|
- ✅ Moved export button to Story Detail View for better UX
|
||||||
|
- ✅ Added EPUB import to main Add Story menu dropdown
|
||||||
|
|
||||||
|
## Overview
|
||||||
|
|
||||||
|
This specification defines the requirements and implementation details for importing and exporting EPUB files in StoryCove. The feature enables users to import stories from EPUB files and export their stories/collections as EPUB files with preserved reading positions.
|
||||||
|
|
||||||
|
## Scope
|
||||||
|
|
||||||
|
### In Scope
|
||||||
|
- **EPUB Import**: Parse DRM-free EPUB files and import as stories
|
||||||
|
- **EPUB Export**: Export individual stories and collections as EPUB files
|
||||||
|
- **Reading Position Preservation**: Store and restore reading positions using EPUB standards
|
||||||
|
- **Metadata Handling**: Extract and preserve story metadata (title, author, cover, etc.)
|
||||||
|
- **Content Processing**: HTML content sanitization and formatting
|
||||||
|
|
||||||
|
### Out of Scope (Phase 1)
|
||||||
|
- DRM-protected EPUB files (future consideration)
|
||||||
|
- Real-time reading position sync between devices
|
||||||
|
- Advanced EPUB features (audio, video, interactive content)
|
||||||
|
- EPUB validation beyond basic structure
|
||||||
|
|
||||||
|
## Technical Architecture
|
||||||
|
|
||||||
|
### Backend Implementation
|
||||||
|
- **Language**: Java (Spring Boot)
|
||||||
|
- **Primary Library**: EPUBLib (nl.siegmann.epublib:epublib-core:3.1)
|
||||||
|
- **Processing**: Server-side generation and parsing
|
||||||
|
- **File Handling**: Multipart file upload for import, streaming download for export
|
||||||
|
|
||||||
|
### Dependencies
|
||||||
|
```xml
|
||||||
|
<dependency>
|
||||||
|
<groupId>com.positiondev.epublib</groupId>
|
||||||
|
<artifactId>epublib-core</artifactId>
|
||||||
|
<version>3.1</version>
|
||||||
|
</dependency>
|
||||||
|
```
|
||||||
|
|
||||||
|
### Phase 1 Implementation Notes
|
||||||
|
- **EPUBImportService**: Implemented with full validation, metadata extraction, and reading position handling
|
||||||
|
- **EPUBExportService**: Implemented with XML validation fixes for EPUB reader compatibility
|
||||||
|
- **ReadingPosition Entity**: Created with EPUB CFI support and database indexing
|
||||||
|
- **Authentication**: All endpoints secured with JWT authentication and proper frontend integration
|
||||||
|
- **UI Integration**: Export moved to Story Detail View, Import added to main navigation menu
|
||||||
|
- **XML Compliance**: Fixed XHTML validation issues by properly formatting self-closing tags (`<br>` → `<br />`)
|
||||||
|
|
||||||
|
## EPUB Import Specification
|
||||||
|
|
||||||
|
### Supported Formats
|
||||||
|
- **EPUB 2.0** and **EPUB 3.x** formats
|
||||||
|
- **DRM-Free** files only
|
||||||
|
- **Maximum file size**: 50MB
|
||||||
|
- **Supported content**: Text-based stories with HTML content
|
||||||
|
|
||||||
|
### Import Process Flow
|
||||||
|
1. **File Upload**: User uploads EPUB file via web interface
|
||||||
|
2. **Validation**: Check file format, size, and basic EPUB structure
|
||||||
|
3. **Parsing**: Extract metadata, content, and resources using EPUBLib
|
||||||
|
4. **Content Processing**: Sanitize HTML content using existing Jsoup pipeline
|
||||||
|
5. **Story Creation**: Create Story entity with extracted data
|
||||||
|
6. **Preview**: Show extracted story details for user confirmation
|
||||||
|
7. **Finalization**: Save story to database with imported metadata
|
||||||
|
|
||||||
|
### Metadata Mapping
|
||||||
|
```java
|
||||||
|
// EPUB Metadata → StoryCove Story Entity
|
||||||
|
epub.getMetadata().getFirstTitle() → story.title
|
||||||
|
epub.getMetadata().getAuthors().get(0) → story.authorName
|
||||||
|
epub.getMetadata().getDescriptions().get(0) → story.summary
|
||||||
|
epub.getCoverImage() → story.coverPath
|
||||||
|
epub.getMetadata().getSubjects() → story.tags
|
||||||
|
```
|
||||||
|
|
||||||
|
### Content Extraction
|
||||||
|
- **Multi-chapter EPUBs**: Combine all content files into single HTML
|
||||||
|
- **Chapter separation**: Insert `<hr>` or `<h2>` tags between chapters
|
||||||
|
- **HTML sanitization**: Apply existing sanitization rules
|
||||||
|
- **Image handling**: Extract and store cover images, inline images optional
|
||||||
|
|
||||||
|
### API Endpoints
|
||||||
|
|
||||||
|
#### POST /api/stories/import-epub
|
||||||
|
```java
|
||||||
|
@PostMapping("/import-epub")
|
||||||
|
public ResponseEntity<?> importEPUB(@RequestParam("file") MultipartFile file) {
|
||||||
|
// Implementation in EPUBImportService
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
**Request**: Multipart file upload
|
||||||
|
**Response**:
|
||||||
|
```json
|
||||||
|
{
|
||||||
|
"message": "EPUB imported successfully",
|
||||||
|
"storyId": "uuid",
|
||||||
|
"extractedData": {
|
||||||
|
"title": "Story Title",
|
||||||
|
"author": "Author Name",
|
||||||
|
"summary": "Story description",
|
||||||
|
"chapterCount": 12,
|
||||||
|
"wordCount": 45000,
|
||||||
|
"hasCovers": true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
## EPUB Export Specification
|
||||||
|
|
||||||
|
### Export Types
|
||||||
|
1. **Single Story Export**: Convert one story to EPUB
|
||||||
|
2. **Collection Export**: Multiple stories as single EPUB with chapters
|
||||||
|
|
||||||
|
### EPUB Structure Generation
|
||||||
|
```
|
||||||
|
story.epub
|
||||||
|
├── mimetype
|
||||||
|
├── META-INF/
|
||||||
|
│ └── container.xml
|
||||||
|
└── OEBPS/
|
||||||
|
├── content.opf # Package metadata
|
||||||
|
├── toc.ncx # Navigation
|
||||||
|
├── stylesheet.css # Styling
|
||||||
|
├── cover.html # Cover page
|
||||||
|
├── chapter001.xhtml # Story content
|
||||||
|
├── images/
|
||||||
|
│ └── cover.jpg # Cover image
|
||||||
|
└── fonts/ (optional)
|
||||||
|
```
|
||||||
|
|
||||||
|
### Reading Position Implementation
|
||||||
|
|
||||||
|
#### EPUB 3 CFI (Canonical Fragment Identifier)
|
||||||
|
```xml
|
||||||
|
<!-- In content.opf metadata -->
|
||||||
|
<meta property="epub-cfi" content="/6/4[chap01]!/4[body01]/10[para05]/3:142"/>
|
||||||
|
<meta property="reading-percentage" content="0.65"/>
|
||||||
|
<meta property="last-read-timestamp" content="2023-12-07T10:30:00Z"/>
|
||||||
|
```
|
||||||
|
|
||||||
|
#### StoryCove Custom Metadata (Fallback)
|
||||||
|
```xml
|
||||||
|
<meta name="storycove:reading-chapter" content="3"/>
|
||||||
|
<meta name="storycove:reading-paragraph" content="15"/>
|
||||||
|
<meta name="storycove:reading-offset" content="142"/>
|
||||||
|
<meta name="storycove:reading-percentage" content="0.65"/>
|
||||||
|
```
|
||||||
|
|
||||||
|
#### CFI Generation Logic
|
||||||
|
```java
|
||||||
|
public String generateCFI(ReadingPosition position) {
|
||||||
|
return String.format("/6/%d[chap%02d]!/4[body01]/%d[para%02d]/3:%d",
|
||||||
|
(position.getChapterIndex() * 2) + 4,
|
||||||
|
position.getChapterIndex(),
|
||||||
|
(position.getParagraphIndex() * 2) + 4,
|
||||||
|
position.getParagraphIndex(),
|
||||||
|
position.getCharacterOffset());
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
### API Endpoints
|
||||||
|
|
||||||
|
#### GET /api/stories/{id}/export-epub
|
||||||
|
```java
|
||||||
|
@GetMapping("/{id}/export-epub")
|
||||||
|
public ResponseEntity<StreamingResponseBody> exportStory(@PathVariable UUID id) {
|
||||||
|
// Implementation in EPUBExportService
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
**Response**: EPUB file download with headers:
|
||||||
|
```
|
||||||
|
Content-Type: application/epub+zip
|
||||||
|
Content-Disposition: attachment; filename="story-title.epub"
|
||||||
|
```
|
||||||
|
|
||||||
|
#### GET /api/collections/{id}/export-epub
|
||||||
|
```java
|
||||||
|
@GetMapping("/{id}/export-epub")
|
||||||
|
public ResponseEntity<StreamingResponseBody> exportCollection(@PathVariable UUID id) {
|
||||||
|
// Implementation in EPUBExportService
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
**Response**: Multi-story EPUB with table of contents
|
||||||
|
|
||||||
|
## Data Models
|
||||||
|
|
||||||
|
### ReadingPosition Entity
|
||||||
|
```java
|
||||||
|
@Entity
|
||||||
|
@Table(name = "reading_positions")
|
||||||
|
public class ReadingPosition {
|
||||||
|
@Id
|
||||||
|
private UUID id;
|
||||||
|
|
||||||
|
@ManyToOne(fetch = FetchType.LAZY)
|
||||||
|
@JoinColumn(name = "story_id")
|
||||||
|
private Story story;
|
||||||
|
|
||||||
|
@Column(name = "chapter_index")
|
||||||
|
private Integer chapterIndex = 0;
|
||||||
|
|
||||||
|
@Column(name = "paragraph_index")
|
||||||
|
private Integer paragraphIndex = 0;
|
||||||
|
|
||||||
|
@Column(name = "character_offset")
|
||||||
|
private Integer characterOffset = 0;
|
||||||
|
|
||||||
|
@Column(name = "progress_percentage")
|
||||||
|
private Double progressPercentage = 0.0;
|
||||||
|
|
||||||
|
@Column(name = "epub_cfi")
|
||||||
|
private String canonicalFragmentIdentifier;
|
||||||
|
|
||||||
|
@Column(name = "last_read_at")
|
||||||
|
private LocalDateTime lastReadAt;
|
||||||
|
|
||||||
|
@Column(name = "device_identifier")
|
||||||
|
private String deviceIdentifier;
|
||||||
|
|
||||||
|
// Constructors, getters, setters
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
### EPUB Import Request DTO
|
||||||
|
```java
|
||||||
|
public class EPUBImportRequest {
|
||||||
|
private String filename;
|
||||||
|
private Long fileSize;
|
||||||
|
private Boolean preserveChapterStructure = true;
|
||||||
|
private Boolean extractCover = true;
|
||||||
|
private String targetCollectionId; // Optional: add to specific collection
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
### EPUB Export Options DTO
|
||||||
|
```java
|
||||||
|
public class EPUBExportOptions {
|
||||||
|
private Boolean includeReadingPosition = true;
|
||||||
|
private Boolean includeCoverImage = true;
|
||||||
|
private Boolean includeMetadata = true;
|
||||||
|
private String cssStylesheet; // Optional custom CSS
|
||||||
|
private EPUBVersion version = EPUBVersion.EPUB3;
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
## Service Layer Architecture
|
||||||
|
|
||||||
|
### EPUBImportService
|
||||||
|
```java
|
||||||
|
@Service
|
||||||
|
public class EPUBImportService {
|
||||||
|
|
||||||
|
// Core import method
|
||||||
|
public Story importEPUBFile(MultipartFile file, EPUBImportRequest request);
|
||||||
|
|
||||||
|
// Helper methods
|
||||||
|
private void validateEPUBFile(MultipartFile file);
|
||||||
|
private Book parseEPUBStructure(InputStream inputStream);
|
||||||
|
private Story extractStoryData(Book epub);
|
||||||
|
private String combineChapterContent(Book epub);
|
||||||
|
private void extractAndSaveCover(Book epub, Story story);
|
||||||
|
private List<String> extractTags(Book epub);
|
||||||
|
private ReadingPosition extractReadingPosition(Book epub);
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
### EPUBExportService
|
||||||
|
```java
|
||||||
|
@Service
|
||||||
|
public class EPUBExportService {
|
||||||
|
|
||||||
|
// Core export methods
|
||||||
|
public byte[] exportSingleStory(UUID storyId, EPUBExportOptions options);
|
||||||
|
public byte[] exportCollection(UUID collectionId, EPUBExportOptions options);
|
||||||
|
|
||||||
|
// Helper methods
|
||||||
|
private Book createEPUBStructure(Story story, ReadingPosition position);
|
||||||
|
private Book createCollectionEPUB(Collection collection, List<ReadingPosition> positions);
|
||||||
|
private void addReadingPositionMetadata(Book book, ReadingPosition position);
|
||||||
|
private String generateCFI(ReadingPosition position);
|
||||||
|
private Resource createChapterResource(Story story);
|
||||||
|
private Resource createStylesheetResource();
|
||||||
|
private void addCoverImage(Book book, Story story);
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
## Frontend Integration
|
||||||
|
|
||||||
|
### Import UI Flow
|
||||||
|
1. **Upload Interface**: File input with EPUB validation
|
||||||
|
2. **Progress Indicator**: Show parsing progress
|
||||||
|
3. **Preview Screen**: Display extracted metadata for confirmation
|
||||||
|
4. **Confirmation**: Allow editing of title, author, summary before saving
|
||||||
|
5. **Success**: Redirect to created story
|
||||||
|
|
||||||
|
### Export UI Flow
|
||||||
|
1. **Export Button**: Available on story detail and collection pages
|
||||||
|
2. **Options Modal**: Allow selection of export options
|
||||||
|
3. **Progress Indicator**: Show EPUB generation progress
|
||||||
|
4. **Download**: Automatic file download on completion
|
||||||
|
|
||||||
|
### Frontend API Calls
|
||||||
|
```typescript
|
||||||
|
// Import EPUB
|
||||||
|
const importEPUB = async (file: File) => {
|
||||||
|
const formData = new FormData();
|
||||||
|
formData.append('file', file);
|
||||||
|
|
||||||
|
const response = await fetch('/api/stories/import-epub', {
|
||||||
|
method: 'POST',
|
||||||
|
body: formData,
|
||||||
|
});
|
||||||
|
|
||||||
|
return await response.json();
|
||||||
|
};
|
||||||
|
|
||||||
|
// Export Story
|
||||||
|
const exportStoryEPUB = async (storyId: string) => {
|
||||||
|
const response = await fetch(`/api/stories/${storyId}/export-epub`, {
|
||||||
|
method: 'GET',
|
||||||
|
});
|
||||||
|
|
||||||
|
const blob = await response.blob();
|
||||||
|
const url = window.URL.createObjectURL(blob);
|
||||||
|
const a = document.createElement('a');
|
||||||
|
a.href = url;
|
||||||
|
a.download = `${storyTitle}.epub`;
|
||||||
|
a.click();
|
||||||
|
};
|
||||||
|
```
|
||||||
|
|
||||||
|
## Error Handling
|
||||||
|
|
||||||
|
### Import Errors
|
||||||
|
- **Invalid EPUB format**: "Invalid EPUB file format"
|
||||||
|
- **File too large**: "File size exceeds 50MB limit"
|
||||||
|
- **DRM protected**: "DRM-protected EPUBs not supported"
|
||||||
|
- **Corrupted file**: "EPUB file appears to be corrupted"
|
||||||
|
- **No content**: "EPUB contains no readable content"
|
||||||
|
|
||||||
|
### Export Errors
|
||||||
|
- **Story not found**: "Story not found or access denied"
|
||||||
|
- **Missing content**: "Story has no content to export"
|
||||||
|
- **Generation failure**: "Failed to generate EPUB file"
|
||||||
|
|
||||||
|
## Security Considerations
|
||||||
|
|
||||||
|
### File Upload Security
|
||||||
|
- **File type validation**: Verify EPUB MIME type and structure
|
||||||
|
- **Size limits**: Enforce maximum file size limits
|
||||||
|
- **Content sanitization**: Apply existing HTML sanitization
|
||||||
|
- **Virus scanning**: Consider integration with antivirus scanning
|
||||||
|
|
||||||
|
### Content Security
|
||||||
|
- **HTML sanitization**: Apply existing Jsoup rules to imported content
|
||||||
|
- **Image validation**: Validate extracted cover images
|
||||||
|
- **Metadata escaping**: Escape special characters in metadata
|
||||||
|
|
||||||
|
## Testing Strategy
|
||||||
|
|
||||||
|
### Unit Tests
|
||||||
|
- EPUB parsing and validation logic
|
||||||
|
- CFI generation and parsing
|
||||||
|
- Metadata extraction accuracy
|
||||||
|
- Content sanitization
|
||||||
|
|
||||||
|
### Integration Tests
|
||||||
|
- End-to-end import/export workflow
|
||||||
|
- Reading position preservation
|
||||||
|
- Multi-story collection export
|
||||||
|
- Error handling scenarios
|
||||||
|
|
||||||
|
### Test Data
|
||||||
|
- Sample EPUB files for various scenarios
|
||||||
|
- EPUBs with and without reading positions
|
||||||
|
- Multi-chapter EPUBs
|
||||||
|
- EPUBs with covers and metadata
|
||||||
|
|
||||||
|
## Performance Considerations
|
||||||
|
|
||||||
|
### Import Performance
|
||||||
|
- **Streaming processing**: Process large EPUBs without loading entirely into memory
|
||||||
|
- **Async processing**: Consider async import for large files
|
||||||
|
- **Progress tracking**: Provide progress feedback for large imports
|
||||||
|
|
||||||
|
### Export Performance
|
||||||
|
- **Caching**: Cache generated EPUBs for repeated exports
|
||||||
|
- **Streaming**: Stream EPUB generation for large collections
|
||||||
|
- **Resource optimization**: Optimize image and content sizes
|
||||||
|
|
||||||
|
## Future Enhancements (Out of Scope)
|
||||||
|
|
||||||
|
### Phase 2 Considerations
|
||||||
|
- **DRM support**: Research legal and technical feasibility
|
||||||
|
- **Reading position sync**: Real-time sync across devices
|
||||||
|
- **Advanced EPUB features**: Enhanced typography, annotations
|
||||||
|
- **Bulk operations**: Import/export multiple EPUBs
|
||||||
|
- **EPUB validation**: Full EPUB compliance checking
|
||||||
|
|
||||||
|
### Integration Possibilities
|
||||||
|
- **Cloud storage**: Export directly to Google Drive, Dropbox
|
||||||
|
- **E-reader sync**: Direct sync with Kindle, Kobo devices
|
||||||
|
- **Reading analytics**: Track reading patterns and statistics
|
||||||
|
|
||||||
|
## Implementation Phases
|
||||||
|
|
||||||
|
### Phase 1: Core Functionality ✅ **COMPLETED**
|
||||||
|
- [x] Basic EPUB import (DRM-free)
|
||||||
|
- [x] Single story export
|
||||||
|
- [x] Reading position storage and retrieval
|
||||||
|
- [x] Frontend UI integration
|
||||||
|
|
||||||
|
### Phase 2: Enhanced Features
|
||||||
|
- [ ] Collection export
|
||||||
|
- [ ] Advanced metadata handling
|
||||||
|
- [ ] Performance optimizations
|
||||||
|
- [ ] Comprehensive error handling
|
||||||
|
|
||||||
|
### Phase 3: Advanced Features
|
||||||
|
- [ ] DRM exploration (legal research required)
|
||||||
|
- [ ] Reading position sync
|
||||||
|
- [ ] Advanced EPUB features
|
||||||
|
- [ ] Analytics and reporting
|
||||||
|
|
||||||
|
## Acceptance Criteria
|
||||||
|
|
||||||
|
### Import Success Criteria ✅ **COMPLETED**
|
||||||
|
- [x] Successfully parse EPUB 2.0 and 3.x files
|
||||||
|
- [x] Extract title, author, summary, and content accurately
|
||||||
|
- [x] Preserve formatting and basic HTML structure
|
||||||
|
- [x] Handle cover images correctly
|
||||||
|
- [x] Import reading positions when present
|
||||||
|
- [x] Provide clear error messages for invalid files
|
||||||
|
|
||||||
|
### Export Success Criteria ✅ **PHASE 1 COMPLETED**
|
||||||
|
- [x] Generate valid EPUB files compatible with major readers
|
||||||
|
- [x] Include accurate metadata and content
|
||||||
|
- [x] Embed reading positions using CFI standard
|
||||||
|
- [x] Support single story export
|
||||||
|
- [ ] Support collection export *(Phase 2)*
|
||||||
|
- [ ] Generate proper table of contents for collections *(Phase 2)*
|
||||||
|
- [x] Include cover images when available
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
*This specification serves as the implementation guide for the EPUB import/export feature. All implementation decisions should reference this document for consistency and completeness.*
|
||||||
@@ -84,6 +84,11 @@
|
|||||||
<artifactId>typesense-java</artifactId>
|
<artifactId>typesense-java</artifactId>
|
||||||
<version>1.3.0</version>
|
<version>1.3.0</version>
|
||||||
</dependency>
|
</dependency>
|
||||||
|
<dependency>
|
||||||
|
<groupId>com.positiondev.epublib</groupId>
|
||||||
|
<artifactId>epublib-core</artifactId>
|
||||||
|
<version>3.1</version>
|
||||||
|
</dependency>
|
||||||
|
|
||||||
<!-- Test dependencies -->
|
<!-- Test dependencies -->
|
||||||
<dependency>
|
<dependency>
|
||||||
|
|||||||
@@ -42,6 +42,8 @@ public class StoryController {
|
|||||||
private final TypesenseService typesenseService;
|
private final TypesenseService typesenseService;
|
||||||
private final CollectionService collectionService;
|
private final CollectionService collectionService;
|
||||||
private final ReadingTimeService readingTimeService;
|
private final ReadingTimeService readingTimeService;
|
||||||
|
private final EPUBImportService epubImportService;
|
||||||
|
private final EPUBExportService epubExportService;
|
||||||
|
|
||||||
public StoryController(StoryService storyService,
|
public StoryController(StoryService storyService,
|
||||||
AuthorService authorService,
|
AuthorService authorService,
|
||||||
@@ -50,7 +52,9 @@ public class StoryController {
|
|||||||
ImageService imageService,
|
ImageService imageService,
|
||||||
CollectionService collectionService,
|
CollectionService collectionService,
|
||||||
@Autowired(required = false) TypesenseService typesenseService,
|
@Autowired(required = false) TypesenseService typesenseService,
|
||||||
ReadingTimeService readingTimeService) {
|
ReadingTimeService readingTimeService,
|
||||||
|
EPUBImportService epubImportService,
|
||||||
|
EPUBExportService epubExportService) {
|
||||||
this.storyService = storyService;
|
this.storyService = storyService;
|
||||||
this.authorService = authorService;
|
this.authorService = authorService;
|
||||||
this.seriesService = seriesService;
|
this.seriesService = seriesService;
|
||||||
@@ -59,6 +63,8 @@ public class StoryController {
|
|||||||
this.collectionService = collectionService;
|
this.collectionService = collectionService;
|
||||||
this.typesenseService = typesenseService;
|
this.typesenseService = typesenseService;
|
||||||
this.readingTimeService = readingTimeService;
|
this.readingTimeService = readingTimeService;
|
||||||
|
this.epubImportService = epubImportService;
|
||||||
|
this.epubExportService = epubExportService;
|
||||||
}
|
}
|
||||||
|
|
||||||
@GetMapping
|
@GetMapping
|
||||||
@@ -533,6 +539,117 @@ public class StoryController {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// EPUB Import endpoint
|
||||||
|
@PostMapping("/epub/import")
|
||||||
|
public ResponseEntity<EPUBImportResponse> importEPUB(
|
||||||
|
@RequestParam("file") MultipartFile file,
|
||||||
|
@RequestParam(required = false) UUID authorId,
|
||||||
|
@RequestParam(required = false) String authorName,
|
||||||
|
@RequestParam(required = false) UUID seriesId,
|
||||||
|
@RequestParam(required = false) String seriesName,
|
||||||
|
@RequestParam(required = false) Integer seriesVolume,
|
||||||
|
@RequestParam(required = false) List<String> tags,
|
||||||
|
@RequestParam(defaultValue = "true") Boolean preserveReadingPosition,
|
||||||
|
@RequestParam(defaultValue = "false") Boolean overwriteExisting,
|
||||||
|
@RequestParam(defaultValue = "true") Boolean createMissingAuthor,
|
||||||
|
@RequestParam(defaultValue = "true") Boolean createMissingSeries) {
|
||||||
|
|
||||||
|
logger.info("Importing EPUB file: {}", file.getOriginalFilename());
|
||||||
|
|
||||||
|
EPUBImportRequest request = new EPUBImportRequest();
|
||||||
|
request.setEpubFile(file);
|
||||||
|
request.setAuthorId(authorId);
|
||||||
|
request.setAuthorName(authorName);
|
||||||
|
request.setSeriesId(seriesId);
|
||||||
|
request.setSeriesName(seriesName);
|
||||||
|
request.setSeriesVolume(seriesVolume);
|
||||||
|
request.setTags(tags);
|
||||||
|
request.setPreserveReadingPosition(preserveReadingPosition);
|
||||||
|
request.setOverwriteExisting(overwriteExisting);
|
||||||
|
request.setCreateMissingAuthor(createMissingAuthor);
|
||||||
|
request.setCreateMissingSeries(createMissingSeries);
|
||||||
|
|
||||||
|
try {
|
||||||
|
EPUBImportResponse response = epubImportService.importEPUB(request);
|
||||||
|
|
||||||
|
if (response.isSuccess()) {
|
||||||
|
logger.info("Successfully imported EPUB: {} (Story ID: {})",
|
||||||
|
response.getStoryTitle(), response.getStoryId());
|
||||||
|
return ResponseEntity.ok(response);
|
||||||
|
} else {
|
||||||
|
logger.warn("EPUB import failed: {}", response.getMessage());
|
||||||
|
return ResponseEntity.badRequest().body(response);
|
||||||
|
}
|
||||||
|
|
||||||
|
} catch (Exception e) {
|
||||||
|
logger.error("Error importing EPUB: {}", e.getMessage(), e);
|
||||||
|
return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
|
||||||
|
.body(EPUBImportResponse.error("Internal server error: " + e.getMessage()));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// EPUB Export endpoint
|
||||||
|
@PostMapping("/epub/export")
|
||||||
|
public ResponseEntity<org.springframework.core.io.Resource> exportEPUB(
|
||||||
|
@Valid @RequestBody EPUBExportRequest request) {
|
||||||
|
|
||||||
|
logger.info("Exporting story {} to EPUB", request.getStoryId());
|
||||||
|
|
||||||
|
try {
|
||||||
|
if (!epubExportService.canExportStory(request.getStoryId())) {
|
||||||
|
return ResponseEntity.badRequest().build();
|
||||||
|
}
|
||||||
|
|
||||||
|
org.springframework.core.io.Resource resource = epubExportService.exportStoryAsEPUB(request);
|
||||||
|
Story story = storyService.findById(request.getStoryId());
|
||||||
|
String filename = epubExportService.getEPUBFilename(story);
|
||||||
|
|
||||||
|
logger.info("Successfully exported EPUB: {}", filename);
|
||||||
|
|
||||||
|
return ResponseEntity.ok()
|
||||||
|
.header("Content-Disposition", "attachment; filename=\"" + filename + "\"")
|
||||||
|
.header("Content-Type", "application/epub+zip")
|
||||||
|
.body(resource);
|
||||||
|
|
||||||
|
} catch (Exception e) {
|
||||||
|
logger.error("Error exporting EPUB: {}", e.getMessage(), e);
|
||||||
|
return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).build();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// EPUB Export by story ID (GET endpoint)
|
||||||
|
@GetMapping("/{id}/epub")
|
||||||
|
public ResponseEntity<org.springframework.core.io.Resource> exportStoryAsEPUB(@PathVariable UUID id) {
|
||||||
|
logger.info("Exporting story {} to EPUB via GET", id);
|
||||||
|
|
||||||
|
EPUBExportRequest request = new EPUBExportRequest(id);
|
||||||
|
return exportEPUB(request);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Validate EPUB file
|
||||||
|
@PostMapping("/epub/validate")
|
||||||
|
public ResponseEntity<Map<String, Object>> validateEPUBFile(@RequestParam("file") MultipartFile file) {
|
||||||
|
logger.info("Validating EPUB file: {}", file.getOriginalFilename());
|
||||||
|
|
||||||
|
try {
|
||||||
|
List<String> errors = epubImportService.validateEPUBFile(file);
|
||||||
|
|
||||||
|
Map<String, Object> response = Map.of(
|
||||||
|
"valid", errors.isEmpty(),
|
||||||
|
"errors", errors,
|
||||||
|
"filename", file.getOriginalFilename(),
|
||||||
|
"size", file.getSize()
|
||||||
|
);
|
||||||
|
|
||||||
|
return ResponseEntity.ok(response);
|
||||||
|
|
||||||
|
} catch (Exception e) {
|
||||||
|
logger.error("Error validating EPUB file: {}", e.getMessage(), e);
|
||||||
|
return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
|
||||||
|
.body(Map.of("error", "Failed to validate EPUB file"));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
// Request DTOs
|
// Request DTOs
|
||||||
public static class CreateStoryRequest {
|
public static class CreateStoryRequest {
|
||||||
private String title;
|
private String title;
|
||||||
|
|||||||
115
backend/src/main/java/com/storycove/dto/EPUBExportRequest.java
Normal file
115
backend/src/main/java/com/storycove/dto/EPUBExportRequest.java
Normal file
@@ -0,0 +1,115 @@
|
|||||||
|
package com.storycove.dto;
|
||||||
|
|
||||||
|
import jakarta.validation.constraints.NotNull;
|
||||||
|
import java.util.List;
|
||||||
|
import java.util.UUID;
|
||||||
|
|
||||||
|
public class EPUBExportRequest {
|
||||||
|
|
||||||
|
@NotNull(message = "Story ID is required")
|
||||||
|
private UUID storyId;
|
||||||
|
|
||||||
|
private String customTitle;
|
||||||
|
|
||||||
|
private String customAuthor;
|
||||||
|
|
||||||
|
private Boolean includeReadingPosition = true;
|
||||||
|
|
||||||
|
private Boolean includeCoverImage = true;
|
||||||
|
|
||||||
|
private Boolean includeMetadata = true;
|
||||||
|
|
||||||
|
private List<String> customMetadata;
|
||||||
|
|
||||||
|
private String language = "en";
|
||||||
|
|
||||||
|
private Boolean splitByChapters = false;
|
||||||
|
|
||||||
|
private Integer maxWordsPerChapter;
|
||||||
|
|
||||||
|
public EPUBExportRequest() {}
|
||||||
|
|
||||||
|
public EPUBExportRequest(UUID storyId) {
|
||||||
|
this.storyId = storyId;
|
||||||
|
}
|
||||||
|
|
||||||
|
public UUID getStoryId() {
|
||||||
|
return storyId;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setStoryId(UUID storyId) {
|
||||||
|
this.storyId = storyId;
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getCustomTitle() {
|
||||||
|
return customTitle;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setCustomTitle(String customTitle) {
|
||||||
|
this.customTitle = customTitle;
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getCustomAuthor() {
|
||||||
|
return customAuthor;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setCustomAuthor(String customAuthor) {
|
||||||
|
this.customAuthor = customAuthor;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Boolean getIncludeReadingPosition() {
|
||||||
|
return includeReadingPosition;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setIncludeReadingPosition(Boolean includeReadingPosition) {
|
||||||
|
this.includeReadingPosition = includeReadingPosition;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Boolean getIncludeCoverImage() {
|
||||||
|
return includeCoverImage;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setIncludeCoverImage(Boolean includeCoverImage) {
|
||||||
|
this.includeCoverImage = includeCoverImage;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Boolean getIncludeMetadata() {
|
||||||
|
return includeMetadata;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setIncludeMetadata(Boolean includeMetadata) {
|
||||||
|
this.includeMetadata = includeMetadata;
|
||||||
|
}
|
||||||
|
|
||||||
|
public List<String> getCustomMetadata() {
|
||||||
|
return customMetadata;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setCustomMetadata(List<String> customMetadata) {
|
||||||
|
this.customMetadata = customMetadata;
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getLanguage() {
|
||||||
|
return language;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setLanguage(String language) {
|
||||||
|
this.language = language;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Boolean getSplitByChapters() {
|
||||||
|
return splitByChapters;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setSplitByChapters(Boolean splitByChapters) {
|
||||||
|
this.splitByChapters = splitByChapters;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Integer getMaxWordsPerChapter() {
|
||||||
|
return maxWordsPerChapter;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setMaxWordsPerChapter(Integer maxWordsPerChapter) {
|
||||||
|
this.maxWordsPerChapter = maxWordsPerChapter;
|
||||||
|
}
|
||||||
|
}
|
||||||
123
backend/src/main/java/com/storycove/dto/EPUBImportRequest.java
Normal file
123
backend/src/main/java/com/storycove/dto/EPUBImportRequest.java
Normal file
@@ -0,0 +1,123 @@
|
|||||||
|
package com.storycove.dto;
|
||||||
|
|
||||||
|
import jakarta.validation.constraints.NotNull;
|
||||||
|
import org.springframework.web.multipart.MultipartFile;
|
||||||
|
|
||||||
|
import java.util.List;
|
||||||
|
import java.util.UUID;
|
||||||
|
|
||||||
|
public class EPUBImportRequest {
|
||||||
|
|
||||||
|
@NotNull(message = "EPUB file is required")
|
||||||
|
private MultipartFile epubFile;
|
||||||
|
|
||||||
|
private UUID authorId;
|
||||||
|
|
||||||
|
private String authorName;
|
||||||
|
|
||||||
|
private UUID seriesId;
|
||||||
|
|
||||||
|
private String seriesName;
|
||||||
|
|
||||||
|
private Integer seriesVolume;
|
||||||
|
|
||||||
|
private List<String> tags;
|
||||||
|
|
||||||
|
private Boolean preserveReadingPosition = true;
|
||||||
|
|
||||||
|
private Boolean overwriteExisting = false;
|
||||||
|
|
||||||
|
private Boolean createMissingAuthor = true;
|
||||||
|
|
||||||
|
private Boolean createMissingSeries = true;
|
||||||
|
|
||||||
|
public EPUBImportRequest() {}
|
||||||
|
|
||||||
|
public MultipartFile getEpubFile() {
|
||||||
|
return epubFile;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setEpubFile(MultipartFile epubFile) {
|
||||||
|
this.epubFile = epubFile;
|
||||||
|
}
|
||||||
|
|
||||||
|
public UUID getAuthorId() {
|
||||||
|
return authorId;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setAuthorId(UUID authorId) {
|
||||||
|
this.authorId = authorId;
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getAuthorName() {
|
||||||
|
return authorName;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setAuthorName(String authorName) {
|
||||||
|
this.authorName = authorName;
|
||||||
|
}
|
||||||
|
|
||||||
|
public UUID getSeriesId() {
|
||||||
|
return seriesId;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setSeriesId(UUID seriesId) {
|
||||||
|
this.seriesId = seriesId;
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getSeriesName() {
|
||||||
|
return seriesName;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setSeriesName(String seriesName) {
|
||||||
|
this.seriesName = seriesName;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Integer getSeriesVolume() {
|
||||||
|
return seriesVolume;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setSeriesVolume(Integer seriesVolume) {
|
||||||
|
this.seriesVolume = seriesVolume;
|
||||||
|
}
|
||||||
|
|
||||||
|
public List<String> getTags() {
|
||||||
|
return tags;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setTags(List<String> tags) {
|
||||||
|
this.tags = tags;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Boolean getPreserveReadingPosition() {
|
||||||
|
return preserveReadingPosition;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setPreserveReadingPosition(Boolean preserveReadingPosition) {
|
||||||
|
this.preserveReadingPosition = preserveReadingPosition;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Boolean getOverwriteExisting() {
|
||||||
|
return overwriteExisting;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setOverwriteExisting(Boolean overwriteExisting) {
|
||||||
|
this.overwriteExisting = overwriteExisting;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Boolean getCreateMissingAuthor() {
|
||||||
|
return createMissingAuthor;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setCreateMissingAuthor(Boolean createMissingAuthor) {
|
||||||
|
this.createMissingAuthor = createMissingAuthor;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Boolean getCreateMissingSeries() {
|
||||||
|
return createMissingSeries;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setCreateMissingSeries(Boolean createMissingSeries) {
|
||||||
|
this.createMissingSeries = createMissingSeries;
|
||||||
|
}
|
||||||
|
}
|
||||||
107
backend/src/main/java/com/storycove/dto/EPUBImportResponse.java
Normal file
107
backend/src/main/java/com/storycove/dto/EPUBImportResponse.java
Normal file
@@ -0,0 +1,107 @@
|
|||||||
|
package com.storycove.dto;
|
||||||
|
|
||||||
|
import java.util.List;
|
||||||
|
import java.util.UUID;
|
||||||
|
|
||||||
|
public class EPUBImportResponse {
|
||||||
|
|
||||||
|
private boolean success;
|
||||||
|
private String message;
|
||||||
|
private UUID storyId;
|
||||||
|
private String storyTitle;
|
||||||
|
private Integer totalChapters;
|
||||||
|
private Integer wordCount;
|
||||||
|
private ReadingPositionDto readingPosition;
|
||||||
|
private List<String> warnings;
|
||||||
|
private List<String> errors;
|
||||||
|
|
||||||
|
public EPUBImportResponse() {}
|
||||||
|
|
||||||
|
public EPUBImportResponse(boolean success, String message) {
|
||||||
|
this.success = success;
|
||||||
|
this.message = message;
|
||||||
|
}
|
||||||
|
|
||||||
|
public static EPUBImportResponse success(UUID storyId, String storyTitle) {
|
||||||
|
EPUBImportResponse response = new EPUBImportResponse(true, "EPUB imported successfully");
|
||||||
|
response.setStoryId(storyId);
|
||||||
|
response.setStoryTitle(storyTitle);
|
||||||
|
return response;
|
||||||
|
}
|
||||||
|
|
||||||
|
public static EPUBImportResponse error(String message) {
|
||||||
|
return new EPUBImportResponse(false, message);
|
||||||
|
}
|
||||||
|
|
||||||
|
public boolean isSuccess() {
|
||||||
|
return success;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setSuccess(boolean success) {
|
||||||
|
this.success = success;
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getMessage() {
|
||||||
|
return message;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setMessage(String message) {
|
||||||
|
this.message = message;
|
||||||
|
}
|
||||||
|
|
||||||
|
public UUID getStoryId() {
|
||||||
|
return storyId;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setStoryId(UUID storyId) {
|
||||||
|
this.storyId = storyId;
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getStoryTitle() {
|
||||||
|
return storyTitle;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setStoryTitle(String storyTitle) {
|
||||||
|
this.storyTitle = storyTitle;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Integer getTotalChapters() {
|
||||||
|
return totalChapters;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setTotalChapters(Integer totalChapters) {
|
||||||
|
this.totalChapters = totalChapters;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Integer getWordCount() {
|
||||||
|
return wordCount;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setWordCount(Integer wordCount) {
|
||||||
|
this.wordCount = wordCount;
|
||||||
|
}
|
||||||
|
|
||||||
|
public ReadingPositionDto getReadingPosition() {
|
||||||
|
return readingPosition;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setReadingPosition(ReadingPositionDto readingPosition) {
|
||||||
|
this.readingPosition = readingPosition;
|
||||||
|
}
|
||||||
|
|
||||||
|
public List<String> getWarnings() {
|
||||||
|
return warnings;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setWarnings(List<String> warnings) {
|
||||||
|
this.warnings = warnings;
|
||||||
|
}
|
||||||
|
|
||||||
|
public List<String> getErrors() {
|
||||||
|
return errors;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setErrors(List<String> errors) {
|
||||||
|
this.errors = errors;
|
||||||
|
}
|
||||||
|
}
|
||||||
124
backend/src/main/java/com/storycove/dto/ReadingPositionDto.java
Normal file
124
backend/src/main/java/com/storycove/dto/ReadingPositionDto.java
Normal file
@@ -0,0 +1,124 @@
|
|||||||
|
package com.storycove.dto;
|
||||||
|
|
||||||
|
import java.time.LocalDateTime;
|
||||||
|
import java.util.UUID;
|
||||||
|
|
||||||
|
public class ReadingPositionDto {
|
||||||
|
|
||||||
|
private UUID id;
|
||||||
|
private UUID storyId;
|
||||||
|
private Integer chapterIndex;
|
||||||
|
private String chapterTitle;
|
||||||
|
private Integer wordPosition;
|
||||||
|
private Integer characterPosition;
|
||||||
|
private Double percentageComplete;
|
||||||
|
private String epubCfi;
|
||||||
|
private String contextBefore;
|
||||||
|
private String contextAfter;
|
||||||
|
private LocalDateTime createdAt;
|
||||||
|
private LocalDateTime updatedAt;
|
||||||
|
|
||||||
|
public ReadingPositionDto() {}
|
||||||
|
|
||||||
|
public ReadingPositionDto(UUID storyId, Integer chapterIndex, Integer wordPosition) {
|
||||||
|
this.storyId = storyId;
|
||||||
|
this.chapterIndex = chapterIndex;
|
||||||
|
this.wordPosition = wordPosition;
|
||||||
|
}
|
||||||
|
|
||||||
|
public UUID getId() {
|
||||||
|
return id;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setId(UUID id) {
|
||||||
|
this.id = id;
|
||||||
|
}
|
||||||
|
|
||||||
|
public UUID getStoryId() {
|
||||||
|
return storyId;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setStoryId(UUID storyId) {
|
||||||
|
this.storyId = storyId;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Integer getChapterIndex() {
|
||||||
|
return chapterIndex;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setChapterIndex(Integer chapterIndex) {
|
||||||
|
this.chapterIndex = chapterIndex;
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getChapterTitle() {
|
||||||
|
return chapterTitle;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setChapterTitle(String chapterTitle) {
|
||||||
|
this.chapterTitle = chapterTitle;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Integer getWordPosition() {
|
||||||
|
return wordPosition;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setWordPosition(Integer wordPosition) {
|
||||||
|
this.wordPosition = wordPosition;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Integer getCharacterPosition() {
|
||||||
|
return characterPosition;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setCharacterPosition(Integer characterPosition) {
|
||||||
|
this.characterPosition = characterPosition;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Double getPercentageComplete() {
|
||||||
|
return percentageComplete;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setPercentageComplete(Double percentageComplete) {
|
||||||
|
this.percentageComplete = percentageComplete;
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getEpubCfi() {
|
||||||
|
return epubCfi;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setEpubCfi(String epubCfi) {
|
||||||
|
this.epubCfi = epubCfi;
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getContextBefore() {
|
||||||
|
return contextBefore;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setContextBefore(String contextBefore) {
|
||||||
|
this.contextBefore = contextBefore;
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getContextAfter() {
|
||||||
|
return contextAfter;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setContextAfter(String contextAfter) {
|
||||||
|
this.contextAfter = contextAfter;
|
||||||
|
}
|
||||||
|
|
||||||
|
public LocalDateTime getCreatedAt() {
|
||||||
|
return createdAt;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setCreatedAt(LocalDateTime createdAt) {
|
||||||
|
this.createdAt = createdAt;
|
||||||
|
}
|
||||||
|
|
||||||
|
public LocalDateTime getUpdatedAt() {
|
||||||
|
return updatedAt;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setUpdatedAt(LocalDateTime updatedAt) {
|
||||||
|
this.updatedAt = updatedAt;
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -18,6 +18,7 @@ public class StorySearchDto {
|
|||||||
|
|
||||||
// Reading status
|
// Reading status
|
||||||
private Boolean isRead;
|
private Boolean isRead;
|
||||||
|
private LocalDateTime lastReadAt;
|
||||||
|
|
||||||
// Author info
|
// Author info
|
||||||
private UUID authorId;
|
private UUID authorId;
|
||||||
@@ -120,6 +121,14 @@ public class StorySearchDto {
|
|||||||
this.isRead = isRead;
|
this.isRead = isRead;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
public LocalDateTime getLastReadAt() {
|
||||||
|
return lastReadAt;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setLastReadAt(LocalDateTime lastReadAt) {
|
||||||
|
this.lastReadAt = lastReadAt;
|
||||||
|
}
|
||||||
|
|
||||||
public UUID getAuthorId() {
|
public UUID getAuthorId() {
|
||||||
return authorId;
|
return authorId;
|
||||||
}
|
}
|
||||||
|
|||||||
230
backend/src/main/java/com/storycove/entity/ReadingPosition.java
Normal file
230
backend/src/main/java/com/storycove/entity/ReadingPosition.java
Normal file
@@ -0,0 +1,230 @@
|
|||||||
|
package com.storycove.entity;
|
||||||
|
|
||||||
|
import jakarta.persistence.*;
|
||||||
|
import jakarta.validation.constraints.NotNull;
|
||||||
|
import org.hibernate.annotations.CreationTimestamp;
|
||||||
|
import org.hibernate.annotations.UpdateTimestamp;
|
||||||
|
import com.fasterxml.jackson.annotation.JsonBackReference;
|
||||||
|
|
||||||
|
import java.time.LocalDateTime;
|
||||||
|
import java.util.UUID;
|
||||||
|
|
||||||
|
@Entity
|
||||||
|
@Table(name = "reading_positions", indexes = {
|
||||||
|
@Index(name = "idx_reading_position_story", columnList = "story_id")
|
||||||
|
})
|
||||||
|
public class ReadingPosition {
|
||||||
|
|
||||||
|
@Id
|
||||||
|
@GeneratedValue(strategy = GenerationType.UUID)
|
||||||
|
private UUID id;
|
||||||
|
|
||||||
|
@NotNull
|
||||||
|
@ManyToOne(fetch = FetchType.LAZY)
|
||||||
|
@JoinColumn(name = "story_id", nullable = false)
|
||||||
|
@JsonBackReference("story-reading-positions")
|
||||||
|
private Story story;
|
||||||
|
|
||||||
|
@Column(name = "chapter_index")
|
||||||
|
private Integer chapterIndex;
|
||||||
|
|
||||||
|
@Column(name = "chapter_title")
|
||||||
|
private String chapterTitle;
|
||||||
|
|
||||||
|
@Column(name = "word_position")
|
||||||
|
private Integer wordPosition;
|
||||||
|
|
||||||
|
@Column(name = "character_position")
|
||||||
|
private Integer characterPosition;
|
||||||
|
|
||||||
|
@Column(name = "percentage_complete")
|
||||||
|
private Double percentageComplete;
|
||||||
|
|
||||||
|
@Column(name = "epub_cfi", columnDefinition = "TEXT")
|
||||||
|
private String epubCfi;
|
||||||
|
|
||||||
|
@Column(name = "context_before", length = 500)
|
||||||
|
private String contextBefore;
|
||||||
|
|
||||||
|
@Column(name = "context_after", length = 500)
|
||||||
|
private String contextAfter;
|
||||||
|
|
||||||
|
@CreationTimestamp
|
||||||
|
@Column(name = "created_at", nullable = false, updatable = false)
|
||||||
|
private LocalDateTime createdAt;
|
||||||
|
|
||||||
|
@UpdateTimestamp
|
||||||
|
@Column(name = "updated_at", nullable = false)
|
||||||
|
private LocalDateTime updatedAt;
|
||||||
|
|
||||||
|
public ReadingPosition() {}
|
||||||
|
|
||||||
|
public ReadingPosition(Story story) {
|
||||||
|
this.story = story;
|
||||||
|
this.chapterIndex = 0;
|
||||||
|
this.wordPosition = 0;
|
||||||
|
this.characterPosition = 0;
|
||||||
|
this.percentageComplete = 0.0;
|
||||||
|
}
|
||||||
|
|
||||||
|
public ReadingPosition(Story story, Integer chapterIndex, Integer wordPosition) {
|
||||||
|
this.story = story;
|
||||||
|
this.chapterIndex = chapterIndex;
|
||||||
|
this.wordPosition = wordPosition;
|
||||||
|
this.characterPosition = 0;
|
||||||
|
this.percentageComplete = 0.0;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void updatePosition(Integer chapterIndex, Integer wordPosition, Integer characterPosition) {
|
||||||
|
this.chapterIndex = chapterIndex;
|
||||||
|
this.wordPosition = wordPosition;
|
||||||
|
this.characterPosition = characterPosition;
|
||||||
|
calculatePercentageComplete();
|
||||||
|
}
|
||||||
|
|
||||||
|
public void updatePositionWithCfi(String epubCfi, Integer chapterIndex, Integer wordPosition) {
|
||||||
|
this.epubCfi = epubCfi;
|
||||||
|
this.chapterIndex = chapterIndex;
|
||||||
|
this.wordPosition = wordPosition;
|
||||||
|
calculatePercentageComplete();
|
||||||
|
}
|
||||||
|
|
||||||
|
private void calculatePercentageComplete() {
|
||||||
|
if (story != null && story.getWordCount() != null && story.getWordCount() > 0) {
|
||||||
|
int totalWords = story.getWordCount();
|
||||||
|
int currentPosition = (chapterIndex != null ? chapterIndex * 1000 : 0) +
|
||||||
|
(wordPosition != null ? wordPosition : 0);
|
||||||
|
this.percentageComplete = Math.min(100.0, (double) currentPosition / totalWords * 100);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
public boolean isAtBeginning() {
|
||||||
|
return (chapterIndex == null || chapterIndex == 0) &&
|
||||||
|
(wordPosition == null || wordPosition == 0);
|
||||||
|
}
|
||||||
|
|
||||||
|
public boolean isCompleted() {
|
||||||
|
return percentageComplete != null && percentageComplete >= 95.0;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Getters and Setters
|
||||||
|
public UUID getId() {
|
||||||
|
return id;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setId(UUID id) {
|
||||||
|
this.id = id;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Story getStory() {
|
||||||
|
return story;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setStory(Story story) {
|
||||||
|
this.story = story;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Integer getChapterIndex() {
|
||||||
|
return chapterIndex;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setChapterIndex(Integer chapterIndex) {
|
||||||
|
this.chapterIndex = chapterIndex;
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getChapterTitle() {
|
||||||
|
return chapterTitle;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setChapterTitle(String chapterTitle) {
|
||||||
|
this.chapterTitle = chapterTitle;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Integer getWordPosition() {
|
||||||
|
return wordPosition;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setWordPosition(Integer wordPosition) {
|
||||||
|
this.wordPosition = wordPosition;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Integer getCharacterPosition() {
|
||||||
|
return characterPosition;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setCharacterPosition(Integer characterPosition) {
|
||||||
|
this.characterPosition = characterPosition;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Double getPercentageComplete() {
|
||||||
|
return percentageComplete;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setPercentageComplete(Double percentageComplete) {
|
||||||
|
this.percentageComplete = percentageComplete;
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getEpubCfi() {
|
||||||
|
return epubCfi;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setEpubCfi(String epubCfi) {
|
||||||
|
this.epubCfi = epubCfi;
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getContextBefore() {
|
||||||
|
return contextBefore;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setContextBefore(String contextBefore) {
|
||||||
|
this.contextBefore = contextBefore;
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getContextAfter() {
|
||||||
|
return contextAfter;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setContextAfter(String contextAfter) {
|
||||||
|
this.contextAfter = contextAfter;
|
||||||
|
}
|
||||||
|
|
||||||
|
public LocalDateTime getCreatedAt() {
|
||||||
|
return createdAt;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setCreatedAt(LocalDateTime createdAt) {
|
||||||
|
this.createdAt = createdAt;
|
||||||
|
}
|
||||||
|
|
||||||
|
public LocalDateTime getUpdatedAt() {
|
||||||
|
return updatedAt;
|
||||||
|
}
|
||||||
|
|
||||||
|
public void setUpdatedAt(LocalDateTime updatedAt) {
|
||||||
|
this.updatedAt = updatedAt;
|
||||||
|
}
|
||||||
|
|
||||||
|
@Override
|
||||||
|
public boolean equals(Object o) {
|
||||||
|
if (this == o) return true;
|
||||||
|
if (!(o instanceof ReadingPosition)) return false;
|
||||||
|
ReadingPosition that = (ReadingPosition) o;
|
||||||
|
return id != null && id.equals(that.id);
|
||||||
|
}
|
||||||
|
|
||||||
|
@Override
|
||||||
|
public int hashCode() {
|
||||||
|
return getClass().hashCode();
|
||||||
|
}
|
||||||
|
|
||||||
|
@Override
|
||||||
|
public String toString() {
|
||||||
|
return "ReadingPosition{" +
|
||||||
|
"id=" + id +
|
||||||
|
", storyId=" + (story != null ? story.getId() : null) +
|
||||||
|
", chapterIndex=" + chapterIndex +
|
||||||
|
", wordPosition=" + wordPosition +
|
||||||
|
", percentageComplete=" + percentageComplete +
|
||||||
|
'}';
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,57 @@
|
|||||||
|
package com.storycove.repository;
|
||||||
|
|
||||||
|
import com.storycove.entity.ReadingPosition;
|
||||||
|
import com.storycove.entity.Story;
|
||||||
|
import org.springframework.data.jpa.repository.JpaRepository;
|
||||||
|
import org.springframework.data.jpa.repository.Query;
|
||||||
|
import org.springframework.data.repository.query.Param;
|
||||||
|
import org.springframework.stereotype.Repository;
|
||||||
|
|
||||||
|
import java.time.LocalDateTime;
|
||||||
|
import java.util.List;
|
||||||
|
import java.util.Optional;
|
||||||
|
import java.util.UUID;
|
||||||
|
|
||||||
|
@Repository
|
||||||
|
public interface ReadingPositionRepository extends JpaRepository<ReadingPosition, UUID> {
|
||||||
|
|
||||||
|
Optional<ReadingPosition> findByStoryId(UUID storyId);
|
||||||
|
|
||||||
|
Optional<ReadingPosition> findByStory(Story story);
|
||||||
|
|
||||||
|
List<ReadingPosition> findByStoryIdIn(List<UUID> storyIds);
|
||||||
|
|
||||||
|
@Query("SELECT rp FROM ReadingPosition rp WHERE rp.story.id = :storyId ORDER BY rp.updatedAt DESC")
|
||||||
|
List<ReadingPosition> findByStoryIdOrderByUpdatedAtDesc(@Param("storyId") UUID storyId);
|
||||||
|
|
||||||
|
@Query("SELECT rp FROM ReadingPosition rp WHERE rp.percentageComplete >= :minPercentage")
|
||||||
|
List<ReadingPosition> findByMinimumPercentageComplete(@Param("minPercentage") Double minPercentage);
|
||||||
|
|
||||||
|
@Query("SELECT rp FROM ReadingPosition rp WHERE rp.percentageComplete >= 95.0")
|
||||||
|
List<ReadingPosition> findCompletedReadings();
|
||||||
|
|
||||||
|
@Query("SELECT rp FROM ReadingPosition rp WHERE rp.percentageComplete > 0 AND rp.percentageComplete < 95.0")
|
||||||
|
List<ReadingPosition> findInProgressReadings();
|
||||||
|
|
||||||
|
@Query("SELECT rp FROM ReadingPosition rp WHERE rp.updatedAt >= :since ORDER BY rp.updatedAt DESC")
|
||||||
|
List<ReadingPosition> findRecentlyUpdated(@Param("since") LocalDateTime since);
|
||||||
|
|
||||||
|
@Query("SELECT rp FROM ReadingPosition rp ORDER BY rp.updatedAt DESC")
|
||||||
|
List<ReadingPosition> findAllOrderByUpdatedAtDesc();
|
||||||
|
|
||||||
|
@Query("SELECT COUNT(rp) FROM ReadingPosition rp WHERE rp.percentageComplete >= 95.0")
|
||||||
|
long countCompletedReadings();
|
||||||
|
|
||||||
|
@Query("SELECT COUNT(rp) FROM ReadingPosition rp WHERE rp.percentageComplete > 0 AND rp.percentageComplete < 95.0")
|
||||||
|
long countInProgressReadings();
|
||||||
|
|
||||||
|
@Query("SELECT AVG(rp.percentageComplete) FROM ReadingPosition rp WHERE rp.percentageComplete > 0")
|
||||||
|
Double findAverageReadingProgress();
|
||||||
|
|
||||||
|
@Query("SELECT rp FROM ReadingPosition rp WHERE rp.epubCfi IS NOT NULL")
|
||||||
|
List<ReadingPosition> findPositionsWithEpubCfi();
|
||||||
|
|
||||||
|
boolean existsByStoryId(UUID storyId);
|
||||||
|
|
||||||
|
void deleteByStoryId(UUID storyId);
|
||||||
|
}
|
||||||
@@ -0,0 +1,386 @@
|
|||||||
|
package com.storycove.service;
|
||||||
|
|
||||||
|
import com.storycove.dto.EPUBExportRequest;
|
||||||
|
import com.storycove.entity.ReadingPosition;
|
||||||
|
import com.storycove.entity.Story;
|
||||||
|
import com.storycove.repository.ReadingPositionRepository;
|
||||||
|
import com.storycove.service.exception.ResourceNotFoundException;
|
||||||
|
|
||||||
|
import nl.siegmann.epublib.domain.*;
|
||||||
|
import nl.siegmann.epublib.epub.EpubWriter;
|
||||||
|
|
||||||
|
import org.jsoup.Jsoup;
|
||||||
|
import org.jsoup.nodes.Document;
|
||||||
|
import org.jsoup.nodes.Element;
|
||||||
|
import org.jsoup.select.Elements;
|
||||||
|
import org.springframework.beans.factory.annotation.Autowired;
|
||||||
|
import org.springframework.core.io.ByteArrayResource;
|
||||||
|
import org.springframework.core.io.Resource;
|
||||||
|
import org.springframework.stereotype.Service;
|
||||||
|
import org.springframework.transaction.annotation.Transactional;
|
||||||
|
|
||||||
|
import java.io.ByteArrayOutputStream;
|
||||||
|
import java.io.FileInputStream;
|
||||||
|
import java.io.IOException;
|
||||||
|
import java.nio.file.Files;
|
||||||
|
import java.nio.file.Path;
|
||||||
|
import java.nio.file.Paths;
|
||||||
|
import java.time.LocalDateTime;
|
||||||
|
import java.time.format.DateTimeFormatter;
|
||||||
|
import java.util.ArrayList;
|
||||||
|
import java.util.List;
|
||||||
|
import java.util.Optional;
|
||||||
|
import java.util.UUID;
|
||||||
|
|
||||||
|
@Service
|
||||||
|
@Transactional
|
||||||
|
public class EPUBExportService {
|
||||||
|
|
||||||
|
private final StoryService storyService;
|
||||||
|
private final ReadingPositionRepository readingPositionRepository;
|
||||||
|
|
||||||
|
@Autowired
|
||||||
|
public EPUBExportService(StoryService storyService,
|
||||||
|
ReadingPositionRepository readingPositionRepository) {
|
||||||
|
this.storyService = storyService;
|
||||||
|
this.readingPositionRepository = readingPositionRepository;
|
||||||
|
}
|
||||||
|
|
||||||
|
public Resource exportStoryAsEPUB(EPUBExportRequest request) throws IOException {
|
||||||
|
Story story = storyService.findById(request.getStoryId());
|
||||||
|
|
||||||
|
Book book = createEPUBBook(story, request);
|
||||||
|
|
||||||
|
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
|
||||||
|
EpubWriter epubWriter = new EpubWriter();
|
||||||
|
epubWriter.write(book, outputStream);
|
||||||
|
|
||||||
|
return new ByteArrayResource(outputStream.toByteArray());
|
||||||
|
}
|
||||||
|
|
||||||
|
private Book createEPUBBook(Story story, EPUBExportRequest request) throws IOException {
|
||||||
|
Book book = new Book();
|
||||||
|
|
||||||
|
setupMetadata(book, story, request);
|
||||||
|
|
||||||
|
addCoverImage(book, story, request);
|
||||||
|
|
||||||
|
addContent(book, story, request);
|
||||||
|
|
||||||
|
addReadingPosition(book, story, request);
|
||||||
|
|
||||||
|
return book;
|
||||||
|
}
|
||||||
|
|
||||||
|
private void setupMetadata(Book book, Story story, EPUBExportRequest request) {
|
||||||
|
Metadata metadata = book.getMetadata();
|
||||||
|
|
||||||
|
String title = request.getCustomTitle() != null ?
|
||||||
|
request.getCustomTitle() : story.getTitle();
|
||||||
|
metadata.addTitle(title);
|
||||||
|
|
||||||
|
String authorName = request.getCustomAuthor() != null ?
|
||||||
|
request.getCustomAuthor() :
|
||||||
|
(story.getAuthor() != null ? story.getAuthor().getName() : "Unknown Author");
|
||||||
|
metadata.addAuthor(new Author(authorName));
|
||||||
|
|
||||||
|
metadata.setLanguage(request.getLanguage() != null ? request.getLanguage() : "en");
|
||||||
|
|
||||||
|
metadata.addIdentifier(new Identifier("storycove", story.getId().toString()));
|
||||||
|
|
||||||
|
if (story.getDescription() != null) {
|
||||||
|
metadata.addDescription(story.getDescription());
|
||||||
|
}
|
||||||
|
|
||||||
|
if (request.getIncludeMetadata()) {
|
||||||
|
metadata.addDate(new Date(java.util.Date.from(
|
||||||
|
story.getCreatedAt().atZone(java.time.ZoneId.systemDefault()).toInstant()
|
||||||
|
), Date.Event.CREATION));
|
||||||
|
|
||||||
|
if (story.getSeries() != null) {
|
||||||
|
// Add series and metadata info to description instead of using addMeta
|
||||||
|
StringBuilder description = new StringBuilder();
|
||||||
|
if (story.getDescription() != null) {
|
||||||
|
description.append(story.getDescription()).append("\n\n");
|
||||||
|
}
|
||||||
|
|
||||||
|
description.append("Series: ").append(story.getSeries().getName());
|
||||||
|
if (story.getVolume() != null) {
|
||||||
|
description.append(" (Volume ").append(story.getVolume()).append(")");
|
||||||
|
}
|
||||||
|
description.append("\n");
|
||||||
|
|
||||||
|
if (story.getWordCount() != null) {
|
||||||
|
description.append("Word Count: ").append(story.getWordCount()).append("\n");
|
||||||
|
}
|
||||||
|
|
||||||
|
if (story.getRating() != null) {
|
||||||
|
description.append("Rating: ").append(story.getRating()).append("/5\n");
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!story.getTags().isEmpty()) {
|
||||||
|
String tags = story.getTags().stream()
|
||||||
|
.map(tag -> tag.getName())
|
||||||
|
.reduce((a, b) -> a + ", " + b)
|
||||||
|
.orElse("");
|
||||||
|
description.append("Tags: ").append(tags).append("\n");
|
||||||
|
}
|
||||||
|
|
||||||
|
description.append("\nGenerated by StoryCove on ")
|
||||||
|
.append(LocalDateTime.now().format(DateTimeFormatter.ISO_LOCAL_DATE_TIME));
|
||||||
|
|
||||||
|
metadata.addDescription(description.toString());
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if (request.getCustomMetadata() != null && !request.getCustomMetadata().isEmpty()) {
|
||||||
|
// Add custom metadata to description since addMeta doesn't exist
|
||||||
|
StringBuilder customDesc = new StringBuilder();
|
||||||
|
for (String customMeta : request.getCustomMetadata()) {
|
||||||
|
String[] parts = customMeta.split(":", 2);
|
||||||
|
if (parts.length == 2) {
|
||||||
|
customDesc.append(parts[0].trim()).append(": ").append(parts[1].trim()).append("\n");
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (customDesc.length() > 0) {
|
||||||
|
String existingDesc = metadata.getDescriptions().isEmpty() ? "" : metadata.getDescriptions().get(0);
|
||||||
|
metadata.addDescription(existingDesc + "\n" + customDesc.toString());
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private void addCoverImage(Book book, Story story, EPUBExportRequest request) {
|
||||||
|
if (!request.getIncludeCoverImage() || story.getCoverPath() == null) {
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
try {
|
||||||
|
Path coverPath = Paths.get(story.getCoverPath());
|
||||||
|
if (Files.exists(coverPath)) {
|
||||||
|
byte[] coverImageData = Files.readAllBytes(coverPath);
|
||||||
|
String mimeType = Files.probeContentType(coverPath);
|
||||||
|
if (mimeType == null) {
|
||||||
|
mimeType = "image/jpeg";
|
||||||
|
}
|
||||||
|
|
||||||
|
nl.siegmann.epublib.domain.Resource coverResource =
|
||||||
|
new nl.siegmann.epublib.domain.Resource(coverImageData, "cover.jpg");
|
||||||
|
|
||||||
|
book.setCoverImage(coverResource);
|
||||||
|
}
|
||||||
|
} catch (IOException e) {
|
||||||
|
// Skip cover image on error
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private void addContent(Book book, Story story, EPUBExportRequest request) {
|
||||||
|
String content = story.getContentHtml();
|
||||||
|
if (content == null) {
|
||||||
|
content = story.getContentPlain() != null ?
|
||||||
|
"<p>" + story.getContentPlain().replace("\n", "</p><p>") + "</p>" :
|
||||||
|
"<p>No content available</p>";
|
||||||
|
}
|
||||||
|
|
||||||
|
if (request.getSplitByChapters()) {
|
||||||
|
addChapterizedContent(book, content, request);
|
||||||
|
} else {
|
||||||
|
addSingleChapterContent(book, content, story);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private void addSingleChapterContent(Book book, String content, Story story) {
|
||||||
|
String html = createChapterHTML(story.getTitle(), content);
|
||||||
|
|
||||||
|
nl.siegmann.epublib.domain.Resource chapterResource =
|
||||||
|
new nl.siegmann.epublib.domain.Resource(html.getBytes(), "chapter.html");
|
||||||
|
|
||||||
|
book.addSection(story.getTitle(), chapterResource);
|
||||||
|
}
|
||||||
|
|
||||||
|
private void addChapterizedContent(Book book, String content, EPUBExportRequest request) {
|
||||||
|
Document doc = Jsoup.parse(content);
|
||||||
|
Elements chapters = doc.select("div.chapter, h1, h2, h3");
|
||||||
|
|
||||||
|
if (chapters.isEmpty()) {
|
||||||
|
List<String> paragraphs = splitByWords(content,
|
||||||
|
request.getMaxWordsPerChapter() != null ? request.getMaxWordsPerChapter() : 2000);
|
||||||
|
|
||||||
|
for (int i = 0; i < paragraphs.size(); i++) {
|
||||||
|
String chapterTitle = "Chapter " + (i + 1);
|
||||||
|
String html = createChapterHTML(chapterTitle, paragraphs.get(i));
|
||||||
|
|
||||||
|
nl.siegmann.epublib.domain.Resource chapterResource =
|
||||||
|
new nl.siegmann.epublib.domain.Resource(html.getBytes(), "chapter" + (i + 1) + ".html");
|
||||||
|
|
||||||
|
book.addSection(chapterTitle, chapterResource);
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
for (int i = 0; i < chapters.size(); i++) {
|
||||||
|
Element chapter = chapters.get(i);
|
||||||
|
String chapterTitle = chapter.text();
|
||||||
|
if (chapterTitle.trim().isEmpty()) {
|
||||||
|
chapterTitle = "Chapter " + (i + 1);
|
||||||
|
}
|
||||||
|
|
||||||
|
String chapterContent = chapter.html();
|
||||||
|
String html = createChapterHTML(chapterTitle, chapterContent);
|
||||||
|
|
||||||
|
nl.siegmann.epublib.domain.Resource chapterResource =
|
||||||
|
new nl.siegmann.epublib.domain.Resource(html.getBytes(), "chapter" + (i + 1) + ".html");
|
||||||
|
|
||||||
|
book.addSection(chapterTitle, chapterResource);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private List<String> splitByWords(String content, int maxWordsPerChapter) {
|
||||||
|
String[] words = content.split("\\s+");
|
||||||
|
List<String> chapters = new ArrayList<>();
|
||||||
|
StringBuilder currentChapter = new StringBuilder();
|
||||||
|
int wordCount = 0;
|
||||||
|
|
||||||
|
for (String word : words) {
|
||||||
|
currentChapter.append(word).append(" ");
|
||||||
|
wordCount++;
|
||||||
|
|
||||||
|
if (wordCount >= maxWordsPerChapter) {
|
||||||
|
chapters.add(currentChapter.toString().trim());
|
||||||
|
currentChapter = new StringBuilder();
|
||||||
|
wordCount = 0;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if (currentChapter.length() > 0) {
|
||||||
|
chapters.add(currentChapter.toString().trim());
|
||||||
|
}
|
||||||
|
|
||||||
|
return chapters;
|
||||||
|
}
|
||||||
|
|
||||||
|
private String createChapterHTML(String title, String content) {
|
||||||
|
return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>" +
|
||||||
|
"<!DOCTYPE html PUBLIC \"-//W3C//DTD XHTML 1.1//EN\" " +
|
||||||
|
"\"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd\">" +
|
||||||
|
"<html xmlns=\"http://www.w3.org/1999/xhtml\">" +
|
||||||
|
"<head>" +
|
||||||
|
"<title>" + escapeHtml(title) + "</title>" +
|
||||||
|
"<style type=\"text/css\">" +
|
||||||
|
"body { font-family: serif; margin: 1em; }" +
|
||||||
|
"h1 { text-align: center; }" +
|
||||||
|
"p { text-indent: 1em; margin: 0.5em 0; }" +
|
||||||
|
"</style>" +
|
||||||
|
"</head>" +
|
||||||
|
"<body>" +
|
||||||
|
"<h1>" + escapeHtml(title) + "</h1>" +
|
||||||
|
fixHtmlForXhtml(content) +
|
||||||
|
"</body>" +
|
||||||
|
"</html>";
|
||||||
|
}
|
||||||
|
|
||||||
|
private void addReadingPosition(Book book, Story story, EPUBExportRequest request) {
|
||||||
|
if (!request.getIncludeReadingPosition()) {
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
Optional<ReadingPosition> positionOpt = readingPositionRepository.findByStoryId(story.getId());
|
||||||
|
if (positionOpt.isPresent()) {
|
||||||
|
ReadingPosition position = positionOpt.get();
|
||||||
|
Metadata metadata = book.getMetadata();
|
||||||
|
|
||||||
|
// Add reading position to description since addMeta doesn't exist
|
||||||
|
StringBuilder positionDesc = new StringBuilder();
|
||||||
|
if (position.getEpubCfi() != null) {
|
||||||
|
positionDesc.append("EPUB CFI: ").append(position.getEpubCfi()).append("\n");
|
||||||
|
}
|
||||||
|
|
||||||
|
if (position.getChapterIndex() != null && position.getWordPosition() != null) {
|
||||||
|
positionDesc.append("Reading Position: Chapter ")
|
||||||
|
.append(position.getChapterIndex())
|
||||||
|
.append(", Word ").append(position.getWordPosition()).append("\n");
|
||||||
|
}
|
||||||
|
|
||||||
|
if (position.getPercentageComplete() != null) {
|
||||||
|
positionDesc.append("Reading Progress: ")
|
||||||
|
.append(String.format("%.1f%%", position.getPercentageComplete())).append("\n");
|
||||||
|
}
|
||||||
|
|
||||||
|
positionDesc.append("Last Read: ")
|
||||||
|
.append(position.getUpdatedAt().format(DateTimeFormatter.ISO_LOCAL_DATE_TIME));
|
||||||
|
|
||||||
|
String existingDesc = metadata.getDescriptions().isEmpty() ? "" : metadata.getDescriptions().get(0);
|
||||||
|
metadata.addDescription(existingDesc + "\n\n--- Reading Position ---\n" + positionDesc.toString());
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private String fixHtmlForXhtml(String html) {
|
||||||
|
if (html == null) return "";
|
||||||
|
|
||||||
|
// Fix common XHTML validation issues
|
||||||
|
String fixed = html
|
||||||
|
// Fix self-closing tags to be XHTML compliant
|
||||||
|
.replaceAll("<br>", "<br />")
|
||||||
|
.replaceAll("<hr>", "<hr />")
|
||||||
|
.replaceAll("<img([^>]*)>", "<img$1 />")
|
||||||
|
.replaceAll("<input([^>]*)>", "<input$1 />")
|
||||||
|
.replaceAll("<area([^>]*)>", "<area$1 />")
|
||||||
|
.replaceAll("<base([^>]*)>", "<base$1 />")
|
||||||
|
.replaceAll("<col([^>]*)>", "<col$1 />")
|
||||||
|
.replaceAll("<embed([^>]*)>", "<embed$1 />")
|
||||||
|
.replaceAll("<link([^>]*)>", "<link$1 />")
|
||||||
|
.replaceAll("<meta([^>]*)>", "<meta$1 />")
|
||||||
|
.replaceAll("<param([^>]*)>", "<param$1 />")
|
||||||
|
.replaceAll("<source([^>]*)>", "<source$1 />")
|
||||||
|
.replaceAll("<track([^>]*)>", "<track$1 />")
|
||||||
|
.replaceAll("<wbr([^>]*)>", "<wbr$1 />");
|
||||||
|
|
||||||
|
return fixed;
|
||||||
|
}
|
||||||
|
|
||||||
|
private String escapeHtml(String text) {
|
||||||
|
if (text == null) return "";
|
||||||
|
return text.replace("&", "&")
|
||||||
|
.replace("<", "<")
|
||||||
|
.replace(">", ">")
|
||||||
|
.replace("\"", """)
|
||||||
|
.replace("'", "'");
|
||||||
|
}
|
||||||
|
|
||||||
|
public String getEPUBFilename(Story story) {
|
||||||
|
StringBuilder filename = new StringBuilder();
|
||||||
|
|
||||||
|
if (story.getAuthor() != null) {
|
||||||
|
filename.append(sanitizeFilename(story.getAuthor().getName()))
|
||||||
|
.append(" - ");
|
||||||
|
}
|
||||||
|
|
||||||
|
filename.append(sanitizeFilename(story.getTitle()));
|
||||||
|
|
||||||
|
if (story.getSeries() != null && story.getVolume() != null) {
|
||||||
|
filename.append(" (")
|
||||||
|
.append(sanitizeFilename(story.getSeries().getName()))
|
||||||
|
.append(" ")
|
||||||
|
.append(story.getVolume())
|
||||||
|
.append(")");
|
||||||
|
}
|
||||||
|
|
||||||
|
filename.append(".epub");
|
||||||
|
|
||||||
|
return filename.toString();
|
||||||
|
}
|
||||||
|
|
||||||
|
private String sanitizeFilename(String filename) {
|
||||||
|
if (filename == null) return "unknown";
|
||||||
|
return filename.replaceAll("[^a-zA-Z0-9._\\- ]", "")
|
||||||
|
.trim()
|
||||||
|
.replaceAll("\\s+", "_");
|
||||||
|
}
|
||||||
|
|
||||||
|
public boolean canExportStory(UUID storyId) {
|
||||||
|
try {
|
||||||
|
Story story = storyService.findById(storyId);
|
||||||
|
return story.getContentHtml() != null || story.getContentPlain() != null;
|
||||||
|
} catch (ResourceNotFoundException e) {
|
||||||
|
return false;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,327 @@
|
|||||||
|
package com.storycove.service;
|
||||||
|
|
||||||
|
import com.storycove.dto.EPUBImportRequest;
|
||||||
|
import com.storycove.dto.EPUBImportResponse;
|
||||||
|
import com.storycove.dto.ReadingPositionDto;
|
||||||
|
import com.storycove.entity.*;
|
||||||
|
import com.storycove.repository.ReadingPositionRepository;
|
||||||
|
import com.storycove.service.exception.InvalidFileException;
|
||||||
|
import com.storycove.service.exception.ResourceNotFoundException;
|
||||||
|
|
||||||
|
import nl.siegmann.epublib.domain.Book;
|
||||||
|
import nl.siegmann.epublib.domain.Metadata;
|
||||||
|
import nl.siegmann.epublib.domain.Resource;
|
||||||
|
import nl.siegmann.epublib.domain.SpineReference;
|
||||||
|
import nl.siegmann.epublib.epub.EpubReader;
|
||||||
|
|
||||||
|
import org.jsoup.Jsoup;
|
||||||
|
import org.jsoup.nodes.Document;
|
||||||
|
import org.springframework.beans.factory.annotation.Autowired;
|
||||||
|
import org.springframework.stereotype.Service;
|
||||||
|
import org.springframework.transaction.annotation.Transactional;
|
||||||
|
import org.springframework.web.multipart.MultipartFile;
|
||||||
|
|
||||||
|
import java.io.IOException;
|
||||||
|
import java.io.InputStream;
|
||||||
|
import java.util.ArrayList;
|
||||||
|
import java.util.List;
|
||||||
|
import java.util.Optional;
|
||||||
|
import java.util.UUID;
|
||||||
|
import java.util.stream.Collectors;
|
||||||
|
|
||||||
|
@Service
|
||||||
|
@Transactional
|
||||||
|
public class EPUBImportService {
|
||||||
|
|
||||||
|
private final StoryService storyService;
|
||||||
|
private final AuthorService authorService;
|
||||||
|
private final SeriesService seriesService;
|
||||||
|
private final TagService tagService;
|
||||||
|
private final ReadingPositionRepository readingPositionRepository;
|
||||||
|
private final HtmlSanitizationService sanitizationService;
|
||||||
|
|
||||||
|
@Autowired
|
||||||
|
public EPUBImportService(StoryService storyService,
|
||||||
|
AuthorService authorService,
|
||||||
|
SeriesService seriesService,
|
||||||
|
TagService tagService,
|
||||||
|
ReadingPositionRepository readingPositionRepository,
|
||||||
|
HtmlSanitizationService sanitizationService) {
|
||||||
|
this.storyService = storyService;
|
||||||
|
this.authorService = authorService;
|
||||||
|
this.seriesService = seriesService;
|
||||||
|
this.tagService = tagService;
|
||||||
|
this.readingPositionRepository = readingPositionRepository;
|
||||||
|
this.sanitizationService = sanitizationService;
|
||||||
|
}
|
||||||
|
|
||||||
|
public EPUBImportResponse importEPUB(EPUBImportRequest request) {
|
||||||
|
try {
|
||||||
|
MultipartFile epubFile = request.getEpubFile();
|
||||||
|
|
||||||
|
if (epubFile == null || epubFile.isEmpty()) {
|
||||||
|
return EPUBImportResponse.error("EPUB file is required");
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!isValidEPUBFile(epubFile)) {
|
||||||
|
return EPUBImportResponse.error("Invalid EPUB file format");
|
||||||
|
}
|
||||||
|
|
||||||
|
Book book = parseEPUBFile(epubFile);
|
||||||
|
|
||||||
|
Story story = createStoryFromEPUB(book, request);
|
||||||
|
|
||||||
|
Story savedStory = storyService.create(story);
|
||||||
|
|
||||||
|
EPUBImportResponse response = EPUBImportResponse.success(savedStory.getId(), savedStory.getTitle());
|
||||||
|
response.setWordCount(savedStory.getWordCount());
|
||||||
|
response.setTotalChapters(book.getSpine().size());
|
||||||
|
|
||||||
|
if (request.getPreserveReadingPosition() != null && request.getPreserveReadingPosition()) {
|
||||||
|
ReadingPosition readingPosition = extractReadingPosition(book, savedStory);
|
||||||
|
if (readingPosition != null) {
|
||||||
|
ReadingPosition savedPosition = readingPositionRepository.save(readingPosition);
|
||||||
|
response.setReadingPosition(convertToDto(savedPosition));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return response;
|
||||||
|
|
||||||
|
} catch (Exception e) {
|
||||||
|
return EPUBImportResponse.error("Failed to import EPUB: " + e.getMessage());
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private boolean isValidEPUBFile(MultipartFile file) {
|
||||||
|
String filename = file.getOriginalFilename();
|
||||||
|
if (filename == null || !filename.toLowerCase().endsWith(".epub")) {
|
||||||
|
return false;
|
||||||
|
}
|
||||||
|
|
||||||
|
String contentType = file.getContentType();
|
||||||
|
return "application/epub+zip".equals(contentType) ||
|
||||||
|
"application/zip".equals(contentType) ||
|
||||||
|
contentType == null;
|
||||||
|
}
|
||||||
|
|
||||||
|
private Book parseEPUBFile(MultipartFile epubFile) throws IOException {
|
||||||
|
try (InputStream inputStream = epubFile.getInputStream()) {
|
||||||
|
EpubReader epubReader = new EpubReader();
|
||||||
|
return epubReader.readEpub(inputStream);
|
||||||
|
} catch (Exception e) {
|
||||||
|
throw new InvalidFileException("Failed to parse EPUB file: " + e.getMessage());
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private Story createStoryFromEPUB(Book book, EPUBImportRequest request) {
|
||||||
|
Metadata metadata = book.getMetadata();
|
||||||
|
|
||||||
|
String title = extractTitle(metadata);
|
||||||
|
String authorName = extractAuthorName(metadata, request);
|
||||||
|
String description = extractDescription(metadata);
|
||||||
|
String content = extractContent(book);
|
||||||
|
|
||||||
|
Story story = new Story();
|
||||||
|
story.setTitle(title);
|
||||||
|
story.setDescription(description);
|
||||||
|
story.setContentHtml(sanitizationService.sanitize(content));
|
||||||
|
|
||||||
|
if (request.getAuthorId() != null) {
|
||||||
|
try {
|
||||||
|
Author author = authorService.findById(request.getAuthorId());
|
||||||
|
story.setAuthor(author);
|
||||||
|
} catch (ResourceNotFoundException e) {
|
||||||
|
if (request.getCreateMissingAuthor()) {
|
||||||
|
Author newAuthor = createAuthor(authorName);
|
||||||
|
story.setAuthor(newAuthor);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} else if (authorName != null && request.getCreateMissingAuthor()) {
|
||||||
|
Author author = findOrCreateAuthor(authorName);
|
||||||
|
story.setAuthor(author);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (request.getSeriesId() != null && request.getSeriesVolume() != null) {
|
||||||
|
try {
|
||||||
|
Series series = seriesService.findById(request.getSeriesId());
|
||||||
|
story.setSeries(series);
|
||||||
|
story.setVolume(request.getSeriesVolume());
|
||||||
|
} catch (ResourceNotFoundException e) {
|
||||||
|
if (request.getCreateMissingSeries() && request.getSeriesName() != null) {
|
||||||
|
Series newSeries = createSeries(request.getSeriesName());
|
||||||
|
story.setSeries(newSeries);
|
||||||
|
story.setVolume(request.getSeriesVolume());
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if (request.getTags() != null && !request.getTags().isEmpty()) {
|
||||||
|
for (String tagName : request.getTags()) {
|
||||||
|
Tag tag = tagService.findOrCreate(tagName);
|
||||||
|
story.addTag(tag);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return story;
|
||||||
|
}
|
||||||
|
|
||||||
|
private String extractTitle(Metadata metadata) {
|
||||||
|
List<String> titles = metadata.getTitles();
|
||||||
|
if (titles != null && !titles.isEmpty()) {
|
||||||
|
return titles.get(0);
|
||||||
|
}
|
||||||
|
return "Untitled EPUB";
|
||||||
|
}
|
||||||
|
|
||||||
|
private String extractAuthorName(Metadata metadata, EPUBImportRequest request) {
|
||||||
|
if (request.getAuthorName() != null && !request.getAuthorName().trim().isEmpty()) {
|
||||||
|
return request.getAuthorName().trim();
|
||||||
|
}
|
||||||
|
|
||||||
|
if (metadata.getAuthors() != null && !metadata.getAuthors().isEmpty()) {
|
||||||
|
return metadata.getAuthors().get(0).getFirstname() + " " + metadata.getAuthors().get(0).getLastname();
|
||||||
|
}
|
||||||
|
|
||||||
|
return "Unknown Author";
|
||||||
|
}
|
||||||
|
|
||||||
|
private String extractDescription(Metadata metadata) {
|
||||||
|
List<String> descriptions = metadata.getDescriptions();
|
||||||
|
if (descriptions != null && !descriptions.isEmpty()) {
|
||||||
|
return descriptions.get(0);
|
||||||
|
}
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
private String extractContent(Book book) {
|
||||||
|
StringBuilder contentBuilder = new StringBuilder();
|
||||||
|
|
||||||
|
List<SpineReference> spine = book.getSpine().getSpineReferences();
|
||||||
|
for (SpineReference spineRef : spine) {
|
||||||
|
try {
|
||||||
|
Resource resource = spineRef.getResource();
|
||||||
|
if (resource != null && resource.getData() != null) {
|
||||||
|
String html = new String(resource.getData(), "UTF-8");
|
||||||
|
|
||||||
|
Document doc = Jsoup.parse(html);
|
||||||
|
doc.select("script, style").remove();
|
||||||
|
|
||||||
|
String chapterContent = doc.body() != null ? doc.body().html() : doc.html();
|
||||||
|
|
||||||
|
contentBuilder.append("<div class=\"chapter\">")
|
||||||
|
.append(chapterContent)
|
||||||
|
.append("</div>");
|
||||||
|
}
|
||||||
|
} catch (Exception e) {
|
||||||
|
// Skip this chapter on error
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return contentBuilder.toString();
|
||||||
|
}
|
||||||
|
|
||||||
|
private Author findOrCreateAuthor(String authorName) {
|
||||||
|
Optional<Author> existingAuthor = authorService.findByNameOptional(authorName);
|
||||||
|
if (existingAuthor.isPresent()) {
|
||||||
|
return existingAuthor.get();
|
||||||
|
}
|
||||||
|
return createAuthor(authorName);
|
||||||
|
}
|
||||||
|
|
||||||
|
private Author createAuthor(String authorName) {
|
||||||
|
Author author = new Author();
|
||||||
|
author.setName(authorName);
|
||||||
|
return authorService.create(author);
|
||||||
|
}
|
||||||
|
|
||||||
|
private Series createSeries(String seriesName) {
|
||||||
|
Series series = new Series();
|
||||||
|
series.setName(seriesName);
|
||||||
|
return seriesService.create(series);
|
||||||
|
}
|
||||||
|
|
||||||
|
private ReadingPosition extractReadingPosition(Book book, Story story) {
|
||||||
|
try {
|
||||||
|
Metadata metadata = book.getMetadata();
|
||||||
|
|
||||||
|
String positionMeta = metadata.getMetaAttribute("reading-position");
|
||||||
|
String cfiMeta = metadata.getMetaAttribute("epub-cfi");
|
||||||
|
|
||||||
|
ReadingPosition position = new ReadingPosition(story);
|
||||||
|
|
||||||
|
if (cfiMeta != null) {
|
||||||
|
position.setEpubCfi(cfiMeta);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (positionMeta != null) {
|
||||||
|
try {
|
||||||
|
String[] parts = positionMeta.split(":");
|
||||||
|
if (parts.length >= 2) {
|
||||||
|
position.setChapterIndex(Integer.parseInt(parts[0]));
|
||||||
|
position.setWordPosition(Integer.parseInt(parts[1]));
|
||||||
|
}
|
||||||
|
} catch (NumberFormatException e) {
|
||||||
|
// Ignore invalid position format
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return position;
|
||||||
|
|
||||||
|
} catch (Exception e) {
|
||||||
|
// Return null if no reading position found
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private ReadingPositionDto convertToDto(ReadingPosition position) {
|
||||||
|
if (position == null) return null;
|
||||||
|
|
||||||
|
ReadingPositionDto dto = new ReadingPositionDto();
|
||||||
|
dto.setId(position.getId());
|
||||||
|
dto.setStoryId(position.getStory().getId());
|
||||||
|
dto.setChapterIndex(position.getChapterIndex());
|
||||||
|
dto.setChapterTitle(position.getChapterTitle());
|
||||||
|
dto.setWordPosition(position.getWordPosition());
|
||||||
|
dto.setCharacterPosition(position.getCharacterPosition());
|
||||||
|
dto.setPercentageComplete(position.getPercentageComplete());
|
||||||
|
dto.setEpubCfi(position.getEpubCfi());
|
||||||
|
dto.setContextBefore(position.getContextBefore());
|
||||||
|
dto.setContextAfter(position.getContextAfter());
|
||||||
|
dto.setCreatedAt(position.getCreatedAt());
|
||||||
|
dto.setUpdatedAt(position.getUpdatedAt());
|
||||||
|
|
||||||
|
return dto;
|
||||||
|
}
|
||||||
|
|
||||||
|
public List<String> validateEPUBFile(MultipartFile file) {
|
||||||
|
List<String> errors = new ArrayList<>();
|
||||||
|
|
||||||
|
if (file == null || file.isEmpty()) {
|
||||||
|
errors.add("EPUB file is required");
|
||||||
|
return errors;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!isValidEPUBFile(file)) {
|
||||||
|
errors.add("Invalid EPUB file format. Only .epub files are supported");
|
||||||
|
}
|
||||||
|
|
||||||
|
if (file.getSize() > 100 * 1024 * 1024) { // 100MB limit
|
||||||
|
errors.add("EPUB file size exceeds 100MB limit");
|
||||||
|
}
|
||||||
|
|
||||||
|
try {
|
||||||
|
Book book = parseEPUBFile(file);
|
||||||
|
if (book.getMetadata() == null) {
|
||||||
|
errors.add("EPUB file contains no metadata");
|
||||||
|
}
|
||||||
|
if (book.getSpine() == null || book.getSpine().isEmpty()) {
|
||||||
|
errors.add("EPUB file contains no readable content");
|
||||||
|
}
|
||||||
|
} catch (Exception e) {
|
||||||
|
errors.add("Failed to parse EPUB file: " + e.getMessage());
|
||||||
|
}
|
||||||
|
|
||||||
|
return errors;
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -20,11 +20,11 @@ import java.util.UUID;
|
|||||||
public class ImageService {
|
public class ImageService {
|
||||||
|
|
||||||
private static final Set<String> ALLOWED_CONTENT_TYPES = Set.of(
|
private static final Set<String> ALLOWED_CONTENT_TYPES = Set.of(
|
||||||
"image/jpeg", "image/jpg", "image/png", "image/webp"
|
"image/jpeg", "image/jpg", "image/png"
|
||||||
);
|
);
|
||||||
|
|
||||||
private static final Set<String> ALLOWED_EXTENSIONS = Set.of(
|
private static final Set<String> ALLOWED_EXTENSIONS = Set.of(
|
||||||
"jpg", "jpeg", "png", "webp"
|
"jpg", "jpeg", "png"
|
||||||
);
|
);
|
||||||
|
|
||||||
@Value("${storycove.images.upload-dir:/app/images}")
|
@Value("${storycove.images.upload-dir:/app/images}")
|
||||||
|
|||||||
@@ -82,6 +82,7 @@ public class TypesenseService {
|
|||||||
new Field().name("wordCount").type("int32").facet(true).sort(true).optional(true),
|
new Field().name("wordCount").type("int32").facet(true).sort(true).optional(true),
|
||||||
new Field().name("volume").type("int32").facet(true).sort(true).optional(true),
|
new Field().name("volume").type("int32").facet(true).sort(true).optional(true),
|
||||||
new Field().name("createdAt").type("int64").facet(false).sort(true),
|
new Field().name("createdAt").type("int64").facet(false).sort(true),
|
||||||
|
new Field().name("lastReadAt").type("int64").facet(false).sort(true).optional(true),
|
||||||
new Field().name("sourceUrl").type("string").facet(false).optional(true),
|
new Field().name("sourceUrl").type("string").facet(false).optional(true),
|
||||||
new Field().name("coverPath").type("string").facet(false).optional(true)
|
new Field().name("coverPath").type("string").facet(false).optional(true)
|
||||||
);
|
);
|
||||||
@@ -392,6 +393,10 @@ public class TypesenseService {
|
|||||||
story.getCreatedAt().toEpochSecond(java.time.ZoneOffset.UTC) :
|
story.getCreatedAt().toEpochSecond(java.time.ZoneOffset.UTC) :
|
||||||
java.time.LocalDateTime.now().toEpochSecond(java.time.ZoneOffset.UTC));
|
java.time.LocalDateTime.now().toEpochSecond(java.time.ZoneOffset.UTC));
|
||||||
|
|
||||||
|
if (story.getLastReadAt() != null) {
|
||||||
|
document.put("lastReadAt", story.getLastReadAt().toEpochSecond(java.time.ZoneOffset.UTC));
|
||||||
|
}
|
||||||
|
|
||||||
if (story.getSourceUrl() != null) {
|
if (story.getSourceUrl() != null) {
|
||||||
document.put("sourceUrl", story.getSourceUrl());
|
document.put("sourceUrl", story.getSourceUrl());
|
||||||
}
|
}
|
||||||
@@ -517,6 +522,12 @@ public class TypesenseService {
|
|||||||
timestamp, 0, java.time.ZoneOffset.UTC));
|
timestamp, 0, java.time.ZoneOffset.UTC));
|
||||||
}
|
}
|
||||||
|
|
||||||
|
if (doc.get("lastReadAt") != null) {
|
||||||
|
long timestamp = ((Number) doc.get("lastReadAt")).longValue();
|
||||||
|
dto.setLastReadAt(java.time.LocalDateTime.ofEpochSecond(
|
||||||
|
timestamp, 0, java.time.ZoneOffset.UTC));
|
||||||
|
}
|
||||||
|
|
||||||
// Set search-specific fields - handle null for wildcard queries
|
// Set search-specific fields - handle null for wildcard queries
|
||||||
Long textMatch = hit.getTextMatch();
|
Long textMatch = hit.getTextMatch();
|
||||||
dto.setSearchScore(textMatch != null ? textMatch : 0L);
|
dto.setSearchScore(textMatch != null ? textMatch : 0L);
|
||||||
@@ -665,6 +676,11 @@ public class TypesenseService {
|
|||||||
case "created_at":
|
case "created_at":
|
||||||
case "date":
|
case "date":
|
||||||
return "createdAt";
|
return "createdAt";
|
||||||
|
case "lastread":
|
||||||
|
case "last_read":
|
||||||
|
case "lastreadat":
|
||||||
|
case "last_read_at":
|
||||||
|
return "lastReadAt";
|
||||||
case "rating":
|
case "rating":
|
||||||
return "rating";
|
return "rating";
|
||||||
case "wordcount":
|
case "wordcount":
|
||||||
|
|||||||
@@ -0,0 +1,12 @@
|
|||||||
|
package com.storycove.service.exception;
|
||||||
|
|
||||||
|
public class InvalidFileException extends RuntimeException {
|
||||||
|
|
||||||
|
public InvalidFileException(String message) {
|
||||||
|
super(message);
|
||||||
|
}
|
||||||
|
|
||||||
|
public InvalidFileException(String message, Throwable cause) {
|
||||||
|
super(message, cause);
|
||||||
|
}
|
||||||
|
}
|
||||||
7
backend/test-fixed-export.epub
Normal file
7
backend/test-fixed-export.epub
Normal file
@@ -0,0 +1,7 @@
|
|||||||
|
<html>
|
||||||
|
<head><title>502 Bad Gateway</title></head>
|
||||||
|
<body>
|
||||||
|
<center><h1>502 Bad Gateway</h1></center>
|
||||||
|
<hr><center>nginx/1.29.0</center>
|
||||||
|
</body>
|
||||||
|
</html>
|
||||||
@@ -64,6 +64,32 @@ export default function AddStoryPage() {
|
|||||||
}
|
}
|
||||||
}, [searchParams]);
|
}, [searchParams]);
|
||||||
|
|
||||||
|
// Load pending story data from bulk combine operation
|
||||||
|
useEffect(() => {
|
||||||
|
const fromBulkCombine = searchParams.get('from') === 'bulk-combine';
|
||||||
|
if (fromBulkCombine) {
|
||||||
|
const pendingStoryData = localStorage.getItem('pendingStory');
|
||||||
|
if (pendingStoryData) {
|
||||||
|
try {
|
||||||
|
const storyData = JSON.parse(pendingStoryData);
|
||||||
|
setFormData(prev => ({
|
||||||
|
...prev,
|
||||||
|
title: storyData.title || '',
|
||||||
|
authorName: storyData.author || '',
|
||||||
|
contentHtml: storyData.content || '',
|
||||||
|
sourceUrl: storyData.sourceUrl || '',
|
||||||
|
summary: storyData.summary || '',
|
||||||
|
tags: storyData.tags || []
|
||||||
|
}));
|
||||||
|
// Clear the pending data
|
||||||
|
localStorage.removeItem('pendingStory');
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Failed to load pending story data:', error);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}, [searchParams]);
|
||||||
|
|
||||||
// Check for duplicates when title and author are both present
|
// Check for duplicates when title and author are both present
|
||||||
useEffect(() => {
|
useEffect(() => {
|
||||||
const checkDuplicates = async () => {
|
const checkDuplicates = async () => {
|
||||||
@@ -442,7 +468,7 @@ export default function AddStoryPage() {
|
|||||||
</label>
|
</label>
|
||||||
<ImageUpload
|
<ImageUpload
|
||||||
onImageSelect={setCoverImage}
|
onImageSelect={setCoverImage}
|
||||||
accept="image/jpeg,image/png,image/webp"
|
accept="image/jpeg,image/png"
|
||||||
maxSizeMB={5}
|
maxSizeMB={5}
|
||||||
aspectRatio="3:4"
|
aspectRatio="3:4"
|
||||||
placeholder="Drop a cover image here or click to select"
|
placeholder="Drop a cover image here or click to select"
|
||||||
|
|||||||
@@ -269,7 +269,7 @@ export default function EditAuthorPage() {
|
|||||||
</label>
|
</label>
|
||||||
<ImageUpload
|
<ImageUpload
|
||||||
onImageSelect={setAvatarImage}
|
onImageSelect={setAvatarImage}
|
||||||
accept="image/jpeg,image/png,image/webp"
|
accept="image/jpeg,image/png"
|
||||||
maxSizeMB={5}
|
maxSizeMB={5}
|
||||||
aspectRatio="1:1"
|
aspectRatio="1:1"
|
||||||
placeholder="Drop an avatar image here or click to select"
|
placeholder="Drop an avatar image here or click to select"
|
||||||
|
|||||||
@@ -11,16 +11,17 @@ import TagFilter from '../../components/stories/TagFilter';
|
|||||||
import LoadingSpinner from '../../components/ui/LoadingSpinner';
|
import LoadingSpinner from '../../components/ui/LoadingSpinner';
|
||||||
|
|
||||||
type ViewMode = 'grid' | 'list';
|
type ViewMode = 'grid' | 'list';
|
||||||
type SortOption = 'createdAt' | 'title' | 'authorName' | 'rating' | 'wordCount';
|
type SortOption = 'createdAt' | 'title' | 'authorName' | 'rating' | 'wordCount' | 'lastRead';
|
||||||
|
|
||||||
export default function LibraryPage() {
|
export default function LibraryPage() {
|
||||||
const [stories, setStories] = useState<Story[]>([]);
|
const [stories, setStories] = useState<Story[]>([]);
|
||||||
const [tags, setTags] = useState<Tag[]>([]);
|
const [tags, setTags] = useState<Tag[]>([]);
|
||||||
const [loading, setLoading] = useState(false);
|
const [loading, setLoading] = useState(false);
|
||||||
|
const [searchLoading, setSearchLoading] = useState(false);
|
||||||
const [searchQuery, setSearchQuery] = useState('');
|
const [searchQuery, setSearchQuery] = useState('');
|
||||||
const [selectedTags, setSelectedTags] = useState<string[]>([]);
|
const [selectedTags, setSelectedTags] = useState<string[]>([]);
|
||||||
const [viewMode, setViewMode] = useState<ViewMode>('list');
|
const [viewMode, setViewMode] = useState<ViewMode>('list');
|
||||||
const [sortOption, setSortOption] = useState<SortOption>('createdAt');
|
const [sortOption, setSortOption] = useState<SortOption>('lastRead');
|
||||||
const [sortDirection, setSortDirection] = useState<'asc' | 'desc'>('desc');
|
const [sortDirection, setSortDirection] = useState<'asc' | 'desc'>('desc');
|
||||||
const [page, setPage] = useState(0);
|
const [page, setPage] = useState(0);
|
||||||
const [totalPages, setTotalPages] = useState(1);
|
const [totalPages, setTotalPages] = useState(1);
|
||||||
@@ -47,7 +48,13 @@ export default function LibraryPage() {
|
|||||||
const debounceTimer = setTimeout(() => {
|
const debounceTimer = setTimeout(() => {
|
||||||
const performSearch = async () => {
|
const performSearch = async () => {
|
||||||
try {
|
try {
|
||||||
setLoading(true);
|
// Use searchLoading for background search, loading only for initial load
|
||||||
|
const isInitialLoad = stories.length === 0 && !searchQuery && selectedTags.length === 0;
|
||||||
|
if (isInitialLoad) {
|
||||||
|
setLoading(true);
|
||||||
|
} else {
|
||||||
|
setSearchLoading(true);
|
||||||
|
}
|
||||||
|
|
||||||
// Always use search API for consistency - use '*' for match-all when no query
|
// Always use search API for consistency - use '*' for match-all when no query
|
||||||
const result = await searchApi.search({
|
const result = await searchApi.search({
|
||||||
@@ -73,11 +80,12 @@ export default function LibraryPage() {
|
|||||||
setStories([]);
|
setStories([]);
|
||||||
} finally {
|
} finally {
|
||||||
setLoading(false);
|
setLoading(false);
|
||||||
|
setSearchLoading(false);
|
||||||
}
|
}
|
||||||
};
|
};
|
||||||
|
|
||||||
performSearch();
|
performSearch();
|
||||||
}, searchQuery ? 300 : 0); // Debounce search, but not other changes
|
}, searchQuery ? 500 : 0); // 500ms debounce for search, immediate for other changes
|
||||||
|
|
||||||
return () => clearTimeout(debounceTimer);
|
return () => clearTimeout(debounceTimer);
|
||||||
}, [searchQuery, selectedTags, page, sortOption, sortDirection, refreshTrigger]);
|
}, [searchQuery, selectedTags, page, sortOption, sortDirection, refreshTrigger]);
|
||||||
@@ -154,16 +162,21 @@ export default function LibraryPage() {
|
|||||||
</p>
|
</p>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
<Button href="/add-story">
|
<div className="flex gap-2">
|
||||||
Add New Story
|
<Button href="/add-story">
|
||||||
</Button>
|
Add New Story
|
||||||
|
</Button>
|
||||||
|
<Button href="/stories/import/epub" variant="secondary">
|
||||||
|
📖 Import EPUB
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
{/* Search and Filters */}
|
{/* Search and Filters */}
|
||||||
<div className="space-y-4">
|
<div className="space-y-4">
|
||||||
{/* Search Bar */}
|
{/* Search Bar */}
|
||||||
<div className="flex flex-col sm:flex-row gap-4">
|
<div className="flex flex-col sm:flex-row gap-4">
|
||||||
<div className="flex-1">
|
<div className="flex-1 relative">
|
||||||
<Input
|
<Input
|
||||||
type="search"
|
type="search"
|
||||||
placeholder="Search by title, author, or tags..."
|
placeholder="Search by title, author, or tags..."
|
||||||
@@ -171,6 +184,11 @@ export default function LibraryPage() {
|
|||||||
onChange={handleSearchChange}
|
onChange={handleSearchChange}
|
||||||
className="w-full"
|
className="w-full"
|
||||||
/>
|
/>
|
||||||
|
{searchLoading && (
|
||||||
|
<div className="absolute right-3 top-1/2 transform -translate-y-1/2">
|
||||||
|
<div className="animate-spin h-4 w-4 border-2 border-theme-accent border-t-transparent rounded-full"></div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
{/* View Mode Toggle */}
|
{/* View Mode Toggle */}
|
||||||
@@ -215,6 +233,7 @@ export default function LibraryPage() {
|
|||||||
<option value="authorName">Author</option>
|
<option value="authorName">Author</option>
|
||||||
<option value="rating">Rating</option>
|
<option value="rating">Rating</option>
|
||||||
<option value="wordCount">Word Count</option>
|
<option value="wordCount">Word Count</option>
|
||||||
|
<option value="lastRead">Last Read</option>
|
||||||
</select>
|
</select>
|
||||||
|
|
||||||
{/* Sort Direction Toggle */}
|
{/* Sort Direction Toggle */}
|
||||||
|
|||||||
93
frontend/src/app/scrape/bulk/progress/route.ts
Normal file
93
frontend/src/app/scrape/bulk/progress/route.ts
Normal file
@@ -0,0 +1,93 @@
|
|||||||
|
import { NextRequest } from 'next/server';
|
||||||
|
|
||||||
|
// Configure route timeout for long-running progress streams
|
||||||
|
export const maxDuration = 900; // 15 minutes (900 seconds)
|
||||||
|
|
||||||
|
interface ProgressUpdate {
|
||||||
|
type: 'progress' | 'completed' | 'error';
|
||||||
|
current: number;
|
||||||
|
total: number;
|
||||||
|
message: string;
|
||||||
|
url?: string;
|
||||||
|
title?: string;
|
||||||
|
author?: string;
|
||||||
|
wordCount?: number;
|
||||||
|
totalWordCount?: number;
|
||||||
|
error?: string;
|
||||||
|
combinedStory?: any;
|
||||||
|
results?: any[];
|
||||||
|
summary?: any;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Global progress storage (in production, use Redis or database)
|
||||||
|
const progressStore = new Map<string, ProgressUpdate[]>();
|
||||||
|
|
||||||
|
export async function GET(request: NextRequest) {
|
||||||
|
const searchParams = request.nextUrl.searchParams;
|
||||||
|
const sessionId = searchParams.get('sessionId');
|
||||||
|
|
||||||
|
if (!sessionId) {
|
||||||
|
return new Response('Session ID required', { status: 400 });
|
||||||
|
}
|
||||||
|
|
||||||
|
// Set up Server-Sent Events
|
||||||
|
const stream = new ReadableStream({
|
||||||
|
start(controller) {
|
||||||
|
const encoder = new TextEncoder();
|
||||||
|
|
||||||
|
// Send initial connection message
|
||||||
|
const data = `data: ${JSON.stringify({ type: 'connected', sessionId })}\n\n`;
|
||||||
|
controller.enqueue(encoder.encode(data));
|
||||||
|
|
||||||
|
// Check for progress updates every 500ms
|
||||||
|
const interval = setInterval(() => {
|
||||||
|
const updates = progressStore.get(sessionId);
|
||||||
|
if (updates && updates.length > 0) {
|
||||||
|
// Send all pending updates
|
||||||
|
updates.forEach(update => {
|
||||||
|
const data = `data: ${JSON.stringify(update)}\n\n`;
|
||||||
|
controller.enqueue(encoder.encode(data));
|
||||||
|
});
|
||||||
|
|
||||||
|
// Clear sent updates
|
||||||
|
progressStore.delete(sessionId);
|
||||||
|
|
||||||
|
// If this was a completion or error, close the stream
|
||||||
|
const lastUpdate = updates[updates.length - 1];
|
||||||
|
if (lastUpdate.type === 'completed' || lastUpdate.type === 'error') {
|
||||||
|
clearInterval(interval);
|
||||||
|
controller.close();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}, 500);
|
||||||
|
|
||||||
|
// Cleanup after timeout
|
||||||
|
setTimeout(() => {
|
||||||
|
clearInterval(interval);
|
||||||
|
progressStore.delete(sessionId);
|
||||||
|
controller.close();
|
||||||
|
}, 900000); // 15 minutes
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
return new Response(stream, {
|
||||||
|
headers: {
|
||||||
|
'Content-Type': 'text/event-stream',
|
||||||
|
'Cache-Control': 'no-cache',
|
||||||
|
'Connection': 'keep-alive',
|
||||||
|
'Access-Control-Allow-Origin': '*',
|
||||||
|
'Access-Control-Allow-Headers': 'Cache-Control',
|
||||||
|
},
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
// Helper function for other routes to send progress updates
|
||||||
|
export function sendProgressUpdate(sessionId: string, update: ProgressUpdate) {
|
||||||
|
if (!progressStore.has(sessionId)) {
|
||||||
|
progressStore.set(sessionId, []);
|
||||||
|
}
|
||||||
|
progressStore.get(sessionId)!.push(update);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Export the helper for other modules to use
|
||||||
|
export { progressStore };
|
||||||
@@ -1,7 +1,23 @@
|
|||||||
import { NextRequest, NextResponse } from 'next/server';
|
import { NextRequest, NextResponse } from 'next/server';
|
||||||
|
|
||||||
|
// Configure route timeout for long-running scraping operations
|
||||||
|
export const maxDuration = 900; // 15 minutes (900 seconds)
|
||||||
|
|
||||||
|
// Import progress tracking helper
|
||||||
|
async function sendProgressUpdate(sessionId: string, update: any) {
|
||||||
|
try {
|
||||||
|
// Dynamic import to avoid circular dependency
|
||||||
|
const { sendProgressUpdate: sendUpdate } = await import('./progress/route');
|
||||||
|
sendUpdate(sessionId, update);
|
||||||
|
} catch (error) {
|
||||||
|
console.warn('Failed to send progress update:', error);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
interface BulkImportRequest {
|
interface BulkImportRequest {
|
||||||
urls: string[];
|
urls: string[];
|
||||||
|
combineIntoOne?: boolean;
|
||||||
|
sessionId?: string; // For progress tracking
|
||||||
}
|
}
|
||||||
|
|
||||||
interface ImportResult {
|
interface ImportResult {
|
||||||
@@ -22,52 +38,430 @@ interface BulkImportResponse {
|
|||||||
skipped: number;
|
skipped: number;
|
||||||
errors: number;
|
errors: number;
|
||||||
};
|
};
|
||||||
|
combinedStory?: {
|
||||||
|
title: string;
|
||||||
|
author: string;
|
||||||
|
content: string;
|
||||||
|
summary?: string;
|
||||||
|
sourceUrl: string;
|
||||||
|
tags?: string[];
|
||||||
|
};
|
||||||
}
|
}
|
||||||
|
|
||||||
export async function POST(request: NextRequest) {
|
// Background processing function for combined mode
|
||||||
|
async function processCombinedMode(
|
||||||
|
urls: string[],
|
||||||
|
sessionId: string,
|
||||||
|
authorization: string,
|
||||||
|
scraper: any
|
||||||
|
) {
|
||||||
|
const results: ImportResult[] = [];
|
||||||
|
let importedCount = 0;
|
||||||
|
let errorCount = 0;
|
||||||
|
|
||||||
|
const combinedContent: string[] = [];
|
||||||
|
let baseTitle = '';
|
||||||
|
let baseAuthor = '';
|
||||||
|
let baseSummary = '';
|
||||||
|
let baseSourceUrl = '';
|
||||||
|
const combinedTags = new Set<string>();
|
||||||
|
let totalWordCount = 0;
|
||||||
|
|
||||||
|
// Send initial progress update
|
||||||
|
await sendProgressUpdate(sessionId, {
|
||||||
|
type: 'progress',
|
||||||
|
current: 0,
|
||||||
|
total: urls.length,
|
||||||
|
message: `Starting to scrape ${urls.length} URLs for combining...`,
|
||||||
|
totalWordCount: 0
|
||||||
|
});
|
||||||
|
|
||||||
|
for (let i = 0; i < urls.length; i++) {
|
||||||
|
const url = urls[i];
|
||||||
|
console.log(`Scraping URL ${i + 1}/${urls.length} for combine: ${url}`);
|
||||||
|
|
||||||
|
// Send progress update
|
||||||
|
await sendProgressUpdate(sessionId, {
|
||||||
|
type: 'progress',
|
||||||
|
current: i,
|
||||||
|
total: urls.length,
|
||||||
|
message: `Scraping URL ${i + 1} of ${urls.length}...`,
|
||||||
|
url: url,
|
||||||
|
totalWordCount
|
||||||
|
});
|
||||||
|
|
||||||
|
try {
|
||||||
|
const trimmedUrl = url.trim();
|
||||||
|
if (!trimmedUrl) {
|
||||||
|
results.push({
|
||||||
|
url: url || 'Empty URL',
|
||||||
|
status: 'error',
|
||||||
|
error: 'Empty URL in combined mode'
|
||||||
|
});
|
||||||
|
errorCount++;
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
const scrapedStory = await scraper.scrapeStory(trimmedUrl);
|
||||||
|
|
||||||
|
// Check if we got content - this is required for combined mode
|
||||||
|
if (!scrapedStory.content || scrapedStory.content.trim() === '') {
|
||||||
|
results.push({
|
||||||
|
url: trimmedUrl,
|
||||||
|
status: 'error',
|
||||||
|
error: 'No content found - required for combined mode'
|
||||||
|
});
|
||||||
|
errorCount++;
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Use first URL for base metadata (title can be empty for combined mode)
|
||||||
|
if (i === 0) {
|
||||||
|
baseTitle = scrapedStory.title || 'Combined Story';
|
||||||
|
baseAuthor = scrapedStory.author || 'Unknown Author';
|
||||||
|
baseSummary = scrapedStory.summary || '';
|
||||||
|
baseSourceUrl = trimmedUrl;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Add content with URL separator
|
||||||
|
combinedContent.push(`<!-- Content from: ${trimmedUrl} -->`);
|
||||||
|
if (scrapedStory.title && i > 0) {
|
||||||
|
combinedContent.push(`<h2>${scrapedStory.title}</h2>`);
|
||||||
|
}
|
||||||
|
combinedContent.push(scrapedStory.content);
|
||||||
|
combinedContent.push('<hr/>'); // Visual separator between parts
|
||||||
|
|
||||||
|
// Calculate word count for this story
|
||||||
|
const textContent = scrapedStory.content.replace(/<[^>]*>/g, ''); // Strip HTML
|
||||||
|
const wordCount = textContent.split(/\s+/).filter((word: string) => word.length > 0).length;
|
||||||
|
totalWordCount += wordCount;
|
||||||
|
|
||||||
|
// Collect tags from all stories
|
||||||
|
if (scrapedStory.tags) {
|
||||||
|
scrapedStory.tags.forEach((tag: string) => combinedTags.add(tag));
|
||||||
|
}
|
||||||
|
|
||||||
|
results.push({
|
||||||
|
url: trimmedUrl,
|
||||||
|
status: 'imported',
|
||||||
|
title: scrapedStory.title,
|
||||||
|
author: scrapedStory.author
|
||||||
|
});
|
||||||
|
importedCount++;
|
||||||
|
|
||||||
|
// Send progress update with word count
|
||||||
|
await sendProgressUpdate(sessionId, {
|
||||||
|
type: 'progress',
|
||||||
|
current: i + 1,
|
||||||
|
total: urls.length,
|
||||||
|
message: `Scraped "${scrapedStory.title}" (${wordCount.toLocaleString()} words)`,
|
||||||
|
url: trimmedUrl,
|
||||||
|
title: scrapedStory.title,
|
||||||
|
author: scrapedStory.author,
|
||||||
|
wordCount: wordCount,
|
||||||
|
totalWordCount: totalWordCount
|
||||||
|
});
|
||||||
|
|
||||||
|
} catch (error) {
|
||||||
|
console.error(`Error processing URL ${url} in combined mode:`, error);
|
||||||
|
results.push({
|
||||||
|
url: url,
|
||||||
|
status: 'error',
|
||||||
|
error: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
errorCount++;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// If we have any errors, fail the entire combined operation
|
||||||
|
if (errorCount > 0) {
|
||||||
|
await sendProgressUpdate(sessionId, {
|
||||||
|
type: 'error',
|
||||||
|
current: urls.length,
|
||||||
|
total: urls.length,
|
||||||
|
message: 'Combined mode failed: some URLs could not be processed',
|
||||||
|
error: `${errorCount} URLs failed to process`
|
||||||
|
});
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check content size to prevent response size issues
|
||||||
|
const combinedContentString = combinedContent.join('\n');
|
||||||
|
const contentSizeInMB = new Blob([combinedContentString]).size / (1024 * 1024);
|
||||||
|
|
||||||
|
console.log(`Combined content size: ${contentSizeInMB.toFixed(2)} MB`);
|
||||||
|
console.log(`Combined content character length: ${combinedContentString.length}`);
|
||||||
|
console.log(`Combined content parts count: ${combinedContent.length}`);
|
||||||
|
|
||||||
|
// Return the combined story data via progress update
|
||||||
|
const combinedStory = {
|
||||||
|
title: baseTitle,
|
||||||
|
author: baseAuthor,
|
||||||
|
content: contentSizeInMB > 10 ?
|
||||||
|
combinedContentString.substring(0, Math.floor(combinedContentString.length * (10 / contentSizeInMB))) + '\n\n<!-- Content truncated due to size limit -->' :
|
||||||
|
combinedContentString,
|
||||||
|
summary: contentSizeInMB > 10 ? baseSummary + ' (Content truncated due to size limit)' : baseSummary,
|
||||||
|
sourceUrl: baseSourceUrl,
|
||||||
|
tags: Array.from(combinedTags)
|
||||||
|
};
|
||||||
|
|
||||||
|
// Send completion notification for combine mode
|
||||||
|
await sendProgressUpdate(sessionId, {
|
||||||
|
type: 'completed',
|
||||||
|
current: urls.length,
|
||||||
|
total: urls.length,
|
||||||
|
message: `Combined scraping completed: ${totalWordCount.toLocaleString()} words from ${importedCount} stories`,
|
||||||
|
totalWordCount: totalWordCount,
|
||||||
|
combinedStory: combinedStory
|
||||||
|
});
|
||||||
|
|
||||||
|
console.log(`Combined scraping completed: ${importedCount} URLs combined into one story`);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Background processing function for individual mode
|
||||||
|
async function processIndividualMode(
|
||||||
|
urls: string[],
|
||||||
|
sessionId: string,
|
||||||
|
authorization: string,
|
||||||
|
scraper: any
|
||||||
|
) {
|
||||||
|
const results: ImportResult[] = [];
|
||||||
|
let importedCount = 0;
|
||||||
|
let skippedCount = 0;
|
||||||
|
let errorCount = 0;
|
||||||
|
|
||||||
|
await sendProgressUpdate(sessionId, {
|
||||||
|
type: 'progress',
|
||||||
|
current: 0,
|
||||||
|
total: urls.length,
|
||||||
|
message: `Starting to import ${urls.length} URLs individually...`
|
||||||
|
});
|
||||||
|
|
||||||
|
for (let i = 0; i < urls.length; i++) {
|
||||||
|
const url = urls[i];
|
||||||
|
console.log(`Processing URL ${i + 1}/${urls.length}: ${url}`);
|
||||||
|
|
||||||
|
await sendProgressUpdate(sessionId, {
|
||||||
|
type: 'progress',
|
||||||
|
current: i,
|
||||||
|
total: urls.length,
|
||||||
|
message: `Processing URL ${i + 1} of ${urls.length}...`,
|
||||||
|
url: url
|
||||||
|
});
|
||||||
|
|
||||||
|
try {
|
||||||
|
// Validate URL format
|
||||||
|
if (!url || typeof url !== 'string' || url.trim() === '') {
|
||||||
|
results.push({
|
||||||
|
url: url || 'Empty URL',
|
||||||
|
status: 'error',
|
||||||
|
error: 'Invalid URL format'
|
||||||
|
});
|
||||||
|
errorCount++;
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
const trimmedUrl = url.trim();
|
||||||
|
|
||||||
|
// Scrape the story
|
||||||
|
const scrapedStory = await scraper.scrapeStory(trimmedUrl);
|
||||||
|
|
||||||
|
// Validate required fields
|
||||||
|
if (!scrapedStory.title || !scrapedStory.author || !scrapedStory.content) {
|
||||||
|
const missingFields = [];
|
||||||
|
if (!scrapedStory.title) missingFields.push('title');
|
||||||
|
if (!scrapedStory.author) missingFields.push('author');
|
||||||
|
if (!scrapedStory.content) missingFields.push('content');
|
||||||
|
|
||||||
|
results.push({
|
||||||
|
url: trimmedUrl,
|
||||||
|
status: 'skipped',
|
||||||
|
reason: `Missing required fields: ${missingFields.join(', ')}`,
|
||||||
|
title: scrapedStory.title,
|
||||||
|
author: scrapedStory.author
|
||||||
|
});
|
||||||
|
skippedCount++;
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check for duplicates using query parameters
|
||||||
|
try {
|
||||||
|
const duplicateCheckUrl = `http://backend:8080/api/stories/check-duplicate`;
|
||||||
|
const params = new URLSearchParams({
|
||||||
|
title: scrapedStory.title,
|
||||||
|
authorName: scrapedStory.author
|
||||||
|
});
|
||||||
|
|
||||||
|
const duplicateCheckResponse = await fetch(`${duplicateCheckUrl}?${params.toString()}`, {
|
||||||
|
method: 'GET',
|
||||||
|
headers: {
|
||||||
|
'Authorization': authorization,
|
||||||
|
'Content-Type': 'application/json',
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
if (duplicateCheckResponse.ok) {
|
||||||
|
const duplicateResult = await duplicateCheckResponse.json();
|
||||||
|
if (duplicateResult.hasDuplicates) {
|
||||||
|
results.push({
|
||||||
|
url: trimmedUrl,
|
||||||
|
status: 'skipped',
|
||||||
|
reason: `Duplicate story found (${duplicateResult.count} existing)`,
|
||||||
|
title: scrapedStory.title,
|
||||||
|
author: scrapedStory.author
|
||||||
|
});
|
||||||
|
skippedCount++;
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} catch (error) {
|
||||||
|
console.warn('Duplicate check failed:', error);
|
||||||
|
// Continue with import if duplicate check fails
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create the story
|
||||||
|
try {
|
||||||
|
const storyData = {
|
||||||
|
title: scrapedStory.title,
|
||||||
|
summary: scrapedStory.summary || undefined,
|
||||||
|
contentHtml: scrapedStory.content,
|
||||||
|
sourceUrl: scrapedStory.sourceUrl || trimmedUrl,
|
||||||
|
authorName: scrapedStory.author,
|
||||||
|
tagNames: scrapedStory.tags && scrapedStory.tags.length > 0 ? scrapedStory.tags : undefined,
|
||||||
|
};
|
||||||
|
|
||||||
|
const createUrl = `http://backend:8080/api/stories`;
|
||||||
|
const createResponse = await fetch(createUrl, {
|
||||||
|
method: 'POST',
|
||||||
|
headers: {
|
||||||
|
'Authorization': authorization,
|
||||||
|
'Content-Type': 'application/json',
|
||||||
|
},
|
||||||
|
body: JSON.stringify(storyData),
|
||||||
|
});
|
||||||
|
|
||||||
|
if (!createResponse.ok) {
|
||||||
|
const errorData = await createResponse.json();
|
||||||
|
throw new Error(errorData.message || 'Failed to create story');
|
||||||
|
}
|
||||||
|
|
||||||
|
const createdStory = await createResponse.json();
|
||||||
|
|
||||||
|
results.push({
|
||||||
|
url: trimmedUrl,
|
||||||
|
status: 'imported',
|
||||||
|
title: scrapedStory.title,
|
||||||
|
author: scrapedStory.author,
|
||||||
|
storyId: createdStory.id
|
||||||
|
});
|
||||||
|
importedCount++;
|
||||||
|
|
||||||
|
console.log(`Successfully imported: ${scrapedStory.title} by ${scrapedStory.author} (ID: ${createdStory.id})`);
|
||||||
|
|
||||||
|
// Send progress update for successful import
|
||||||
|
await sendProgressUpdate(sessionId, {
|
||||||
|
type: 'progress',
|
||||||
|
current: i + 1,
|
||||||
|
total: urls.length,
|
||||||
|
message: `Imported "${scrapedStory.title}" by ${scrapedStory.author}`,
|
||||||
|
url: trimmedUrl,
|
||||||
|
title: scrapedStory.title,
|
||||||
|
author: scrapedStory.author
|
||||||
|
});
|
||||||
|
|
||||||
|
} catch (error) {
|
||||||
|
console.error(`Failed to create story for ${trimmedUrl}:`, error);
|
||||||
|
|
||||||
|
let errorMessage = 'Failed to create story';
|
||||||
|
if (error instanceof Error) {
|
||||||
|
errorMessage = error.message;
|
||||||
|
}
|
||||||
|
|
||||||
|
results.push({
|
||||||
|
url: trimmedUrl,
|
||||||
|
status: 'error',
|
||||||
|
error: errorMessage,
|
||||||
|
title: scrapedStory.title,
|
||||||
|
author: scrapedStory.author
|
||||||
|
});
|
||||||
|
errorCount++;
|
||||||
|
}
|
||||||
|
|
||||||
|
} catch (error) {
|
||||||
|
console.error(`Error processing URL ${url}:`, error);
|
||||||
|
|
||||||
|
let errorMessage = 'Unknown error';
|
||||||
|
if (error instanceof Error) {
|
||||||
|
errorMessage = error.message;
|
||||||
|
}
|
||||||
|
|
||||||
|
results.push({
|
||||||
|
url: url,
|
||||||
|
status: 'error',
|
||||||
|
error: errorMessage
|
||||||
|
});
|
||||||
|
errorCount++;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Send completion notification
|
||||||
|
await sendProgressUpdate(sessionId, {
|
||||||
|
type: 'completed',
|
||||||
|
current: urls.length,
|
||||||
|
total: urls.length,
|
||||||
|
message: `Bulk import completed: ${importedCount} imported, ${skippedCount} skipped, ${errorCount} errors`,
|
||||||
|
results: results,
|
||||||
|
summary: {
|
||||||
|
total: urls.length,
|
||||||
|
imported: importedCount,
|
||||||
|
skipped: skippedCount,
|
||||||
|
errors: errorCount
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
console.log(`Bulk import completed: ${importedCount} imported, ${skippedCount} skipped, ${errorCount} errors`);
|
||||||
|
|
||||||
|
// Trigger Typesense reindex if any stories were imported
|
||||||
|
if (importedCount > 0) {
|
||||||
|
try {
|
||||||
|
console.log('Triggering Typesense reindex after bulk import...');
|
||||||
|
const reindexUrl = `http://backend:8080/api/stories/reindex-typesense`;
|
||||||
|
const reindexResponse = await fetch(reindexUrl, {
|
||||||
|
method: 'POST',
|
||||||
|
headers: {
|
||||||
|
'Authorization': authorization,
|
||||||
|
'Content-Type': 'application/json',
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
if (reindexResponse.ok) {
|
||||||
|
const reindexResult = await reindexResponse.json();
|
||||||
|
console.log('Typesense reindex completed:', reindexResult);
|
||||||
|
} else {
|
||||||
|
console.warn('Typesense reindex failed:', reindexResponse.status);
|
||||||
|
}
|
||||||
|
} catch (error) {
|
||||||
|
console.warn('Failed to trigger Typesense reindex:', error);
|
||||||
|
// Don't fail the whole request if reindex fails
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Background processing function
|
||||||
|
async function processBulkImport(
|
||||||
|
urls: string[],
|
||||||
|
combineIntoOne: boolean,
|
||||||
|
sessionId: string,
|
||||||
|
authorization: string
|
||||||
|
) {
|
||||||
try {
|
try {
|
||||||
// Check for authentication
|
|
||||||
const authorization = request.headers.get('authorization');
|
|
||||||
if (!authorization) {
|
|
||||||
return NextResponse.json(
|
|
||||||
{ error: 'Authentication required for bulk import' },
|
|
||||||
{ status: 401 }
|
|
||||||
);
|
|
||||||
}
|
|
||||||
|
|
||||||
const body = await request.json();
|
|
||||||
const { urls } = body as BulkImportRequest;
|
|
||||||
|
|
||||||
if (!urls || !Array.isArray(urls) || urls.length === 0) {
|
|
||||||
return NextResponse.json(
|
|
||||||
{ error: 'URLs array is required and must not be empty' },
|
|
||||||
{ status: 400 }
|
|
||||||
);
|
|
||||||
}
|
|
||||||
|
|
||||||
if (urls.length > 50) {
|
|
||||||
return NextResponse.json(
|
|
||||||
{ error: 'Maximum 50 URLs allowed per bulk import' },
|
|
||||||
{ status: 400 }
|
|
||||||
);
|
|
||||||
}
|
|
||||||
|
|
||||||
// Dynamic imports to prevent client-side bundling
|
// Dynamic imports to prevent client-side bundling
|
||||||
const { StoryScraper } = await import('@/lib/scraper/scraper');
|
const { StoryScraper } = await import('@/lib/scraper/scraper');
|
||||||
|
|
||||||
const scraper = new StoryScraper();
|
const scraper = new StoryScraper();
|
||||||
const results: ImportResult[] = [];
|
|
||||||
let importedCount = 0;
|
|
||||||
let skippedCount = 0;
|
|
||||||
let errorCount = 0;
|
|
||||||
|
|
||||||
console.log(`Starting bulk scraping for ${urls.length} URLs`);
|
console.log(`Starting bulk scraping for ${urls.length} URLs${combineIntoOne ? ' (combine mode)' : ''}`);
|
||||||
console.log(`Environment NEXT_PUBLIC_API_URL: ${process.env.NEXT_PUBLIC_API_URL}`);
|
console.log(`Session ID: ${sessionId}`);
|
||||||
|
|
||||||
// For server-side API calls in Docker, use direct backend container URL
|
|
||||||
// Client-side calls use NEXT_PUBLIC_API_URL through nginx, but server-side needs direct container access
|
|
||||||
const serverSideApiBaseUrl = 'http://backend:8080/api';
|
|
||||||
console.log(`DEBUG: serverSideApiBaseUrl variable is: ${serverSideApiBaseUrl}`);
|
|
||||||
|
|
||||||
// Quick test to verify backend connectivity
|
// Quick test to verify backend connectivity
|
||||||
try {
|
try {
|
||||||
@@ -84,208 +478,86 @@ export async function POST(request: NextRequest) {
|
|||||||
console.error(`Backend connectivity test failed:`, error);
|
console.error(`Backend connectivity test failed:`, error);
|
||||||
}
|
}
|
||||||
|
|
||||||
for (const url of urls) {
|
// Handle combined mode
|
||||||
console.log(`Processing URL: ${url}`);
|
if (combineIntoOne) {
|
||||||
|
await processCombinedMode(urls, sessionId, authorization, scraper);
|
||||||
try {
|
} else {
|
||||||
// Validate URL format
|
// Normal individual processing mode
|
||||||
if (!url || typeof url !== 'string' || url.trim() === '') {
|
await processIndividualMode(urls, sessionId, authorization, scraper);
|
||||||
results.push({
|
|
||||||
url: url || 'Empty URL',
|
|
||||||
status: 'error',
|
|
||||||
error: 'Invalid URL format'
|
|
||||||
});
|
|
||||||
errorCount++;
|
|
||||||
continue;
|
|
||||||
}
|
|
||||||
|
|
||||||
const trimmedUrl = url.trim();
|
|
||||||
|
|
||||||
// Scrape the story
|
|
||||||
const scrapedStory = await scraper.scrapeStory(trimmedUrl);
|
|
||||||
|
|
||||||
// Validate required fields
|
|
||||||
if (!scrapedStory.title || !scrapedStory.author || !scrapedStory.content) {
|
|
||||||
const missingFields = [];
|
|
||||||
if (!scrapedStory.title) missingFields.push('title');
|
|
||||||
if (!scrapedStory.author) missingFields.push('author');
|
|
||||||
if (!scrapedStory.content) missingFields.push('content');
|
|
||||||
|
|
||||||
results.push({
|
|
||||||
url: trimmedUrl,
|
|
||||||
status: 'skipped',
|
|
||||||
reason: `Missing required fields: ${missingFields.join(', ')}`,
|
|
||||||
title: scrapedStory.title,
|
|
||||||
author: scrapedStory.author
|
|
||||||
});
|
|
||||||
skippedCount++;
|
|
||||||
continue;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Check for duplicates using query parameters
|
|
||||||
try {
|
|
||||||
// Use hardcoded backend URL for container-to-container communication
|
|
||||||
const duplicateCheckUrl = `http://backend:8080/api/stories/check-duplicate`;
|
|
||||||
console.log(`Duplicate check URL: ${duplicateCheckUrl}`);
|
|
||||||
const params = new URLSearchParams({
|
|
||||||
title: scrapedStory.title,
|
|
||||||
authorName: scrapedStory.author
|
|
||||||
});
|
|
||||||
|
|
||||||
const duplicateCheckResponse = await fetch(`${duplicateCheckUrl}?${params.toString()}`, {
|
|
||||||
method: 'GET',
|
|
||||||
headers: {
|
|
||||||
'Authorization': authorization,
|
|
||||||
'Content-Type': 'application/json',
|
|
||||||
},
|
|
||||||
});
|
|
||||||
|
|
||||||
if (duplicateCheckResponse.ok) {
|
|
||||||
const duplicateResult = await duplicateCheckResponse.json();
|
|
||||||
if (duplicateResult.hasDuplicates) {
|
|
||||||
results.push({
|
|
||||||
url: trimmedUrl,
|
|
||||||
status: 'skipped',
|
|
||||||
reason: `Duplicate story found (${duplicateResult.count} existing)`,
|
|
||||||
title: scrapedStory.title,
|
|
||||||
author: scrapedStory.author
|
|
||||||
});
|
|
||||||
skippedCount++;
|
|
||||||
continue;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
} catch (error) {
|
|
||||||
console.warn('Duplicate check failed:', error);
|
|
||||||
// Continue with import if duplicate check fails
|
|
||||||
}
|
|
||||||
|
|
||||||
// Create the story
|
|
||||||
try {
|
|
||||||
const storyData = {
|
|
||||||
title: scrapedStory.title,
|
|
||||||
summary: scrapedStory.summary || undefined,
|
|
||||||
contentHtml: scrapedStory.content,
|
|
||||||
sourceUrl: scrapedStory.sourceUrl || trimmedUrl,
|
|
||||||
authorName: scrapedStory.author,
|
|
||||||
tagNames: scrapedStory.tags && scrapedStory.tags.length > 0 ? scrapedStory.tags : undefined,
|
|
||||||
};
|
|
||||||
|
|
||||||
// Use hardcoded backend URL for container-to-container communication
|
|
||||||
const createUrl = `http://backend:8080/api/stories`;
|
|
||||||
console.log(`Create story URL: ${createUrl}`);
|
|
||||||
const createResponse = await fetch(createUrl, {
|
|
||||||
method: 'POST',
|
|
||||||
headers: {
|
|
||||||
'Authorization': authorization,
|
|
||||||
'Content-Type': 'application/json',
|
|
||||||
},
|
|
||||||
body: JSON.stringify(storyData),
|
|
||||||
});
|
|
||||||
|
|
||||||
if (!createResponse.ok) {
|
|
||||||
const errorData = await createResponse.json();
|
|
||||||
throw new Error(errorData.message || 'Failed to create story');
|
|
||||||
}
|
|
||||||
|
|
||||||
const createdStory = await createResponse.json();
|
|
||||||
|
|
||||||
results.push({
|
|
||||||
url: trimmedUrl,
|
|
||||||
status: 'imported',
|
|
||||||
title: scrapedStory.title,
|
|
||||||
author: scrapedStory.author,
|
|
||||||
storyId: createdStory.id
|
|
||||||
});
|
|
||||||
importedCount++;
|
|
||||||
|
|
||||||
console.log(`Successfully imported: ${scrapedStory.title} by ${scrapedStory.author} (ID: ${createdStory.id})`);
|
|
||||||
|
|
||||||
} catch (error) {
|
|
||||||
console.error(`Failed to create story for ${trimmedUrl}:`, error);
|
|
||||||
|
|
||||||
let errorMessage = 'Failed to create story';
|
|
||||||
if (error instanceof Error) {
|
|
||||||
errorMessage = error.message;
|
|
||||||
}
|
|
||||||
|
|
||||||
results.push({
|
|
||||||
url: trimmedUrl,
|
|
||||||
status: 'error',
|
|
||||||
error: errorMessage,
|
|
||||||
title: scrapedStory.title,
|
|
||||||
author: scrapedStory.author
|
|
||||||
});
|
|
||||||
errorCount++;
|
|
||||||
}
|
|
||||||
|
|
||||||
} catch (error) {
|
|
||||||
console.error(`Error processing URL ${url}:`, error);
|
|
||||||
|
|
||||||
let errorMessage = 'Unknown error';
|
|
||||||
if (error instanceof Error) {
|
|
||||||
errorMessage = error.message;
|
|
||||||
}
|
|
||||||
|
|
||||||
results.push({
|
|
||||||
url: url,
|
|
||||||
status: 'error',
|
|
||||||
error: errorMessage
|
|
||||||
});
|
|
||||||
errorCount++;
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
const response: BulkImportResponse = {
|
|
||||||
results,
|
|
||||||
summary: {
|
|
||||||
total: urls.length,
|
|
||||||
imported: importedCount,
|
|
||||||
skipped: skippedCount,
|
|
||||||
errors: errorCount
|
|
||||||
}
|
|
||||||
};
|
|
||||||
|
|
||||||
console.log(`Bulk import completed:`, response.summary);
|
|
||||||
|
|
||||||
// Trigger Typesense reindex if any stories were imported
|
|
||||||
if (importedCount > 0) {
|
|
||||||
try {
|
|
||||||
console.log('Triggering Typesense reindex after bulk import...');
|
|
||||||
const reindexUrl = `http://backend:8080/api/stories/reindex-typesense`;
|
|
||||||
const reindexResponse = await fetch(reindexUrl, {
|
|
||||||
method: 'POST',
|
|
||||||
headers: {
|
|
||||||
'Authorization': authorization,
|
|
||||||
'Content-Type': 'application/json',
|
|
||||||
},
|
|
||||||
});
|
|
||||||
|
|
||||||
if (reindexResponse.ok) {
|
|
||||||
const reindexResult = await reindexResponse.json();
|
|
||||||
console.log('Typesense reindex completed:', reindexResult);
|
|
||||||
} else {
|
|
||||||
console.warn('Typesense reindex failed:', reindexResponse.status);
|
|
||||||
}
|
|
||||||
} catch (error) {
|
|
||||||
console.warn('Failed to trigger Typesense reindex:', error);
|
|
||||||
// Don't fail the whole request if reindex fails
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
return NextResponse.json(response);
|
|
||||||
|
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
console.error('Bulk import error:', error);
|
console.error('Background bulk import error:', error);
|
||||||
|
await sendProgressUpdate(sessionId, {
|
||||||
|
type: 'error',
|
||||||
|
current: 0,
|
||||||
|
total: urls.length,
|
||||||
|
message: 'Bulk import failed due to an error',
|
||||||
|
error: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
export async function POST(request: NextRequest) {
|
||||||
|
try {
|
||||||
|
// Check for authentication
|
||||||
|
const authorization = request.headers.get('authorization');
|
||||||
|
if (!authorization) {
|
||||||
|
return NextResponse.json(
|
||||||
|
{ error: 'Authentication required for bulk import' },
|
||||||
|
{ status: 401 }
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
const body = await request.json();
|
||||||
|
const { urls, combineIntoOne = false, sessionId } = body as BulkImportRequest;
|
||||||
|
|
||||||
|
if (!urls || !Array.isArray(urls) || urls.length === 0) {
|
||||||
|
return NextResponse.json(
|
||||||
|
{ error: 'URLs array is required and must not be empty' },
|
||||||
|
{ status: 400 }
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (urls.length > 200) {
|
||||||
|
return NextResponse.json(
|
||||||
|
{ error: 'Maximum 200 URLs allowed per bulk import' },
|
||||||
|
{ status: 400 }
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!sessionId) {
|
||||||
|
return NextResponse.json(
|
||||||
|
{ error: 'Session ID is required for progress tracking' },
|
||||||
|
{ status: 400 }
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Start the background processing
|
||||||
|
processBulkImport(urls, combineIntoOne, sessionId, authorization).catch(error => {
|
||||||
|
console.error('Failed to start background processing:', error);
|
||||||
|
});
|
||||||
|
|
||||||
|
// Return immediately with session info
|
||||||
|
return NextResponse.json({
|
||||||
|
message: 'Bulk import started',
|
||||||
|
sessionId: sessionId,
|
||||||
|
totalUrls: urls.length,
|
||||||
|
combineMode: combineIntoOne
|
||||||
|
});
|
||||||
|
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Bulk import initialization error:', error);
|
||||||
|
|
||||||
if (error instanceof Error) {
|
if (error instanceof Error) {
|
||||||
return NextResponse.json(
|
return NextResponse.json(
|
||||||
{ error: `Bulk import failed: ${error.message}` },
|
{ error: `Bulk import failed to start: ${error.message}` },
|
||||||
{ status: 500 }
|
{ status: 500 }
|
||||||
);
|
);
|
||||||
}
|
}
|
||||||
|
|
||||||
return NextResponse.json(
|
return NextResponse.json(
|
||||||
{ error: 'Bulk import failed due to an unknown error' },
|
{ error: 'Bulk import failed to start due to an unknown error' },
|
||||||
{ status: 500 }
|
{ status: 500 }
|
||||||
);
|
);
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -21,6 +21,7 @@ export default function StoryDetailPage() {
|
|||||||
const [collections, setCollections] = useState<Collection[]>([]);
|
const [collections, setCollections] = useState<Collection[]>([]);
|
||||||
const [loading, setLoading] = useState(true);
|
const [loading, setLoading] = useState(true);
|
||||||
const [updating, setUpdating] = useState(false);
|
const [updating, setUpdating] = useState(false);
|
||||||
|
const [isExporting, setIsExporting] = useState(false);
|
||||||
|
|
||||||
useEffect(() => {
|
useEffect(() => {
|
||||||
const loadStoryData = async () => {
|
const loadStoryData = async () => {
|
||||||
@@ -65,6 +66,53 @@ export default function StoryDetailPage() {
|
|||||||
}
|
}
|
||||||
};
|
};
|
||||||
|
|
||||||
|
const handleEPUBExport = async () => {
|
||||||
|
if (!story) return;
|
||||||
|
|
||||||
|
setIsExporting(true);
|
||||||
|
try {
|
||||||
|
const token = localStorage.getItem('auth-token');
|
||||||
|
const response = await fetch(`/api/stories/${story.id}/epub`, {
|
||||||
|
method: 'GET',
|
||||||
|
headers: {
|
||||||
|
'Authorization': token ? `Bearer ${token}` : '',
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
if (response.ok) {
|
||||||
|
const blob = await response.blob();
|
||||||
|
const url = window.URL.createObjectURL(blob);
|
||||||
|
const link = document.createElement('a');
|
||||||
|
link.href = url;
|
||||||
|
|
||||||
|
// Get filename from Content-Disposition header or create default
|
||||||
|
const contentDisposition = response.headers.get('Content-Disposition');
|
||||||
|
let filename = `${story.title}.epub`;
|
||||||
|
if (contentDisposition) {
|
||||||
|
const match = contentDisposition.match(/filename[^;=\n]*=((['"]).*?\2|[^;\n]*)/);
|
||||||
|
if (match && match[1]) {
|
||||||
|
filename = match[1].replace(/['"]/g, '');
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
link.download = filename;
|
||||||
|
document.body.appendChild(link);
|
||||||
|
link.click();
|
||||||
|
window.URL.revokeObjectURL(url);
|
||||||
|
document.body.removeChild(link);
|
||||||
|
} else if (response.status === 401 || response.status === 403) {
|
||||||
|
alert('Authentication required. Please log in.');
|
||||||
|
} else {
|
||||||
|
throw new Error('Failed to export EPUB');
|
||||||
|
}
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error exporting EPUB:', error);
|
||||||
|
alert('Failed to export EPUB. Please try again.');
|
||||||
|
} finally {
|
||||||
|
setIsExporting(false);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
const formatDate = (dateString: string) => {
|
const formatDate = (dateString: string) => {
|
||||||
return new Date(dateString).toLocaleDateString('en-US', {
|
return new Date(dateString).toLocaleDateString('en-US', {
|
||||||
year: 'numeric',
|
year: 'numeric',
|
||||||
@@ -358,6 +406,14 @@ export default function StoryDetailPage() {
|
|||||||
>
|
>
|
||||||
📚 Start Reading
|
📚 Start Reading
|
||||||
</Button>
|
</Button>
|
||||||
|
<Button
|
||||||
|
onClick={handleEPUBExport}
|
||||||
|
variant="ghost"
|
||||||
|
size="lg"
|
||||||
|
disabled={isExporting}
|
||||||
|
>
|
||||||
|
{isExporting ? 'Exporting...' : '📖 Export EPUB'}
|
||||||
|
</Button>
|
||||||
<Button
|
<Button
|
||||||
href={`/stories/${story.id}/edit`}
|
href={`/stories/${story.id}/edit`}
|
||||||
variant="ghost"
|
variant="ghost"
|
||||||
|
|||||||
@@ -252,7 +252,7 @@ export default function EditStoryPage() {
|
|||||||
</label>
|
</label>
|
||||||
<ImageUpload
|
<ImageUpload
|
||||||
onImageSelect={setCoverImage}
|
onImageSelect={setCoverImage}
|
||||||
accept="image/jpeg,image/png,image/webp"
|
accept="image/jpeg,image/png"
|
||||||
maxSizeMB={5}
|
maxSizeMB={5}
|
||||||
aspectRatio="3:4"
|
aspectRatio="3:4"
|
||||||
placeholder="Drop a new cover image here or click to select"
|
placeholder="Drop a new cover image here or click to select"
|
||||||
|
|||||||
@@ -201,6 +201,7 @@ export default function StoryReadingPage() {
|
|||||||
}
|
}
|
||||||
};
|
};
|
||||||
|
|
||||||
|
|
||||||
const findNextStory = (): Story | null => {
|
const findNextStory = (): Story | null => {
|
||||||
if (!story?.seriesId || seriesStories.length <= 1) return null;
|
if (!story?.seriesId || seriesStories.length <= 1) return null;
|
||||||
|
|
||||||
|
|||||||
@@ -4,6 +4,7 @@ import { useState } from 'react';
|
|||||||
import { useRouter } from 'next/navigation';
|
import { useRouter } from 'next/navigation';
|
||||||
import Link from 'next/link';
|
import Link from 'next/link';
|
||||||
import { ArrowLeftIcon } from '@heroicons/react/24/outline';
|
import { ArrowLeftIcon } from '@heroicons/react/24/outline';
|
||||||
|
import BulkImportProgress from '@/components/BulkImportProgress';
|
||||||
|
|
||||||
interface ImportResult {
|
interface ImportResult {
|
||||||
url: string;
|
url: string;
|
||||||
@@ -23,14 +24,25 @@ interface BulkImportResponse {
|
|||||||
skipped: number;
|
skipped: number;
|
||||||
errors: number;
|
errors: number;
|
||||||
};
|
};
|
||||||
|
combinedStory?: {
|
||||||
|
title: string;
|
||||||
|
author: string;
|
||||||
|
content: string;
|
||||||
|
summary?: string;
|
||||||
|
sourceUrl: string;
|
||||||
|
tags?: string[];
|
||||||
|
};
|
||||||
}
|
}
|
||||||
|
|
||||||
export default function BulkImportPage() {
|
export default function BulkImportPage() {
|
||||||
const router = useRouter();
|
const router = useRouter();
|
||||||
const [urls, setUrls] = useState('');
|
const [urls, setUrls] = useState('');
|
||||||
|
const [combineIntoOne, setCombineIntoOne] = useState(false);
|
||||||
const [isLoading, setIsLoading] = useState(false);
|
const [isLoading, setIsLoading] = useState(false);
|
||||||
const [results, setResults] = useState<BulkImportResponse | null>(null);
|
const [results, setResults] = useState<BulkImportResponse | null>(null);
|
||||||
const [error, setError] = useState<string | null>(null);
|
const [error, setError] = useState<string | null>(null);
|
||||||
|
const [sessionId, setSessionId] = useState<string | null>(null);
|
||||||
|
const [showProgress, setShowProgress] = useState(false);
|
||||||
|
|
||||||
const handleSubmit = async (e: React.FormEvent) => {
|
const handleSubmit = async (e: React.FormEvent) => {
|
||||||
e.preventDefault();
|
e.preventDefault();
|
||||||
@@ -57,12 +69,17 @@ export default function BulkImportPage() {
|
|||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
|
||||||
if (urlList.length > 50) {
|
if (urlList.length > 200) {
|
||||||
setError('Maximum 50 URLs allowed per bulk import');
|
setError('Maximum 200 URLs allowed per bulk import');
|
||||||
setIsLoading(false);
|
setIsLoading(false);
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Generate session ID for progress tracking
|
||||||
|
const newSessionId = `bulk-import-${Date.now()}-${Math.random().toString(36).substr(2, 9)}`;
|
||||||
|
setSessionId(newSessionId);
|
||||||
|
setShowProgress(true);
|
||||||
|
|
||||||
// Get auth token for server-side API calls
|
// Get auth token for server-side API calls
|
||||||
const token = localStorage.getItem('auth-token');
|
const token = localStorage.getItem('auth-token');
|
||||||
|
|
||||||
@@ -72,16 +89,18 @@ export default function BulkImportPage() {
|
|||||||
'Content-Type': 'application/json',
|
'Content-Type': 'application/json',
|
||||||
'Authorization': token ? `Bearer ${token}` : '',
|
'Authorization': token ? `Bearer ${token}` : '',
|
||||||
},
|
},
|
||||||
body: JSON.stringify({ urls: urlList }),
|
body: JSON.stringify({ urls: urlList, combineIntoOne, sessionId: newSessionId }),
|
||||||
});
|
});
|
||||||
|
|
||||||
if (!response.ok) {
|
if (!response.ok) {
|
||||||
const errorData = await response.json();
|
const errorData = await response.json();
|
||||||
throw new Error(errorData.error || 'Bulk import failed');
|
throw new Error(errorData.error || 'Failed to start bulk import');
|
||||||
}
|
}
|
||||||
|
|
||||||
const data: BulkImportResponse = await response.json();
|
const startData = await response.json();
|
||||||
setResults(data);
|
console.log('Bulk import started:', startData);
|
||||||
|
|
||||||
|
// The progress component will handle the rest via SSE
|
||||||
|
|
||||||
} catch (err) {
|
} catch (err) {
|
||||||
console.error('Bulk import error:', err);
|
console.error('Bulk import error:', err);
|
||||||
@@ -93,8 +112,43 @@ export default function BulkImportPage() {
|
|||||||
|
|
||||||
const handleReset = () => {
|
const handleReset = () => {
|
||||||
setUrls('');
|
setUrls('');
|
||||||
|
setCombineIntoOne(false);
|
||||||
setResults(null);
|
setResults(null);
|
||||||
setError(null);
|
setError(null);
|
||||||
|
setSessionId(null);
|
||||||
|
setShowProgress(false);
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleProgressComplete = (data?: any) => {
|
||||||
|
// Progress component will handle this when the operation completes
|
||||||
|
setShowProgress(false);
|
||||||
|
setIsLoading(false);
|
||||||
|
|
||||||
|
// Handle completion data
|
||||||
|
if (data) {
|
||||||
|
if (data.combinedStory && combineIntoOne) {
|
||||||
|
// For combine mode, redirect to add story page with the combined content
|
||||||
|
localStorage.setItem('pendingStory', JSON.stringify(data.combinedStory));
|
||||||
|
router.push('/add-story?from=bulk-combine');
|
||||||
|
return;
|
||||||
|
} else if (data.results && data.summary) {
|
||||||
|
// For individual mode, show the results
|
||||||
|
setResults({
|
||||||
|
results: data.results,
|
||||||
|
summary: data.summary
|
||||||
|
});
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Fallback: just hide progress and let user know it completed
|
||||||
|
console.log('Import completed successfully');
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleProgressError = (errorMessage: string) => {
|
||||||
|
setError(errorMessage);
|
||||||
|
setIsLoading(false);
|
||||||
|
setShowProgress(false);
|
||||||
};
|
};
|
||||||
|
|
||||||
const getStatusColor = (status: string) => {
|
const getStatusColor = (status: string) => {
|
||||||
@@ -145,7 +199,7 @@ export default function BulkImportPage() {
|
|||||||
Story URLs
|
Story URLs
|
||||||
</label>
|
</label>
|
||||||
<p className="text-sm text-gray-500 mb-3">
|
<p className="text-sm text-gray-500 mb-3">
|
||||||
Enter one URL per line. Maximum 50 URLs per import.
|
Enter one URL per line. Maximum 200 URLs per import.
|
||||||
</p>
|
</p>
|
||||||
<textarea
|
<textarea
|
||||||
id="urls"
|
id="urls"
|
||||||
@@ -160,6 +214,37 @@ export default function BulkImportPage() {
|
|||||||
</p>
|
</p>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
|
<div className="flex items-center">
|
||||||
|
<input
|
||||||
|
id="combine-into-one"
|
||||||
|
type="checkbox"
|
||||||
|
checked={combineIntoOne}
|
||||||
|
onChange={(e) => setCombineIntoOne(e.target.checked)}
|
||||||
|
className="h-4 w-4 text-blue-600 focus:ring-blue-500 border-gray-300 rounded"
|
||||||
|
disabled={isLoading}
|
||||||
|
/>
|
||||||
|
<label htmlFor="combine-into-one" className="ml-2 block text-sm text-gray-700">
|
||||||
|
Combine all URL content into a single story
|
||||||
|
</label>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{combineIntoOne && (
|
||||||
|
<div className="bg-blue-50 border border-blue-200 rounded-md p-4">
|
||||||
|
<div className="text-sm text-blue-800">
|
||||||
|
<p className="font-medium mb-2">Combined Story Mode:</p>
|
||||||
|
<ul className="list-disc list-inside space-y-1 text-blue-700">
|
||||||
|
<li>All URLs will be scraped and their content combined into one story</li>
|
||||||
|
<li>Story title and author will be taken from the first URL</li>
|
||||||
|
<li>Import will fail if any URL has no content (title/author can be empty)</li>
|
||||||
|
<li>You'll be redirected to the story creation page to review and edit</li>
|
||||||
|
{urls.split('\n').filter(url => url.trim().length > 0).length > 50 && (
|
||||||
|
<li className="text-yellow-700 font-medium">⚠️ Large imports (50+ URLs) may take several minutes and could be truncated if too large</li>
|
||||||
|
)}
|
||||||
|
</ul>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
{error && (
|
{error && (
|
||||||
<div className="bg-red-50 border border-red-200 rounded-md p-4">
|
<div className="bg-red-50 border border-red-200 rounded-md p-4">
|
||||||
<div className="flex">
|
<div className="flex">
|
||||||
@@ -192,14 +277,25 @@ export default function BulkImportPage() {
|
|||||||
</button>
|
</button>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
{isLoading && (
|
{/* Progress Component */}
|
||||||
|
{showProgress && sessionId && (
|
||||||
|
<BulkImportProgress
|
||||||
|
sessionId={sessionId}
|
||||||
|
onComplete={handleProgressComplete}
|
||||||
|
onError={handleProgressError}
|
||||||
|
combineMode={combineIntoOne}
|
||||||
|
/>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{/* Fallback loading indicator if progress isn't shown yet */}
|
||||||
|
{isLoading && !showProgress && (
|
||||||
<div className="bg-blue-50 border border-blue-200 rounded-md p-4">
|
<div className="bg-blue-50 border border-blue-200 rounded-md p-4">
|
||||||
<div className="flex items-center">
|
<div className="flex items-center">
|
||||||
<div className="animate-spin rounded-full h-5 w-5 border-b-2 border-blue-600 mr-3"></div>
|
<div className="animate-spin rounded-full h-5 w-5 border-b-2 border-blue-600 mr-3"></div>
|
||||||
<div>
|
<div>
|
||||||
<p className="text-sm font-medium text-blue-800">Processing URLs...</p>
|
<p className="text-sm font-medium text-blue-800">Starting import...</p>
|
||||||
<p className="text-sm text-blue-600">
|
<p className="text-sm text-blue-600">
|
||||||
This may take a few minutes depending on the number of URLs and response times of the source websites.
|
Preparing to process {urls.split('\n').filter(url => url.trim().length > 0).length} URLs.
|
||||||
</p>
|
</p>
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
|
|||||||
432
frontend/src/app/stories/import/epub/page.tsx
Normal file
432
frontend/src/app/stories/import/epub/page.tsx
Normal file
@@ -0,0 +1,432 @@
|
|||||||
|
'use client';
|
||||||
|
|
||||||
|
import { useState } from 'react';
|
||||||
|
import { useRouter } from 'next/navigation';
|
||||||
|
import Link from 'next/link';
|
||||||
|
import { ArrowLeftIcon, DocumentArrowUpIcon } from '@heroicons/react/24/outline';
|
||||||
|
import Button from '@/components/ui/Button';
|
||||||
|
import { Input } from '@/components/ui/Input';
|
||||||
|
|
||||||
|
interface EPUBImportResponse {
|
||||||
|
success: boolean;
|
||||||
|
message: string;
|
||||||
|
storyId?: string;
|
||||||
|
storyTitle?: string;
|
||||||
|
totalChapters?: number;
|
||||||
|
wordCount?: number;
|
||||||
|
readingPosition?: any;
|
||||||
|
warnings?: string[];
|
||||||
|
errors?: string[];
|
||||||
|
}
|
||||||
|
|
||||||
|
export default function EPUBImportPage() {
|
||||||
|
const router = useRouter();
|
||||||
|
const [selectedFile, setSelectedFile] = useState<File | null>(null);
|
||||||
|
const [isLoading, setIsLoading] = useState(false);
|
||||||
|
const [isValidating, setIsValidating] = useState(false);
|
||||||
|
const [validationResult, setValidationResult] = useState<any>(null);
|
||||||
|
const [importResult, setImportResult] = useState<EPUBImportResponse | null>(null);
|
||||||
|
const [error, setError] = useState<string | null>(null);
|
||||||
|
|
||||||
|
// Import options
|
||||||
|
const [authorName, setAuthorName] = useState<string>('');
|
||||||
|
const [seriesName, setSeriesName] = useState<string>('');
|
||||||
|
const [seriesVolume, setSeriesVolume] = useState<string>('');
|
||||||
|
const [tags, setTags] = useState<string>('');
|
||||||
|
const [preserveReadingPosition, setPreserveReadingPosition] = useState(true);
|
||||||
|
const [overwriteExisting, setOverwriteExisting] = useState(false);
|
||||||
|
const [createMissingAuthor, setCreateMissingAuthor] = useState(true);
|
||||||
|
const [createMissingSeries, setCreateMissingSeries] = useState(true);
|
||||||
|
|
||||||
|
const handleFileChange = async (e: React.ChangeEvent<HTMLInputElement>) => {
|
||||||
|
const file = e.target.files?.[0];
|
||||||
|
if (file) {
|
||||||
|
setSelectedFile(file);
|
||||||
|
setValidationResult(null);
|
||||||
|
setImportResult(null);
|
||||||
|
setError(null);
|
||||||
|
|
||||||
|
if (file.name.toLowerCase().endsWith('.epub')) {
|
||||||
|
await validateFile(file);
|
||||||
|
} else {
|
||||||
|
setError('Please select a valid EPUB file (.epub extension)');
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const validateFile = async (file: File) => {
|
||||||
|
setIsValidating(true);
|
||||||
|
try {
|
||||||
|
const token = localStorage.getItem('auth-token');
|
||||||
|
const formData = new FormData();
|
||||||
|
formData.append('file', file);
|
||||||
|
|
||||||
|
const response = await fetch('/api/stories/epub/validate', {
|
||||||
|
method: 'POST',
|
||||||
|
headers: {
|
||||||
|
'Authorization': token ? `Bearer ${token}` : '',
|
||||||
|
},
|
||||||
|
body: formData,
|
||||||
|
});
|
||||||
|
|
||||||
|
if (response.ok) {
|
||||||
|
const result = await response.json();
|
||||||
|
setValidationResult(result);
|
||||||
|
if (!result.valid) {
|
||||||
|
setError('EPUB file validation failed: ' + result.errors.join(', '));
|
||||||
|
}
|
||||||
|
} else if (response.status === 401 || response.status === 403) {
|
||||||
|
setError('Authentication required. Please log in.');
|
||||||
|
} else {
|
||||||
|
setError('Failed to validate EPUB file');
|
||||||
|
}
|
||||||
|
} catch (err) {
|
||||||
|
setError('Error validating EPUB file: ' + (err as Error).message);
|
||||||
|
} finally {
|
||||||
|
setIsValidating(false);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleSubmit = async (e: React.FormEvent) => {
|
||||||
|
e.preventDefault();
|
||||||
|
|
||||||
|
if (!selectedFile) {
|
||||||
|
setError('Please select an EPUB file');
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (validationResult && !validationResult.valid) {
|
||||||
|
setError('Cannot import invalid EPUB file');
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
setIsLoading(true);
|
||||||
|
setError(null);
|
||||||
|
|
||||||
|
try {
|
||||||
|
const token = localStorage.getItem('auth-token');
|
||||||
|
const formData = new FormData();
|
||||||
|
formData.append('file', selectedFile);
|
||||||
|
|
||||||
|
if (authorName) formData.append('authorName', authorName);
|
||||||
|
if (seriesName) formData.append('seriesName', seriesName);
|
||||||
|
if (seriesVolume) formData.append('seriesVolume', seriesVolume);
|
||||||
|
if (tags) {
|
||||||
|
const tagList = tags.split(',').map(t => t.trim()).filter(t => t.length > 0);
|
||||||
|
tagList.forEach(tag => formData.append('tags', tag));
|
||||||
|
}
|
||||||
|
|
||||||
|
formData.append('preserveReadingPosition', preserveReadingPosition.toString());
|
||||||
|
formData.append('overwriteExisting', overwriteExisting.toString());
|
||||||
|
formData.append('createMissingAuthor', createMissingAuthor.toString());
|
||||||
|
formData.append('createMissingSeries', createMissingSeries.toString());
|
||||||
|
|
||||||
|
const response = await fetch('/api/stories/epub/import', {
|
||||||
|
method: 'POST',
|
||||||
|
headers: {
|
||||||
|
'Authorization': token ? `Bearer ${token}` : '',
|
||||||
|
},
|
||||||
|
body: formData,
|
||||||
|
});
|
||||||
|
|
||||||
|
const result = await response.json();
|
||||||
|
|
||||||
|
if (response.ok && result.success) {
|
||||||
|
setImportResult(result);
|
||||||
|
} else if (response.status === 401 || response.status === 403) {
|
||||||
|
setError('Authentication required. Please log in.');
|
||||||
|
} else {
|
||||||
|
setError(result.message || 'Failed to import EPUB');
|
||||||
|
}
|
||||||
|
} catch (err) {
|
||||||
|
setError('Error importing EPUB: ' + (err as Error).message);
|
||||||
|
} finally {
|
||||||
|
setIsLoading(false);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const resetForm = () => {
|
||||||
|
setSelectedFile(null);
|
||||||
|
setValidationResult(null);
|
||||||
|
setImportResult(null);
|
||||||
|
setError(null);
|
||||||
|
setAuthorName('');
|
||||||
|
setSeriesName('');
|
||||||
|
setSeriesVolume('');
|
||||||
|
setTags('');
|
||||||
|
};
|
||||||
|
|
||||||
|
if (importResult?.success) {
|
||||||
|
return (
|
||||||
|
<div className="container mx-auto px-4 py-8">
|
||||||
|
<div className="mb-6">
|
||||||
|
<Link
|
||||||
|
href="/stories"
|
||||||
|
className="inline-flex items-center text-blue-600 hover:text-blue-800 mb-4"
|
||||||
|
>
|
||||||
|
<ArrowLeftIcon className="h-4 w-4 mr-2" />
|
||||||
|
Back to Stories
|
||||||
|
</Link>
|
||||||
|
<h1 className="text-3xl font-bold text-gray-900 dark:text-white">
|
||||||
|
EPUB Import Successful
|
||||||
|
</h1>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="bg-white dark:bg-gray-800 border border-gray-200 dark:border-gray-700 rounded-lg p-6">
|
||||||
|
<div className="mb-6">
|
||||||
|
<h2 className="text-xl font-semibold text-green-600 mb-2">Import Completed</h2>
|
||||||
|
<p className="text-gray-600 dark:text-gray-300">
|
||||||
|
Your EPUB has been successfully imported into StoryCove.
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
<div>
|
||||||
|
<div className="space-y-4">
|
||||||
|
<div>
|
||||||
|
<span className="font-semibold text-gray-700 dark:text-gray-300">Story Title:</span>
|
||||||
|
<p className="text-gray-900 dark:text-white">{importResult.storyTitle}</p>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{importResult.wordCount && (
|
||||||
|
<div>
|
||||||
|
<span className="font-semibold text-gray-700 dark:text-gray-300">Word Count:</span>
|
||||||
|
<p className="text-gray-900 dark:text-white">{importResult.wordCount.toLocaleString()} words</p>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{importResult.totalChapters && (
|
||||||
|
<div>
|
||||||
|
<span className="font-semibold text-gray-700 dark:text-gray-300">Chapters:</span>
|
||||||
|
<p className="text-gray-900 dark:text-white">{importResult.totalChapters}</p>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{importResult.warnings && importResult.warnings.length > 0 && (
|
||||||
|
<div className="bg-yellow-50 border border-yellow-200 rounded-md p-4">
|
||||||
|
<strong className="text-yellow-800">Warnings:</strong>
|
||||||
|
<ul className="list-disc list-inside mt-2 text-yellow-700">
|
||||||
|
{importResult.warnings.map((warning, index) => (
|
||||||
|
<li key={index}>{warning}</li>
|
||||||
|
))}
|
||||||
|
</ul>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
<div className="flex gap-4 mt-6">
|
||||||
|
<Button
|
||||||
|
onClick={() => router.push(`/stories/${importResult.storyId}`)}
|
||||||
|
className="bg-blue-600 hover:bg-blue-700 text-white"
|
||||||
|
>
|
||||||
|
View Story
|
||||||
|
</Button>
|
||||||
|
<Button
|
||||||
|
onClick={resetForm}
|
||||||
|
variant="secondary"
|
||||||
|
>
|
||||||
|
Import Another EPUB
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className="container mx-auto px-4 py-8">
|
||||||
|
<div className="mb-6">
|
||||||
|
<Link
|
||||||
|
href="/stories"
|
||||||
|
className="inline-flex items-center text-blue-600 hover:text-blue-800 mb-4"
|
||||||
|
>
|
||||||
|
<ArrowLeftIcon className="h-4 w-4 mr-2" />
|
||||||
|
Back to Stories
|
||||||
|
</Link>
|
||||||
|
<h1 className="text-3xl font-bold text-gray-900 dark:text-white">
|
||||||
|
Import EPUB
|
||||||
|
</h1>
|
||||||
|
<p className="text-gray-600 dark:text-gray-300 mt-2">
|
||||||
|
Upload an EPUB file to import it as a story into your library.
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{error && (
|
||||||
|
<div className="bg-red-50 border border-red-200 rounded-md p-4 mb-6">
|
||||||
|
<p className="text-red-700">{error}</p>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
<form onSubmit={handleSubmit} className="space-y-6">
|
||||||
|
{/* File Upload */}
|
||||||
|
<div className="bg-white dark:bg-gray-800 border border-gray-200 dark:border-gray-700 rounded-lg p-6">
|
||||||
|
<div className="mb-4">
|
||||||
|
<h3 className="text-lg font-semibold mb-2">Select EPUB File</h3>
|
||||||
|
<p className="text-gray-600 dark:text-gray-300">
|
||||||
|
Choose an EPUB file from your device to import.
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
<div className="space-y-4">
|
||||||
|
<div>
|
||||||
|
<label htmlFor="epub-file" className="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">EPUB File</label>
|
||||||
|
<Input
|
||||||
|
id="epub-file"
|
||||||
|
type="file"
|
||||||
|
accept=".epub,application/epub+zip"
|
||||||
|
onChange={handleFileChange}
|
||||||
|
disabled={isLoading || isValidating}
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{selectedFile && (
|
||||||
|
<div className="flex items-center gap-2">
|
||||||
|
<DocumentArrowUpIcon className="h-5 w-5" />
|
||||||
|
<span className="text-sm text-gray-600">
|
||||||
|
{selectedFile.name} ({(selectedFile.size / 1024 / 1024).toFixed(2)} MB)
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{isValidating && (
|
||||||
|
<div className="text-sm text-blue-600">
|
||||||
|
Validating EPUB file...
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{validationResult && (
|
||||||
|
<div className="text-sm">
|
||||||
|
{validationResult.valid ? (
|
||||||
|
<span className="inline-flex items-center px-2 py-1 rounded text-xs font-medium bg-green-100 text-green-800">
|
||||||
|
Valid EPUB
|
||||||
|
</span>
|
||||||
|
) : (
|
||||||
|
<span className="inline-flex items-center px-2 py-1 rounded text-xs font-medium bg-red-100 text-red-800">
|
||||||
|
Invalid EPUB
|
||||||
|
</span>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Import Options */}
|
||||||
|
<div className="bg-white dark:bg-gray-800 border border-gray-200 dark:border-gray-700 rounded-lg p-6">
|
||||||
|
<div className="mb-4">
|
||||||
|
<h3 className="text-lg font-semibold mb-2">Import Options</h3>
|
||||||
|
<p className="text-gray-600 dark:text-gray-300">
|
||||||
|
Configure how the EPUB should be imported.
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
<div className="space-y-4">
|
||||||
|
<div>
|
||||||
|
<label htmlFor="author-name" className="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Author Name (Override)</label>
|
||||||
|
<Input
|
||||||
|
id="author-name"
|
||||||
|
value={authorName}
|
||||||
|
onChange={(e) => setAuthorName(e.target.value)}
|
||||||
|
placeholder="Leave empty to use EPUB metadata"
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div>
|
||||||
|
<label htmlFor="series-name" className="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Series Name</label>
|
||||||
|
<Input
|
||||||
|
id="series-name"
|
||||||
|
value={seriesName}
|
||||||
|
onChange={(e) => setSeriesName(e.target.value)}
|
||||||
|
placeholder="Optional: Add to a series"
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{seriesName && (
|
||||||
|
<div>
|
||||||
|
<label htmlFor="series-volume" className="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Series Volume</label>
|
||||||
|
<Input
|
||||||
|
id="series-volume"
|
||||||
|
type="number"
|
||||||
|
value={seriesVolume}
|
||||||
|
onChange={(e) => setSeriesVolume(e.target.value)}
|
||||||
|
placeholder="Volume number in series"
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
<div>
|
||||||
|
<label htmlFor="tags" className="block text-sm font-medium text-gray-700 dark:text-gray-300 mb-1">Tags</label>
|
||||||
|
<Input
|
||||||
|
id="tags"
|
||||||
|
value={tags}
|
||||||
|
onChange={(e) => setTags(e.target.value)}
|
||||||
|
placeholder="Comma-separated tags (e.g., fantasy, adventure, romance)"
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="space-y-3">
|
||||||
|
<div className="flex items-center">
|
||||||
|
<input
|
||||||
|
type="checkbox"
|
||||||
|
id="preserve-reading-position"
|
||||||
|
checked={preserveReadingPosition}
|
||||||
|
onChange={(e) => setPreserveReadingPosition(e.target.checked)}
|
||||||
|
className="mr-2"
|
||||||
|
/>
|
||||||
|
<label htmlFor="preserve-reading-position" className="text-sm text-gray-700 dark:text-gray-300">
|
||||||
|
Preserve reading position from EPUB metadata
|
||||||
|
</label>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="flex items-center">
|
||||||
|
<input
|
||||||
|
type="checkbox"
|
||||||
|
id="create-missing-author"
|
||||||
|
checked={createMissingAuthor}
|
||||||
|
onChange={(e) => setCreateMissingAuthor(e.target.checked)}
|
||||||
|
className="mr-2"
|
||||||
|
/>
|
||||||
|
<label htmlFor="create-missing-author" className="text-sm text-gray-700 dark:text-gray-300">
|
||||||
|
Create author if not found
|
||||||
|
</label>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="flex items-center">
|
||||||
|
<input
|
||||||
|
type="checkbox"
|
||||||
|
id="create-missing-series"
|
||||||
|
checked={createMissingSeries}
|
||||||
|
onChange={(e) => setCreateMissingSeries(e.target.checked)}
|
||||||
|
className="mr-2"
|
||||||
|
/>
|
||||||
|
<label htmlFor="create-missing-series" className="text-sm text-gray-700 dark:text-gray-300">
|
||||||
|
Create series if not found
|
||||||
|
</label>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="flex items-center">
|
||||||
|
<input
|
||||||
|
type="checkbox"
|
||||||
|
id="overwrite-existing"
|
||||||
|
checked={overwriteExisting}
|
||||||
|
onChange={(e) => setOverwriteExisting(e.target.checked)}
|
||||||
|
className="mr-2"
|
||||||
|
/>
|
||||||
|
<label htmlFor="overwrite-existing" className="text-sm text-gray-700 dark:text-gray-300">
|
||||||
|
Overwrite existing story with same title and author
|
||||||
|
</label>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Submit Button */}
|
||||||
|
<div className="flex justify-end">
|
||||||
|
<Button
|
||||||
|
type="submit"
|
||||||
|
disabled={!selectedFile || isLoading || isValidating || (validationResult && !validationResult.valid)}
|
||||||
|
className="bg-blue-600 hover:bg-blue-700 text-white"
|
||||||
|
>
|
||||||
|
{isLoading ? 'Importing...' : 'Import EPUB'}
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
</form>
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
}
|
||||||
207
frontend/src/components/BulkImportProgress.tsx
Normal file
207
frontend/src/components/BulkImportProgress.tsx
Normal file
@@ -0,0 +1,207 @@
|
|||||||
|
'use client';
|
||||||
|
|
||||||
|
import { useEffect, useState } from 'react';
|
||||||
|
|
||||||
|
interface ProgressUpdate {
|
||||||
|
type: 'progress' | 'completed' | 'error' | 'connected';
|
||||||
|
current: number;
|
||||||
|
total: number;
|
||||||
|
message: string;
|
||||||
|
url?: string;
|
||||||
|
title?: string;
|
||||||
|
author?: string;
|
||||||
|
wordCount?: number;
|
||||||
|
totalWordCount?: number;
|
||||||
|
error?: string;
|
||||||
|
sessionId?: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface BulkImportProgressProps {
|
||||||
|
sessionId: string;
|
||||||
|
onComplete?: (data?: any) => void;
|
||||||
|
onError?: (error: string) => void;
|
||||||
|
combineMode?: boolean;
|
||||||
|
}
|
||||||
|
|
||||||
|
export default function BulkImportProgress({
|
||||||
|
sessionId,
|
||||||
|
onComplete,
|
||||||
|
onError,
|
||||||
|
combineMode = false
|
||||||
|
}: BulkImportProgressProps) {
|
||||||
|
const [progress, setProgress] = useState<ProgressUpdate>({
|
||||||
|
type: 'progress',
|
||||||
|
current: 0,
|
||||||
|
total: 1,
|
||||||
|
message: 'Connecting...'
|
||||||
|
});
|
||||||
|
const [isConnected, setIsConnected] = useState(false);
|
||||||
|
const [recentActivities, setRecentActivities] = useState<string[]>([]);
|
||||||
|
|
||||||
|
useEffect(() => {
|
||||||
|
const eventSource = new EventSource(`/scrape/bulk/progress?sessionId=${sessionId}`);
|
||||||
|
|
||||||
|
eventSource.onmessage = (event) => {
|
||||||
|
try {
|
||||||
|
const data: ProgressUpdate = JSON.parse(event.data);
|
||||||
|
|
||||||
|
if (data.type === 'connected') {
|
||||||
|
setIsConnected(true);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
setProgress(data);
|
||||||
|
|
||||||
|
// Add to recent activities (keep last 5)
|
||||||
|
if (data.message) {
|
||||||
|
setRecentActivities(prev => [
|
||||||
|
data.message,
|
||||||
|
...prev.slice(0, 4)
|
||||||
|
]);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (data.type === 'completed') {
|
||||||
|
setTimeout(() => {
|
||||||
|
onComplete?.(data);
|
||||||
|
eventSource.close();
|
||||||
|
}, 2000); // Show completion message for 2 seconds
|
||||||
|
} else if (data.type === 'error') {
|
||||||
|
onError?.(data.error || 'Unknown error occurred');
|
||||||
|
eventSource.close();
|
||||||
|
}
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Failed to parse progress update:', error);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
eventSource.onerror = (error) => {
|
||||||
|
console.error('EventSource error:', error);
|
||||||
|
setIsConnected(false);
|
||||||
|
onError?.('Connection to progress stream failed');
|
||||||
|
eventSource.close();
|
||||||
|
};
|
||||||
|
|
||||||
|
return () => {
|
||||||
|
eventSource.close();
|
||||||
|
};
|
||||||
|
}, [sessionId, onComplete, onError]);
|
||||||
|
|
||||||
|
const progressPercentage = progress.total > 0
|
||||||
|
? Math.round((progress.current / progress.total) * 100)
|
||||||
|
: 0;
|
||||||
|
|
||||||
|
const getStatusColor = () => {
|
||||||
|
switch (progress.type) {
|
||||||
|
case 'completed': return 'bg-green-600';
|
||||||
|
case 'error': return 'bg-red-600';
|
||||||
|
default: return 'bg-blue-600';
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const getStatusIcon = () => {
|
||||||
|
switch (progress.type) {
|
||||||
|
case 'completed': return '✓';
|
||||||
|
case 'error': return '✗';
|
||||||
|
default: return null;
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className="bg-white border border-gray-200 rounded-lg p-6">
|
||||||
|
<div className="mb-4">
|
||||||
|
<div className="flex items-center justify-between mb-2">
|
||||||
|
<h3 className="text-lg font-medium text-gray-900">
|
||||||
|
{combineMode ? 'Combining Stories' : 'Bulk Import Progress'}
|
||||||
|
</h3>
|
||||||
|
<div className="flex items-center gap-2">
|
||||||
|
{!isConnected && (
|
||||||
|
<div className="h-2 w-2 bg-yellow-400 rounded-full animate-pulse"></div>
|
||||||
|
)}
|
||||||
|
<span className="text-sm text-gray-600">
|
||||||
|
{progress.current} of {progress.total}
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Progress Bar */}
|
||||||
|
<div className="w-full bg-gray-200 rounded-full h-3 mb-3">
|
||||||
|
<div
|
||||||
|
className={`h-3 rounded-full transition-all duration-500 ${getStatusColor()}`}
|
||||||
|
style={{ width: `${progressPercentage}%` }}
|
||||||
|
></div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Progress Percentage */}
|
||||||
|
<div className="flex items-center justify-between">
|
||||||
|
<span className="text-sm font-medium text-gray-900">
|
||||||
|
{progressPercentage}%
|
||||||
|
</span>
|
||||||
|
{progress.type === 'completed' && (
|
||||||
|
<span className="text-green-600 font-medium">
|
||||||
|
{getStatusIcon()} Complete
|
||||||
|
</span>
|
||||||
|
)}
|
||||||
|
{progress.type === 'error' && (
|
||||||
|
<span className="text-red-600 font-medium">
|
||||||
|
{getStatusIcon()} Error
|
||||||
|
</span>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Current Status Message */}
|
||||||
|
<div className="mb-4">
|
||||||
|
<div className="flex items-center gap-2">
|
||||||
|
{progress.type === 'progress' && (
|
||||||
|
<div className="animate-spin rounded-full h-4 w-4 border-b-2 border-blue-600"></div>
|
||||||
|
)}
|
||||||
|
<p className="text-sm text-gray-700">{progress.message}</p>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Word Count for Combine Mode */}
|
||||||
|
{combineMode && progress.totalWordCount !== undefined && (
|
||||||
|
<p className="text-sm text-gray-500 mt-1">
|
||||||
|
Total words collected: {progress.totalWordCount.toLocaleString()}
|
||||||
|
</p>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Current URL being processed */}
|
||||||
|
{progress.url && (
|
||||||
|
<div className="mb-4 p-3 bg-gray-50 rounded-md">
|
||||||
|
<p className="text-sm text-gray-600 mb-1">Currently processing:</p>
|
||||||
|
<p className="text-sm font-mono text-gray-800 truncate">{progress.url}</p>
|
||||||
|
{progress.title && progress.author && (
|
||||||
|
<p className="text-sm text-gray-600 mt-1">
|
||||||
|
"{progress.title}" by {progress.author}
|
||||||
|
{progress.wordCount && (
|
||||||
|
<span className="ml-2 text-gray-500">
|
||||||
|
({progress.wordCount.toLocaleString()} words)
|
||||||
|
</span>
|
||||||
|
)}
|
||||||
|
</p>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{/* Recent Activities */}
|
||||||
|
{recentActivities.length > 0 && (
|
||||||
|
<div>
|
||||||
|
<h4 className="text-sm font-medium text-gray-900 mb-2">Recent Activity</h4>
|
||||||
|
<div className="space-y-1 max-h-32 overflow-y-auto">
|
||||||
|
{recentActivities.map((activity, index) => (
|
||||||
|
<p
|
||||||
|
key={index}
|
||||||
|
className={`text-xs text-gray-600 ${
|
||||||
|
index === 0 ? 'font-medium text-gray-800' : ''
|
||||||
|
}`}
|
||||||
|
>
|
||||||
|
{activity}
|
||||||
|
</p>
|
||||||
|
))}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
}
|
||||||
@@ -227,7 +227,7 @@ export default function CollectionForm({
|
|||||||
<input
|
<input
|
||||||
id="coverImage"
|
id="coverImage"
|
||||||
type="file"
|
type="file"
|
||||||
accept="image/jpeg,image/png,image/webp"
|
accept="image/jpeg,image/png"
|
||||||
onChange={handleCoverImageChange}
|
onChange={handleCoverImageChange}
|
||||||
className="w-full px-3 py-2 border theme-border rounded-lg theme-card theme-text focus:outline-none focus:ring-2 focus:ring-theme-accent"
|
className="w-full px-3 py-2 border theme-border rounded-lg theme-card theme-text focus:outline-none focus:ring-2 focus:ring-theme-accent"
|
||||||
/>
|
/>
|
||||||
|
|||||||
@@ -26,6 +26,11 @@ export default function Header() {
|
|||||||
label: 'Import from URL',
|
label: 'Import from URL',
|
||||||
description: 'Import a single story from a website'
|
description: 'Import a single story from a website'
|
||||||
},
|
},
|
||||||
|
{
|
||||||
|
href: '/stories/import/epub',
|
||||||
|
label: 'Import EPUB',
|
||||||
|
description: 'Import a story from an EPUB file'
|
||||||
|
},
|
||||||
{
|
{
|
||||||
href: '/stories/import/bulk',
|
href: '/stories/import/bulk',
|
||||||
label: 'Bulk Import',
|
label: 'Bulk Import',
|
||||||
@@ -165,6 +170,13 @@ export default function Header() {
|
|||||||
>
|
>
|
||||||
Import from URL
|
Import from URL
|
||||||
</Link>
|
</Link>
|
||||||
|
<Link
|
||||||
|
href="/stories/import/epub"
|
||||||
|
className="block theme-text hover:theme-accent transition-colors text-sm py-1"
|
||||||
|
onClick={() => setIsMenuOpen(false)}
|
||||||
|
>
|
||||||
|
Import EPUB
|
||||||
|
</Link>
|
||||||
<Link
|
<Link
|
||||||
href="/stories/import/bulk"
|
href="/stories/import/bulk"
|
||||||
className="block theme-text hover:theme-accent transition-colors text-sm py-1"
|
className="block theme-text hover:theme-accent transition-colors text-sm py-1"
|
||||||
|
|||||||
@@ -20,9 +20,12 @@ export default function RichTextEditor({
|
|||||||
}: RichTextEditorProps) {
|
}: RichTextEditorProps) {
|
||||||
const [viewMode, setViewMode] = useState<'visual' | 'html'>('visual');
|
const [viewMode, setViewMode] = useState<'visual' | 'html'>('visual');
|
||||||
const [htmlValue, setHtmlValue] = useState(value);
|
const [htmlValue, setHtmlValue] = useState(value);
|
||||||
|
const [isMaximized, setIsMaximized] = useState(false);
|
||||||
|
const [containerHeight, setContainerHeight] = useState(300); // Default height in pixels
|
||||||
const previewRef = useRef<HTMLDivElement>(null);
|
const previewRef = useRef<HTMLDivElement>(null);
|
||||||
const visualTextareaRef = useRef<HTMLTextAreaElement>(null);
|
const visualTextareaRef = useRef<HTMLTextAreaElement>(null);
|
||||||
const visualDivRef = useRef<HTMLDivElement>(null);
|
const visualDivRef = useRef<HTMLDivElement>(null);
|
||||||
|
const containerRef = useRef<HTMLDivElement>(null);
|
||||||
const [isUserTyping, setIsUserTyping] = useState(false);
|
const [isUserTyping, setIsUserTyping] = useState(false);
|
||||||
|
|
||||||
// Utility functions for cursor position preservation
|
// Utility functions for cursor position preservation
|
||||||
@@ -60,6 +63,62 @@ export default function RichTextEditor({
|
|||||||
}
|
}
|
||||||
};
|
};
|
||||||
|
|
||||||
|
// Maximize/minimize functionality
|
||||||
|
const toggleMaximize = () => {
|
||||||
|
if (!isMaximized) {
|
||||||
|
// Store current height before maximizing
|
||||||
|
if (containerRef.current) {
|
||||||
|
setContainerHeight(containerRef.current.scrollHeight || containerHeight);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
setIsMaximized(!isMaximized);
|
||||||
|
};
|
||||||
|
|
||||||
|
// Handle manual resize when dragging resize handle
|
||||||
|
const handleMouseDown = (e: React.MouseEvent) => {
|
||||||
|
if (isMaximized) return; // Don't allow resize when maximized
|
||||||
|
|
||||||
|
e.preventDefault();
|
||||||
|
const startY = e.clientY;
|
||||||
|
const startHeight = containerHeight;
|
||||||
|
|
||||||
|
const handleMouseMove = (e: MouseEvent) => {
|
||||||
|
const deltaY = e.clientY - startY;
|
||||||
|
const newHeight = Math.max(200, Math.min(800, startHeight + deltaY)); // Min 200px, Max 800px
|
||||||
|
setContainerHeight(newHeight);
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleMouseUp = () => {
|
||||||
|
document.removeEventListener('mousemove', handleMouseMove);
|
||||||
|
document.removeEventListener('mouseup', handleMouseUp);
|
||||||
|
};
|
||||||
|
|
||||||
|
document.addEventListener('mousemove', handleMouseMove);
|
||||||
|
document.addEventListener('mouseup', handleMouseUp);
|
||||||
|
};
|
||||||
|
|
||||||
|
// Escape key handler for maximized mode
|
||||||
|
useEffect(() => {
|
||||||
|
const handleEscapeKey = (e: KeyboardEvent) => {
|
||||||
|
if (e.key === 'Escape' && isMaximized) {
|
||||||
|
setIsMaximized(false);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
if (isMaximized) {
|
||||||
|
document.addEventListener('keydown', handleEscapeKey);
|
||||||
|
// Prevent body from scrolling when maximized
|
||||||
|
document.body.style.overflow = 'hidden';
|
||||||
|
} else {
|
||||||
|
document.body.style.overflow = '';
|
||||||
|
}
|
||||||
|
|
||||||
|
return () => {
|
||||||
|
document.removeEventListener('keydown', handleEscapeKey);
|
||||||
|
document.body.style.overflow = '';
|
||||||
|
};
|
||||||
|
}, [isMaximized]);
|
||||||
|
|
||||||
// Set initial content when component mounts
|
// Set initial content when component mounts
|
||||||
useEffect(() => {
|
useEffect(() => {
|
||||||
const div = visualDivRef.current;
|
const div = visualDivRef.current;
|
||||||
@@ -439,6 +498,17 @@ export default function RichTextEditor({
|
|||||||
</div>
|
</div>
|
||||||
|
|
||||||
<div className="flex items-center gap-1">
|
<div className="flex items-center gap-1">
|
||||||
|
<Button
|
||||||
|
type="button"
|
||||||
|
size="sm"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={toggleMaximize}
|
||||||
|
title={isMaximized ? "Minimize editor" : "Maximize editor"}
|
||||||
|
className="font-mono"
|
||||||
|
>
|
||||||
|
{isMaximized ? "⊡" : "⊞"}
|
||||||
|
</Button>
|
||||||
|
<div className="w-px h-4 bg-gray-300 mx-1" />
|
||||||
<Button
|
<Button
|
||||||
type="button"
|
type="button"
|
||||||
size="sm"
|
size="sm"
|
||||||
@@ -504,40 +574,160 @@ export default function RichTextEditor({
|
|||||||
</div>
|
</div>
|
||||||
|
|
||||||
{/* Editor */}
|
{/* Editor */}
|
||||||
<div className="border theme-border rounded-b-lg overflow-hidden">
|
<div
|
||||||
{viewMode === 'visual' ? (
|
className={`relative border theme-border rounded-b-lg ${
|
||||||
<div className="relative">
|
isMaximized ? 'fixed inset-4 z-50 bg-white dark:bg-gray-900 shadow-2xl' : ''
|
||||||
<div
|
}`}
|
||||||
ref={visualDivRef}
|
style={isMaximized ? {} : { height: containerHeight }}
|
||||||
contentEditable
|
>
|
||||||
onInput={handleVisualContentChange}
|
<div
|
||||||
onPaste={handlePaste}
|
ref={containerRef}
|
||||||
className="p-3 min-h-[300px] focus:outline-none focus:ring-0 whitespace-pre-wrap"
|
className="h-full flex flex-col overflow-hidden"
|
||||||
style={{ minHeight: '300px' }}
|
>
|
||||||
suppressContentEditableWarning={true}
|
{/* Maximized toolbar (shown when maximized) */}
|
||||||
/>
|
{isMaximized && (
|
||||||
{!value && (
|
<div className="flex items-center justify-between p-2 theme-card border-b theme-border">
|
||||||
<div
|
<div className="flex items-center gap-2">
|
||||||
className="absolute top-3 left-3 text-gray-500 dark:text-gray-400 pointer-events-none select-none"
|
<Button
|
||||||
style={{ minHeight: '300px' }}
|
type="button"
|
||||||
>
|
size="sm"
|
||||||
{placeholder}
|
variant="ghost"
|
||||||
|
onClick={() => setViewMode('visual')}
|
||||||
|
className={viewMode === 'visual' ? 'theme-accent-bg text-white' : ''}
|
||||||
|
>
|
||||||
|
Visual
|
||||||
|
</Button>
|
||||||
|
<Button
|
||||||
|
type="button"
|
||||||
|
size="sm"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => setViewMode('html')}
|
||||||
|
className={viewMode === 'html' ? 'theme-accent-bg text-white' : ''}
|
||||||
|
>
|
||||||
|
HTML
|
||||||
|
</Button>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
|
<div className="flex items-center gap-1">
|
||||||
|
<Button
|
||||||
|
type="button"
|
||||||
|
size="sm"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={toggleMaximize}
|
||||||
|
title="Minimize editor"
|
||||||
|
className="font-mono"
|
||||||
|
>
|
||||||
|
⊡
|
||||||
|
</Button>
|
||||||
|
<div className="w-px h-4 bg-gray-300 mx-1" />
|
||||||
|
<Button
|
||||||
|
type="button"
|
||||||
|
size="sm"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => formatText('strong')}
|
||||||
|
title="Bold"
|
||||||
|
className="font-bold"
|
||||||
|
>
|
||||||
|
B
|
||||||
|
</Button>
|
||||||
|
<Button
|
||||||
|
type="button"
|
||||||
|
size="sm"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => formatText('em')}
|
||||||
|
title="Italic"
|
||||||
|
className="italic"
|
||||||
|
>
|
||||||
|
I
|
||||||
|
</Button>
|
||||||
|
<div className="w-px h-4 bg-gray-300 mx-1" />
|
||||||
|
<Button
|
||||||
|
type="button"
|
||||||
|
size="sm"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => formatText('h1')}
|
||||||
|
title="Heading 1"
|
||||||
|
className="text-lg font-bold"
|
||||||
|
>
|
||||||
|
H1
|
||||||
|
</Button>
|
||||||
|
<Button
|
||||||
|
type="button"
|
||||||
|
size="sm"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => formatText('h2')}
|
||||||
|
title="Heading 2"
|
||||||
|
className="text-base font-bold"
|
||||||
|
>
|
||||||
|
H2
|
||||||
|
</Button>
|
||||||
|
<Button
|
||||||
|
type="button"
|
||||||
|
size="sm"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => formatText('h3')}
|
||||||
|
title="Heading 3"
|
||||||
|
className="text-sm font-bold"
|
||||||
|
>
|
||||||
|
H3
|
||||||
|
</Button>
|
||||||
|
<div className="w-px h-4 bg-gray-300 mx-1" />
|
||||||
|
<Button
|
||||||
|
type="button"
|
||||||
|
size="sm"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => formatText('p')}
|
||||||
|
title="Paragraph"
|
||||||
|
>
|
||||||
|
P
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{/* Editor content */}
|
||||||
|
<div className="flex-1 overflow-hidden">
|
||||||
|
{viewMode === 'visual' ? (
|
||||||
|
<div className="relative h-full">
|
||||||
|
<div
|
||||||
|
ref={visualDivRef}
|
||||||
|
contentEditable
|
||||||
|
onInput={handleVisualContentChange}
|
||||||
|
onPaste={handlePaste}
|
||||||
|
className="p-3 h-full overflow-y-auto focus:outline-none focus:ring-0 whitespace-pre-wrap resize-none"
|
||||||
|
suppressContentEditableWarning={true}
|
||||||
|
/>
|
||||||
|
{!value && (
|
||||||
|
<div className="absolute top-3 left-3 text-gray-500 dark:text-gray-400 pointer-events-none select-none">
|
||||||
|
{placeholder}
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
) : (
|
||||||
|
<Textarea
|
||||||
|
value={htmlValue}
|
||||||
|
onChange={handleHtmlChange}
|
||||||
|
placeholder="<p>Write your HTML content here...</p>"
|
||||||
|
className="border-0 rounded-none focus:ring-0 font-mono text-sm h-full resize-none"
|
||||||
|
/>
|
||||||
)}
|
)}
|
||||||
</div>
|
</div>
|
||||||
) : (
|
</div>
|
||||||
<Textarea
|
|
||||||
value={htmlValue}
|
{/* Resize handle (only show when not maximized) */}
|
||||||
onChange={handleHtmlChange}
|
{!isMaximized && (
|
||||||
placeholder="<p>Write your HTML content here...</p>"
|
<div
|
||||||
rows={12}
|
onMouseDown={handleMouseDown}
|
||||||
className="border-0 rounded-none focus:ring-0 font-mono text-sm"
|
className="absolute bottom-0 left-0 right-0 h-2 cursor-ns-resize bg-gray-200 dark:bg-gray-700 hover:bg-gray-300 dark:hover:bg-gray-600 transition-colors flex items-center justify-center"
|
||||||
/>
|
title="Drag to resize"
|
||||||
|
>
|
||||||
|
<div className="w-8 h-0.5 bg-gray-400 dark:bg-gray-500 rounded-full"></div>
|
||||||
|
</div>
|
||||||
)}
|
)}
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
{/* Preview for HTML mode */}
|
{/* Preview for HTML mode (only show when not maximized) */}
|
||||||
{viewMode === 'html' && value && (
|
{viewMode === 'html' && value && !isMaximized && (
|
||||||
<div className="space-y-2">
|
<div className="space-y-2">
|
||||||
<h4 className="text-sm font-medium theme-header">Preview:</h4>
|
<h4 className="text-sm font-medium theme-header">Preview:</h4>
|
||||||
<div
|
<div
|
||||||
@@ -561,6 +751,10 @@ export default function RichTextEditor({
|
|||||||
<strong>HTML mode:</strong> Edit HTML source directly for advanced formatting.
|
<strong>HTML mode:</strong> Edit HTML source directly for advanced formatting.
|
||||||
Allowed tags: p, br, div, span, strong, em, b, i, u, s, h1-h6, ul, ol, li, blockquote, and more.
|
Allowed tags: p, br, div, span, strong, em, b, i, u, s, h1-h6, ul, ol, li, blockquote, and more.
|
||||||
</p>
|
</p>
|
||||||
|
<p>
|
||||||
|
<strong>Tips:</strong> Use the ⊞ button to maximize the editor for larger stories.
|
||||||
|
Drag the resize handle at the bottom to adjust height. Press Escape to exit maximized mode.
|
||||||
|
</p>
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
);
|
);
|
||||||
|
|||||||
@@ -32,7 +32,8 @@ export default function ImageUpload({
|
|||||||
if (rejection.errors?.[0]?.code === 'file-too-large') {
|
if (rejection.errors?.[0]?.code === 'file-too-large') {
|
||||||
setError(`File is too large. Maximum size is ${maxSizeMB}MB.`);
|
setError(`File is too large. Maximum size is ${maxSizeMB}MB.`);
|
||||||
} else if (rejection.errors?.[0]?.code === 'file-invalid-type') {
|
} else if (rejection.errors?.[0]?.code === 'file-invalid-type') {
|
||||||
setError('Invalid file type. Please select an image file.');
|
const allowedTypes = accept.split(',').map(type => type.trim()).join(', ');
|
||||||
|
setError(`Invalid file type. Supported formats: ${allowedTypes.replace(/image\//g, '').toUpperCase()}.`);
|
||||||
} else {
|
} else {
|
||||||
setError('File rejected. Please try another file.');
|
setError('File rejected. Please try another file.');
|
||||||
}
|
}
|
||||||
@@ -41,18 +42,31 @@ export default function ImageUpload({
|
|||||||
|
|
||||||
const file = acceptedFiles[0];
|
const file = acceptedFiles[0];
|
||||||
if (file) {
|
if (file) {
|
||||||
|
// Additional client-side validation for file type
|
||||||
|
const allowedTypes = accept.split(',').map(type => type.trim());
|
||||||
|
if (!allowedTypes.includes(file.type)) {
|
||||||
|
const supportedFormats = allowedTypes.map(type => type.replace('image/', '').toUpperCase()).join(', ');
|
||||||
|
setError(`Invalid file type. Your file is ${file.type}. Supported formats: ${supportedFormats}.`);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
// Create preview
|
// Create preview
|
||||||
const previewUrl = URL.createObjectURL(file);
|
const previewUrl = URL.createObjectURL(file);
|
||||||
setPreview(previewUrl);
|
setPreview(previewUrl);
|
||||||
onImageSelect(file);
|
onImageSelect(file);
|
||||||
}
|
}
|
||||||
}, [onImageSelect, maxSizeMB]);
|
}, [onImageSelect, maxSizeMB, accept]);
|
||||||
|
|
||||||
|
// Build proper accept object for dropzone based on specific MIME types
|
||||||
|
const acceptTypes = accept.split(',').map(type => type.trim());
|
||||||
|
const dropzoneAccept = acceptTypes.reduce((acc, type) => {
|
||||||
|
acc[type] = []; // Empty array means accept files with this MIME type
|
||||||
|
return acc;
|
||||||
|
}, {} as Record<string, string[]>);
|
||||||
|
|
||||||
const { getRootProps, getInputProps, isDragActive } = useDropzone({
|
const { getRootProps, getInputProps, isDragActive } = useDropzone({
|
||||||
onDrop,
|
onDrop,
|
||||||
accept: {
|
accept: dropzoneAccept,
|
||||||
'image/*': accept.split(',').map(type => type.trim()),
|
|
||||||
},
|
|
||||||
maxFiles: 1,
|
maxFiles: 1,
|
||||||
maxSize: maxSizeMB * 1024 * 1024, // Convert MB to bytes
|
maxSize: maxSizeMB * 1024 * 1024, // Convert MB to bytes
|
||||||
});
|
});
|
||||||
@@ -123,7 +137,7 @@ export default function ImageUpload({
|
|||||||
)}
|
)}
|
||||||
</div>
|
</div>
|
||||||
<p className="text-sm text-gray-500">
|
<p className="text-sm text-gray-500">
|
||||||
Supports JPEG, PNG, WebP up to {maxSizeMB}MB
|
Supports {acceptTypes.map(type => type.replace('image/', '').toUpperCase()).join(', ')} up to {maxSizeMB}MB
|
||||||
</p>
|
</p>
|
||||||
</div>
|
</div>
|
||||||
)}
|
)}
|
||||||
|
|||||||
@@ -235,6 +235,16 @@
|
|||||||
"requiresJavaScript": true
|
"requiresJavaScript": true
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
},
|
||||||
|
|
||||||
|
"wanderinginn.com": {
|
||||||
|
"story": {
|
||||||
|
"title": "h1.entry-title",
|
||||||
|
"author": "pirateaba",
|
||||||
|
"content": ".entry-content",
|
||||||
|
"summary": "meta[property='og:description']",
|
||||||
|
"summaryAttribute": "content"
|
||||||
|
}
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
|
|
||||||
@@ -329,6 +339,10 @@
|
|||||||
"fanfiction.net": {
|
"fanfiction.net": {
|
||||||
"note": "Older site with simpler HTML structure",
|
"note": "Older site with simpler HTML structure",
|
||||||
"warning": "Known to block IPs for aggressive scraping"
|
"warning": "Known to block IPs for aggressive scraping"
|
||||||
|
},
|
||||||
|
"wanderinginn.com": {
|
||||||
|
"note": "WordPress-based site with consistent structure",
|
||||||
|
"author": "All stories by pirateaba - uses text pattern matching for content extraction"
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
@@ -1,6 +1,6 @@
|
|||||||
export interface SiteConfig {
|
export interface SiteConfig {
|
||||||
story: StorySelectors;
|
story: StorySelectors;
|
||||||
authorPage: AuthorPageSelectors;
|
authorPage?: AuthorPageSelectors;
|
||||||
}
|
}
|
||||||
|
|
||||||
export interface StorySelectors {
|
export interface StorySelectors {
|
||||||
@@ -13,6 +13,7 @@ export interface StorySelectors {
|
|||||||
multiPage?: MultiPageConfig;
|
multiPage?: MultiPageConfig;
|
||||||
titleFallback?: string;
|
titleFallback?: string;
|
||||||
titleFallbackAttribute?: string;
|
titleFallbackAttribute?: string;
|
||||||
|
contentFallback?: string;
|
||||||
titleTransform?: string;
|
titleTransform?: string;
|
||||||
summaryAttribute?: string;
|
summaryAttribute?: string;
|
||||||
coverImageAttribute?: string;
|
coverImageAttribute?: string;
|
||||||
|
|||||||
File diff suppressed because one or more lines are too long
13
nginx.conf
13
nginx.conf
@@ -39,9 +39,10 @@ http {
|
|||||||
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
|
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
|
||||||
proxy_set_header X-Forwarded-Proto $scheme;
|
proxy_set_header X-Forwarded-Proto $scheme;
|
||||||
proxy_cache_bypass $http_upgrade;
|
proxy_cache_bypass $http_upgrade;
|
||||||
proxy_connect_timeout 60s;
|
# Extended timeouts for bulk scraping operations
|
||||||
proxy_send_timeout 60s;
|
proxy_connect_timeout 900s;
|
||||||
proxy_read_timeout 60s;
|
proxy_send_timeout 900s;
|
||||||
|
proxy_read_timeout 900s;
|
||||||
}
|
}
|
||||||
|
|
||||||
# Backend API routes (fallback for all other /api/ routes)
|
# Backend API routes (fallback for all other /api/ routes)
|
||||||
@@ -51,9 +52,9 @@ http {
|
|||||||
proxy_set_header X-Real-IP $remote_addr;
|
proxy_set_header X-Real-IP $remote_addr;
|
||||||
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
|
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
|
||||||
proxy_set_header X-Forwarded-Proto $scheme;
|
proxy_set_header X-Forwarded-Proto $scheme;
|
||||||
proxy_connect_timeout 60s;
|
proxy_connect_timeout 900s;
|
||||||
proxy_send_timeout 60s;
|
proxy_send_timeout 900s;
|
||||||
proxy_read_timeout 60s;
|
proxy_read_timeout 900s;
|
||||||
}
|
}
|
||||||
|
|
||||||
# Static image serving
|
# Static image serving
|
||||||
|
|||||||
239
package-lock.json
generated
239
package-lock.json
generated
@@ -5,7 +5,238 @@
|
|||||||
"packages": {
|
"packages": {
|
||||||
"": {
|
"": {
|
||||||
"dependencies": {
|
"dependencies": {
|
||||||
"cheerio": "^1.1.2"
|
"@anthropic-ai/claude-code": "^1.0.70",
|
||||||
|
"cheerio": "^1.1.2",
|
||||||
|
"g": "^2.0.1"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@anthropic-ai/claude-code": {
|
||||||
|
"version": "1.0.70",
|
||||||
|
"resolved": "https://registry.npmjs.org/@anthropic-ai/claude-code/-/claude-code-1.0.70.tgz",
|
||||||
|
"integrity": "sha512-gJ/bdT/XQ/hp5EKM0QoOWj/eKmK3wvs1TotTLq1unqahiB6B+EAQeRy/uvxv2Ua9nI8p5Bogw8hXB1uUmAHb+A==",
|
||||||
|
"license": "SEE LICENSE IN README.md",
|
||||||
|
"bin": {
|
||||||
|
"claude": "cli.js"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">=18.0.0"
|
||||||
|
},
|
||||||
|
"optionalDependencies": {
|
||||||
|
"@img/sharp-darwin-arm64": "^0.33.5",
|
||||||
|
"@img/sharp-darwin-x64": "^0.33.5",
|
||||||
|
"@img/sharp-linux-arm": "^0.33.5",
|
||||||
|
"@img/sharp-linux-arm64": "^0.33.5",
|
||||||
|
"@img/sharp-linux-x64": "^0.33.5",
|
||||||
|
"@img/sharp-win32-x64": "^0.33.5"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@img/sharp-darwin-arm64": {
|
||||||
|
"version": "0.33.5",
|
||||||
|
"resolved": "https://registry.npmjs.org/@img/sharp-darwin-arm64/-/sharp-darwin-arm64-0.33.5.tgz",
|
||||||
|
"integrity": "sha512-UT4p+iz/2H4twwAoLCqfA9UH5pI6DggwKEGuaPy7nCVQ8ZsiY5PIcrRvD1DzuY3qYL07NtIQcWnBSY/heikIFQ==",
|
||||||
|
"cpu": [
|
||||||
|
"arm64"
|
||||||
|
],
|
||||||
|
"license": "Apache-2.0",
|
||||||
|
"optional": true,
|
||||||
|
"os": [
|
||||||
|
"darwin"
|
||||||
|
],
|
||||||
|
"engines": {
|
||||||
|
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://opencollective.com/libvips"
|
||||||
|
},
|
||||||
|
"optionalDependencies": {
|
||||||
|
"@img/sharp-libvips-darwin-arm64": "1.0.4"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@img/sharp-darwin-x64": {
|
||||||
|
"version": "0.33.5",
|
||||||
|
"resolved": "https://registry.npmjs.org/@img/sharp-darwin-x64/-/sharp-darwin-x64-0.33.5.tgz",
|
||||||
|
"integrity": "sha512-fyHac4jIc1ANYGRDxtiqelIbdWkIuQaI84Mv45KvGRRxSAa7o7d1ZKAOBaYbnepLC1WqxfpimdeWfvqqSGwR2Q==",
|
||||||
|
"cpu": [
|
||||||
|
"x64"
|
||||||
|
],
|
||||||
|
"license": "Apache-2.0",
|
||||||
|
"optional": true,
|
||||||
|
"os": [
|
||||||
|
"darwin"
|
||||||
|
],
|
||||||
|
"engines": {
|
||||||
|
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://opencollective.com/libvips"
|
||||||
|
},
|
||||||
|
"optionalDependencies": {
|
||||||
|
"@img/sharp-libvips-darwin-x64": "1.0.4"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@img/sharp-libvips-darwin-arm64": {
|
||||||
|
"version": "1.0.4",
|
||||||
|
"resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-arm64/-/sharp-libvips-darwin-arm64-1.0.4.tgz",
|
||||||
|
"integrity": "sha512-XblONe153h0O2zuFfTAbQYAX2JhYmDHeWikp1LM9Hul9gVPjFY427k6dFEcOL72O01QxQsWi761svJ/ev9xEDg==",
|
||||||
|
"cpu": [
|
||||||
|
"arm64"
|
||||||
|
],
|
||||||
|
"license": "LGPL-3.0-or-later",
|
||||||
|
"optional": true,
|
||||||
|
"os": [
|
||||||
|
"darwin"
|
||||||
|
],
|
||||||
|
"funding": {
|
||||||
|
"url": "https://opencollective.com/libvips"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@img/sharp-libvips-darwin-x64": {
|
||||||
|
"version": "1.0.4",
|
||||||
|
"resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-x64/-/sharp-libvips-darwin-x64-1.0.4.tgz",
|
||||||
|
"integrity": "sha512-xnGR8YuZYfJGmWPvmlunFaWJsb9T/AO2ykoP3Fz/0X5XV2aoYBPkX6xqCQvUTKKiLddarLaxpzNe+b1hjeWHAQ==",
|
||||||
|
"cpu": [
|
||||||
|
"x64"
|
||||||
|
],
|
||||||
|
"license": "LGPL-3.0-or-later",
|
||||||
|
"optional": true,
|
||||||
|
"os": [
|
||||||
|
"darwin"
|
||||||
|
],
|
||||||
|
"funding": {
|
||||||
|
"url": "https://opencollective.com/libvips"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@img/sharp-libvips-linux-arm": {
|
||||||
|
"version": "1.0.5",
|
||||||
|
"resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm/-/sharp-libvips-linux-arm-1.0.5.tgz",
|
||||||
|
"integrity": "sha512-gvcC4ACAOPRNATg/ov8/MnbxFDJqf/pDePbBnuBDcjsI8PssmjoKMAz4LtLaVi+OnSb5FK/yIOamqDwGmXW32g==",
|
||||||
|
"cpu": [
|
||||||
|
"arm"
|
||||||
|
],
|
||||||
|
"license": "LGPL-3.0-or-later",
|
||||||
|
"optional": true,
|
||||||
|
"os": [
|
||||||
|
"linux"
|
||||||
|
],
|
||||||
|
"funding": {
|
||||||
|
"url": "https://opencollective.com/libvips"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@img/sharp-libvips-linux-arm64": {
|
||||||
|
"version": "1.0.4",
|
||||||
|
"resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm64/-/sharp-libvips-linux-arm64-1.0.4.tgz",
|
||||||
|
"integrity": "sha512-9B+taZ8DlyyqzZQnoeIvDVR/2F4EbMepXMc/NdVbkzsJbzkUjhXv/70GQJ7tdLA4YJgNP25zukcxpX2/SueNrA==",
|
||||||
|
"cpu": [
|
||||||
|
"arm64"
|
||||||
|
],
|
||||||
|
"license": "LGPL-3.0-or-later",
|
||||||
|
"optional": true,
|
||||||
|
"os": [
|
||||||
|
"linux"
|
||||||
|
],
|
||||||
|
"funding": {
|
||||||
|
"url": "https://opencollective.com/libvips"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@img/sharp-libvips-linux-x64": {
|
||||||
|
"version": "1.0.4",
|
||||||
|
"resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-x64/-/sharp-libvips-linux-x64-1.0.4.tgz",
|
||||||
|
"integrity": "sha512-MmWmQ3iPFZr0Iev+BAgVMb3ZyC4KeFc3jFxnNbEPas60e1cIfevbtuyf9nDGIzOaW9PdnDciJm+wFFaTlj5xYw==",
|
||||||
|
"cpu": [
|
||||||
|
"x64"
|
||||||
|
],
|
||||||
|
"license": "LGPL-3.0-or-later",
|
||||||
|
"optional": true,
|
||||||
|
"os": [
|
||||||
|
"linux"
|
||||||
|
],
|
||||||
|
"funding": {
|
||||||
|
"url": "https://opencollective.com/libvips"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@img/sharp-linux-arm": {
|
||||||
|
"version": "0.33.5",
|
||||||
|
"resolved": "https://registry.npmjs.org/@img/sharp-linux-arm/-/sharp-linux-arm-0.33.5.tgz",
|
||||||
|
"integrity": "sha512-JTS1eldqZbJxjvKaAkxhZmBqPRGmxgu+qFKSInv8moZ2AmT5Yib3EQ1c6gp493HvrvV8QgdOXdyaIBrhvFhBMQ==",
|
||||||
|
"cpu": [
|
||||||
|
"arm"
|
||||||
|
],
|
||||||
|
"license": "Apache-2.0",
|
||||||
|
"optional": true,
|
||||||
|
"os": [
|
||||||
|
"linux"
|
||||||
|
],
|
||||||
|
"engines": {
|
||||||
|
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://opencollective.com/libvips"
|
||||||
|
},
|
||||||
|
"optionalDependencies": {
|
||||||
|
"@img/sharp-libvips-linux-arm": "1.0.5"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@img/sharp-linux-arm64": {
|
||||||
|
"version": "0.33.5",
|
||||||
|
"resolved": "https://registry.npmjs.org/@img/sharp-linux-arm64/-/sharp-linux-arm64-0.33.5.tgz",
|
||||||
|
"integrity": "sha512-JMVv+AMRyGOHtO1RFBiJy/MBsgz0x4AWrT6QoEVVTyh1E39TrCUpTRI7mx9VksGX4awWASxqCYLCV4wBZHAYxA==",
|
||||||
|
"cpu": [
|
||||||
|
"arm64"
|
||||||
|
],
|
||||||
|
"license": "Apache-2.0",
|
||||||
|
"optional": true,
|
||||||
|
"os": [
|
||||||
|
"linux"
|
||||||
|
],
|
||||||
|
"engines": {
|
||||||
|
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://opencollective.com/libvips"
|
||||||
|
},
|
||||||
|
"optionalDependencies": {
|
||||||
|
"@img/sharp-libvips-linux-arm64": "1.0.4"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@img/sharp-linux-x64": {
|
||||||
|
"version": "0.33.5",
|
||||||
|
"resolved": "https://registry.npmjs.org/@img/sharp-linux-x64/-/sharp-linux-x64-0.33.5.tgz",
|
||||||
|
"integrity": "sha512-opC+Ok5pRNAzuvq1AG0ar+1owsu842/Ab+4qvU879ippJBHvyY5n2mxF1izXqkPYlGuP/M556uh53jRLJmzTWA==",
|
||||||
|
"cpu": [
|
||||||
|
"x64"
|
||||||
|
],
|
||||||
|
"license": "Apache-2.0",
|
||||||
|
"optional": true,
|
||||||
|
"os": [
|
||||||
|
"linux"
|
||||||
|
],
|
||||||
|
"engines": {
|
||||||
|
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://opencollective.com/libvips"
|
||||||
|
},
|
||||||
|
"optionalDependencies": {
|
||||||
|
"@img/sharp-libvips-linux-x64": "1.0.4"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/@img/sharp-win32-x64": {
|
||||||
|
"version": "0.33.5",
|
||||||
|
"resolved": "https://registry.npmjs.org/@img/sharp-win32-x64/-/sharp-win32-x64-0.33.5.tgz",
|
||||||
|
"integrity": "sha512-MpY/o8/8kj+EcnxwvrP4aTJSWw/aZ7JIGR4aBeZkZw5B7/Jn+tY9/VNwtcoGmdT7GfggGIU4kygOMSbYnOrAbg==",
|
||||||
|
"cpu": [
|
||||||
|
"x64"
|
||||||
|
],
|
||||||
|
"license": "Apache-2.0 AND LGPL-3.0-or-later",
|
||||||
|
"optional": true,
|
||||||
|
"os": [
|
||||||
|
"win32"
|
||||||
|
],
|
||||||
|
"engines": {
|
||||||
|
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://opencollective.com/libvips"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
"node_modules/boolbase": {
|
"node_modules/boolbase": {
|
||||||
@@ -164,6 +395,12 @@
|
|||||||
"url": "https://github.com/fb55/entities?sponsor=1"
|
"url": "https://github.com/fb55/entities?sponsor=1"
|
||||||
}
|
}
|
||||||
},
|
},
|
||||||
|
"node_modules/g": {
|
||||||
|
"version": "2.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/g/-/g-2.0.1.tgz",
|
||||||
|
"integrity": "sha512-Fi6Ng5fZ/ANLQ15H11hCe+09sgUoNvDEBevVgx3KoYOhsH5iLNPn54hx0jPZ+3oSWr+xajnp2Qau9VmPsc7hTA==",
|
||||||
|
"license": "MIT"
|
||||||
|
},
|
||||||
"node_modules/htmlparser2": {
|
"node_modules/htmlparser2": {
|
||||||
"version": "10.0.0",
|
"version": "10.0.0",
|
||||||
"resolved": "https://registry.npmjs.org/htmlparser2/-/htmlparser2-10.0.0.tgz",
|
"resolved": "https://registry.npmjs.org/htmlparser2/-/htmlparser2-10.0.0.tgz",
|
||||||
|
|||||||
@@ -1,5 +1,7 @@
|
|||||||
{
|
{
|
||||||
"dependencies": {
|
"dependencies": {
|
||||||
"cheerio": "^1.1.2"
|
"@anthropic-ai/claude-code": "^1.0.70",
|
||||||
|
"cheerio": "^1.1.2",
|
||||||
|
"g": "^2.0.1"
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
Reference in New Issue
Block a user