Compare commits
24 Commits
feature/co
...
142d8328c2
| Author | SHA1 | Date | |
|---|---|---|---|
|
|
142d8328c2 | ||
|
|
c46108c317 | ||
|
|
75c207970d | ||
|
|
3b22d155db | ||
|
|
51e3d20c24 | ||
|
|
5d195b63ef | ||
|
|
5b3a9d183e | ||
|
|
379c8c170f | ||
|
|
090b858a54 | ||
|
|
b0c14d4b37 | ||
|
|
7227061d25 | ||
|
|
415eab07de | ||
|
|
e89331e059 | ||
|
|
370bef2f07 | ||
|
|
9e788c2018 | ||
|
|
590e2590d6 | ||
|
|
57859d7a84 | ||
|
|
5746001c4a | ||
|
|
c08082c0d6 | ||
|
|
860bf02d56 | ||
|
|
a501b27169 | ||
|
|
fcad028959 | ||
|
|
f95d7aa8bb | ||
| 5e8164c6a4 |
Binary file not shown.
|
Before Width: | Height: | Size: 37 KiB |
466
EPUB_IMPORT_EXPORT_SPECIFICATION.md
Normal file
466
EPUB_IMPORT_EXPORT_SPECIFICATION.md
Normal file
@@ -0,0 +1,466 @@
|
||||
# EPUB Import/Export Specification
|
||||
|
||||
## 🎉 Phase 1 & 2 Implementation Complete
|
||||
|
||||
**Status**: Both Phase 1 and Phase 2 fully implemented and operational as of August 2025
|
||||
|
||||
**Phase 1 Achievements**:
|
||||
- ✅ Complete EPUB import functionality with validation and error handling
|
||||
- ✅ Single story EPUB export with XML validation fixes
|
||||
- ✅ Reading position preservation using EPUB CFI standards
|
||||
- ✅ Full frontend UI integration with navigation and authentication
|
||||
- ✅ Moved export button to Story Detail View for better UX
|
||||
- ✅ Added EPUB import to main Add Story menu dropdown
|
||||
|
||||
**Phase 2 Enhancements**:
|
||||
- ✅ **Enhanced Cover Processing**: Automatic extraction and optimization of cover images during EPUB import
|
||||
- ✅ **Advanced Metadata Extraction**: Comprehensive extraction of subjects/tags, keywords, publisher, language, publication dates, and identifiers
|
||||
- ✅ **Collection EPUB Export**: Full collection export with table of contents, proper chapter structure, and metadata aggregation
|
||||
- ✅ **Image Validation**: Robust cover image processing with format detection, resizing, and storage management
|
||||
- ✅ **API Endpoints**: Complete REST API for both individual story and collection EPUB operations
|
||||
|
||||
## Overview
|
||||
|
||||
This specification defines the requirements and implementation details for importing and exporting EPUB files in StoryCove. The feature enables users to import stories from EPUB files and export their stories/collections as EPUB files with preserved reading positions.
|
||||
|
||||
## Scope
|
||||
|
||||
### In Scope
|
||||
- **EPUB Import**: Parse DRM-free EPUB files and import as stories
|
||||
- **EPUB Export**: Export individual stories and collections as EPUB files
|
||||
- **Reading Position Preservation**: Store and restore reading positions using EPUB standards
|
||||
- **Metadata Handling**: Extract and preserve story metadata (title, author, cover, etc.)
|
||||
- **Content Processing**: HTML content sanitization and formatting
|
||||
|
||||
### Out of Scope (Phase 1)
|
||||
- DRM-protected EPUB files (future consideration)
|
||||
- Real-time reading position sync between devices
|
||||
- Advanced EPUB features (audio, video, interactive content)
|
||||
- EPUB validation beyond basic structure
|
||||
|
||||
## Technical Architecture
|
||||
|
||||
### Backend Implementation
|
||||
- **Language**: Java (Spring Boot)
|
||||
- **Primary Library**: EPUBLib (nl.siegmann.epublib:epublib-core:3.1)
|
||||
- **Processing**: Server-side generation and parsing
|
||||
- **File Handling**: Multipart file upload for import, streaming download for export
|
||||
|
||||
### Dependencies
|
||||
```xml
|
||||
<dependency>
|
||||
<groupId>com.positiondev.epublib</groupId>
|
||||
<artifactId>epublib-core</artifactId>
|
||||
<version>3.1</version>
|
||||
</dependency>
|
||||
```
|
||||
|
||||
### Phase 1 Implementation Notes
|
||||
- **EPUBImportService**: Implemented with full validation, metadata extraction, and reading position handling
|
||||
- **EPUBExportService**: Implemented with XML validation fixes for EPUB reader compatibility
|
||||
- **ReadingPosition Entity**: Created with EPUB CFI support and database indexing
|
||||
- **Authentication**: All endpoints secured with JWT authentication and proper frontend integration
|
||||
- **UI Integration**: Export moved to Story Detail View, Import added to main navigation menu
|
||||
- **XML Compliance**: Fixed XHTML validation issues by properly formatting self-closing tags (`<br>` → `<br />`)
|
||||
|
||||
## EPUB Import Specification
|
||||
|
||||
### Supported Formats
|
||||
- **EPUB 2.0** and **EPUB 3.x** formats
|
||||
- **DRM-Free** files only
|
||||
- **Maximum file size**: 50MB
|
||||
- **Supported content**: Text-based stories with HTML content
|
||||
|
||||
### Import Process Flow
|
||||
1. **File Upload**: User uploads EPUB file via web interface
|
||||
2. **Validation**: Check file format, size, and basic EPUB structure
|
||||
3. **Parsing**: Extract metadata, content, and resources using EPUBLib
|
||||
4. **Content Processing**: Sanitize HTML content using existing Jsoup pipeline
|
||||
5. **Story Creation**: Create Story entity with extracted data
|
||||
6. **Preview**: Show extracted story details for user confirmation
|
||||
7. **Finalization**: Save story to database with imported metadata
|
||||
|
||||
### Metadata Mapping
|
||||
```java
|
||||
// EPUB Metadata → StoryCove Story Entity
|
||||
epub.getMetadata().getFirstTitle() → story.title
|
||||
epub.getMetadata().getAuthors().get(0) → story.authorName
|
||||
epub.getMetadata().getDescriptions().get(0) → story.summary
|
||||
epub.getCoverImage() → story.coverPath
|
||||
epub.getMetadata().getSubjects() → story.tags
|
||||
```
|
||||
|
||||
### Content Extraction
|
||||
- **Multi-chapter EPUBs**: Combine all content files into single HTML
|
||||
- **Chapter separation**: Insert `<hr>` or `<h2>` tags between chapters
|
||||
- **HTML sanitization**: Apply existing sanitization rules
|
||||
- **Image handling**: Extract and store cover images, inline images optional
|
||||
|
||||
### API Endpoints
|
||||
|
||||
#### POST /api/stories/import-epub
|
||||
```java
|
||||
@PostMapping("/import-epub")
|
||||
public ResponseEntity<?> importEPUB(@RequestParam("file") MultipartFile file) {
|
||||
// Implementation in EPUBImportService
|
||||
}
|
||||
```
|
||||
|
||||
**Request**: Multipart file upload
|
||||
**Response**:
|
||||
```json
|
||||
{
|
||||
"message": "EPUB imported successfully",
|
||||
"storyId": "uuid",
|
||||
"extractedData": {
|
||||
"title": "Story Title",
|
||||
"author": "Author Name",
|
||||
"summary": "Story description",
|
||||
"chapterCount": 12,
|
||||
"wordCount": 45000,
|
||||
"hasCovers": true
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## EPUB Export Specification
|
||||
|
||||
### Export Types
|
||||
1. **Single Story Export**: Convert one story to EPUB
|
||||
2. **Collection Export**: Multiple stories as single EPUB with chapters
|
||||
|
||||
### EPUB Structure Generation
|
||||
```
|
||||
story.epub
|
||||
├── mimetype
|
||||
├── META-INF/
|
||||
│ └── container.xml
|
||||
└── OEBPS/
|
||||
├── content.opf # Package metadata
|
||||
├── toc.ncx # Navigation
|
||||
├── stylesheet.css # Styling
|
||||
├── cover.html # Cover page
|
||||
├── chapter001.xhtml # Story content
|
||||
├── images/
|
||||
│ └── cover.jpg # Cover image
|
||||
└── fonts/ (optional)
|
||||
```
|
||||
|
||||
### Reading Position Implementation
|
||||
|
||||
#### EPUB 3 CFI (Canonical Fragment Identifier)
|
||||
```xml
|
||||
<!-- In content.opf metadata -->
|
||||
<meta property="epub-cfi" content="/6/4[chap01]!/4[body01]/10[para05]/3:142"/>
|
||||
<meta property="reading-percentage" content="0.65"/>
|
||||
<meta property="last-read-timestamp" content="2023-12-07T10:30:00Z"/>
|
||||
```
|
||||
|
||||
#### StoryCove Custom Metadata (Fallback)
|
||||
```xml
|
||||
<meta name="storycove:reading-chapter" content="3"/>
|
||||
<meta name="storycove:reading-paragraph" content="15"/>
|
||||
<meta name="storycove:reading-offset" content="142"/>
|
||||
<meta name="storycove:reading-percentage" content="0.65"/>
|
||||
```
|
||||
|
||||
#### CFI Generation Logic
|
||||
```java
|
||||
public String generateCFI(ReadingPosition position) {
|
||||
return String.format("/6/%d[chap%02d]!/4[body01]/%d[para%02d]/3:%d",
|
||||
(position.getChapterIndex() * 2) + 4,
|
||||
position.getChapterIndex(),
|
||||
(position.getParagraphIndex() * 2) + 4,
|
||||
position.getParagraphIndex(),
|
||||
position.getCharacterOffset());
|
||||
}
|
||||
```
|
||||
|
||||
### API Endpoints
|
||||
|
||||
#### GET /api/stories/{id}/export-epub
|
||||
```java
|
||||
@GetMapping("/{id}/export-epub")
|
||||
public ResponseEntity<StreamingResponseBody> exportStory(@PathVariable UUID id) {
|
||||
// Implementation in EPUBExportService
|
||||
}
|
||||
```
|
||||
|
||||
**Response**: EPUB file download with headers:
|
||||
```
|
||||
Content-Type: application/epub+zip
|
||||
Content-Disposition: attachment; filename="story-title.epub"
|
||||
```
|
||||
|
||||
#### GET /api/collections/{id}/export-epub
|
||||
```java
|
||||
@GetMapping("/{id}/export-epub")
|
||||
public ResponseEntity<StreamingResponseBody> exportCollection(@PathVariable UUID id) {
|
||||
// Implementation in EPUBExportService
|
||||
}
|
||||
```
|
||||
|
||||
**Response**: Multi-story EPUB with table of contents
|
||||
|
||||
## Data Models
|
||||
|
||||
### ReadingPosition Entity
|
||||
```java
|
||||
@Entity
|
||||
@Table(name = "reading_positions")
|
||||
public class ReadingPosition {
|
||||
@Id
|
||||
private UUID id;
|
||||
|
||||
@ManyToOne(fetch = FetchType.LAZY)
|
||||
@JoinColumn(name = "story_id")
|
||||
private Story story;
|
||||
|
||||
@Column(name = "chapter_index")
|
||||
private Integer chapterIndex = 0;
|
||||
|
||||
@Column(name = "paragraph_index")
|
||||
private Integer paragraphIndex = 0;
|
||||
|
||||
@Column(name = "character_offset")
|
||||
private Integer characterOffset = 0;
|
||||
|
||||
@Column(name = "progress_percentage")
|
||||
private Double progressPercentage = 0.0;
|
||||
|
||||
@Column(name = "epub_cfi")
|
||||
private String canonicalFragmentIdentifier;
|
||||
|
||||
@Column(name = "last_read_at")
|
||||
private LocalDateTime lastReadAt;
|
||||
|
||||
@Column(name = "device_identifier")
|
||||
private String deviceIdentifier;
|
||||
|
||||
// Constructors, getters, setters
|
||||
}
|
||||
```
|
||||
|
||||
### EPUB Import Request DTO
|
||||
```java
|
||||
public class EPUBImportRequest {
|
||||
private String filename;
|
||||
private Long fileSize;
|
||||
private Boolean preserveChapterStructure = true;
|
||||
private Boolean extractCover = true;
|
||||
private String targetCollectionId; // Optional: add to specific collection
|
||||
}
|
||||
```
|
||||
|
||||
### EPUB Export Options DTO
|
||||
```java
|
||||
public class EPUBExportOptions {
|
||||
private Boolean includeReadingPosition = true;
|
||||
private Boolean includeCoverImage = true;
|
||||
private Boolean includeMetadata = true;
|
||||
private String cssStylesheet; // Optional custom CSS
|
||||
private EPUBVersion version = EPUBVersion.EPUB3;
|
||||
}
|
||||
```
|
||||
|
||||
## Service Layer Architecture
|
||||
|
||||
### EPUBImportService
|
||||
```java
|
||||
@Service
|
||||
public class EPUBImportService {
|
||||
|
||||
// Core import method
|
||||
public Story importEPUBFile(MultipartFile file, EPUBImportRequest request);
|
||||
|
||||
// Helper methods
|
||||
private void validateEPUBFile(MultipartFile file);
|
||||
private Book parseEPUBStructure(InputStream inputStream);
|
||||
private Story extractStoryData(Book epub);
|
||||
private String combineChapterContent(Book epub);
|
||||
private void extractAndSaveCover(Book epub, Story story);
|
||||
private List<String> extractTags(Book epub);
|
||||
private ReadingPosition extractReadingPosition(Book epub);
|
||||
}
|
||||
```
|
||||
|
||||
### EPUBExportService
|
||||
```java
|
||||
@Service
|
||||
public class EPUBExportService {
|
||||
|
||||
// Core export methods
|
||||
public byte[] exportSingleStory(UUID storyId, EPUBExportOptions options);
|
||||
public byte[] exportCollection(UUID collectionId, EPUBExportOptions options);
|
||||
|
||||
// Helper methods
|
||||
private Book createEPUBStructure(Story story, ReadingPosition position);
|
||||
private Book createCollectionEPUB(Collection collection, List<ReadingPosition> positions);
|
||||
private void addReadingPositionMetadata(Book book, ReadingPosition position);
|
||||
private String generateCFI(ReadingPosition position);
|
||||
private Resource createChapterResource(Story story);
|
||||
private Resource createStylesheetResource();
|
||||
private void addCoverImage(Book book, Story story);
|
||||
}
|
||||
```
|
||||
|
||||
## Frontend Integration
|
||||
|
||||
### Import UI Flow
|
||||
1. **Upload Interface**: File input with EPUB validation
|
||||
2. **Progress Indicator**: Show parsing progress
|
||||
3. **Preview Screen**: Display extracted metadata for confirmation
|
||||
4. **Confirmation**: Allow editing of title, author, summary before saving
|
||||
5. **Success**: Redirect to created story
|
||||
|
||||
### Export UI Flow
|
||||
1. **Export Button**: Available on story detail and collection pages
|
||||
2. **Options Modal**: Allow selection of export options
|
||||
3. **Progress Indicator**: Show EPUB generation progress
|
||||
4. **Download**: Automatic file download on completion
|
||||
|
||||
### Frontend API Calls
|
||||
```typescript
|
||||
// Import EPUB
|
||||
const importEPUB = async (file: File) => {
|
||||
const formData = new FormData();
|
||||
formData.append('file', file);
|
||||
|
||||
const response = await fetch('/api/stories/import-epub', {
|
||||
method: 'POST',
|
||||
body: formData,
|
||||
});
|
||||
|
||||
return await response.json();
|
||||
};
|
||||
|
||||
// Export Story
|
||||
const exportStoryEPUB = async (storyId: string) => {
|
||||
const response = await fetch(`/api/stories/${storyId}/export-epub`, {
|
||||
method: 'GET',
|
||||
});
|
||||
|
||||
const blob = await response.blob();
|
||||
const url = window.URL.createObjectURL(blob);
|
||||
const a = document.createElement('a');
|
||||
a.href = url;
|
||||
a.download = `${storyTitle}.epub`;
|
||||
a.click();
|
||||
};
|
||||
```
|
||||
|
||||
## Error Handling
|
||||
|
||||
### Import Errors
|
||||
- **Invalid EPUB format**: "Invalid EPUB file format"
|
||||
- **File too large**: "File size exceeds 50MB limit"
|
||||
- **DRM protected**: "DRM-protected EPUBs not supported"
|
||||
- **Corrupted file**: "EPUB file appears to be corrupted"
|
||||
- **No content**: "EPUB contains no readable content"
|
||||
|
||||
### Export Errors
|
||||
- **Story not found**: "Story not found or access denied"
|
||||
- **Missing content**: "Story has no content to export"
|
||||
- **Generation failure**: "Failed to generate EPUB file"
|
||||
|
||||
## Security Considerations
|
||||
|
||||
### File Upload Security
|
||||
- **File type validation**: Verify EPUB MIME type and structure
|
||||
- **Size limits**: Enforce maximum file size limits
|
||||
- **Content sanitization**: Apply existing HTML sanitization
|
||||
- **Virus scanning**: Consider integration with antivirus scanning
|
||||
|
||||
### Content Security
|
||||
- **HTML sanitization**: Apply existing Jsoup rules to imported content
|
||||
- **Image validation**: Validate extracted cover images
|
||||
- **Metadata escaping**: Escape special characters in metadata
|
||||
|
||||
## Testing Strategy
|
||||
|
||||
### Unit Tests
|
||||
- EPUB parsing and validation logic
|
||||
- CFI generation and parsing
|
||||
- Metadata extraction accuracy
|
||||
- Content sanitization
|
||||
|
||||
### Integration Tests
|
||||
- End-to-end import/export workflow
|
||||
- Reading position preservation
|
||||
- Multi-story collection export
|
||||
- Error handling scenarios
|
||||
|
||||
### Test Data
|
||||
- Sample EPUB files for various scenarios
|
||||
- EPUBs with and without reading positions
|
||||
- Multi-chapter EPUBs
|
||||
- EPUBs with covers and metadata
|
||||
|
||||
## Performance Considerations
|
||||
|
||||
### Import Performance
|
||||
- **Streaming processing**: Process large EPUBs without loading entirely into memory
|
||||
- **Async processing**: Consider async import for large files
|
||||
- **Progress tracking**: Provide progress feedback for large imports
|
||||
|
||||
### Export Performance
|
||||
- **Caching**: Cache generated EPUBs for repeated exports
|
||||
- **Streaming**: Stream EPUB generation for large collections
|
||||
- **Resource optimization**: Optimize image and content sizes
|
||||
|
||||
## Future Enhancements (Out of Scope)
|
||||
|
||||
### Phase 2 Considerations
|
||||
- **DRM support**: Research legal and technical feasibility
|
||||
- **Reading position sync**: Real-time sync across devices
|
||||
- **Advanced EPUB features**: Enhanced typography, annotations
|
||||
- **Bulk operations**: Import/export multiple EPUBs
|
||||
- **EPUB validation**: Full EPUB compliance checking
|
||||
|
||||
### Integration Possibilities
|
||||
- **Cloud storage**: Export directly to Google Drive, Dropbox
|
||||
- **E-reader sync**: Direct sync with Kindle, Kobo devices
|
||||
- **Reading analytics**: Track reading patterns and statistics
|
||||
|
||||
## Implementation Phases
|
||||
|
||||
### Phase 1: Core Functionality ✅ **COMPLETED**
|
||||
- [x] Basic EPUB import (DRM-free)
|
||||
- [x] Single story export
|
||||
- [x] Reading position storage and retrieval
|
||||
- [x] Frontend UI integration
|
||||
|
||||
### Phase 2: Enhanced Features ✅ **COMPLETED**
|
||||
- [x] Collection export with table of contents
|
||||
- [x] Advanced metadata handling (subjects, keywords, publisher, language, etc.)
|
||||
- [x] Enhanced cover image processing for import/export
|
||||
- [x] Comprehensive error handling
|
||||
|
||||
### Phase 3: Advanced Features
|
||||
- [ ] DRM exploration (legal research required)
|
||||
- [ ] Reading position sync
|
||||
- [ ] Advanced EPUB features
|
||||
- [ ] Analytics and reporting
|
||||
|
||||
## Acceptance Criteria
|
||||
|
||||
### Import Success Criteria ✅ **COMPLETED**
|
||||
- [x] Successfully parse EPUB 2.0 and 3.x files
|
||||
- [x] Extract title, author, summary, and content accurately
|
||||
- [x] Preserve formatting and basic HTML structure
|
||||
- [x] Handle cover images correctly
|
||||
- [x] Import reading positions when present
|
||||
- [x] Provide clear error messages for invalid files
|
||||
|
||||
### Export Success Criteria ✅ **FULLY COMPLETED**
|
||||
- [x] Generate valid EPUB files compatible with major readers
|
||||
- [x] Include accurate metadata and content
|
||||
- [x] Embed reading positions using CFI standard
|
||||
- [x] Support single story export
|
||||
- [x] Support collection export with proper structure
|
||||
- [x] Generate proper table of contents for collections
|
||||
- [x] Include cover images when available
|
||||
|
||||
---
|
||||
|
||||
*This specification serves as the implementation guide for the EPUB import/export feature. All implementation decisions should reference this document for consistency and completeness.*
|
||||
13
README.md
13
README.md
@@ -131,9 +131,12 @@ cd backend
|
||||
### 🎨 **User Experience**
|
||||
- **Dark/Light Mode**: Automatic theme switching with system preference detection
|
||||
- **Responsive Design**: Optimized for desktop, tablet, and mobile
|
||||
- **Reading Mode**: Distraction-free reading interface
|
||||
- **Reading Mode**: Distraction-free reading interface with real-time progress tracking
|
||||
- **Reading Position Memory**: Character-based position tracking with smooth auto-scroll restoration
|
||||
- **Smart Tag Filtering**: Dynamic tag filters with live story counts in library view
|
||||
- **Keyboard Navigation**: Full keyboard accessibility
|
||||
- **Rich Text Editor**: Visual and source editing modes for story content
|
||||
- **Progress Indicators**: Visual reading progress bars and completion tracking
|
||||
|
||||
### 🔒 **Security & Administration**
|
||||
- **JWT Authentication**: Secure token-based authentication
|
||||
@@ -170,9 +173,9 @@ StoryCove uses a PostgreSQL database with the following core entities:
|
||||
|
||||
### **Stories**
|
||||
- **Primary Key**: UUID
|
||||
- **Fields**: title, summary, description, content_html, content_plain, source_url, word_count, rating, volume, cover_path
|
||||
- **Fields**: title, summary, description, content_html, content_plain, source_url, word_count, rating, volume, cover_path, reading_position, last_read_at
|
||||
- **Relationships**: Many-to-One with Author, Many-to-One with Series, Many-to-Many with Tags
|
||||
- **Features**: Automatic word count calculation, HTML sanitization, plain text extraction
|
||||
- **Features**: Automatic word count calculation, HTML sanitization, plain text extraction, reading progress tracking
|
||||
|
||||
### **Authors**
|
||||
- **Primary Key**: UUID
|
||||
@@ -214,7 +217,8 @@ StoryCove uses a PostgreSQL database with the following core entities:
|
||||
- `POST /{id}/rating` - Set story rating
|
||||
- `POST /{id}/tags/{tagId}` - Add tag to story
|
||||
- `DELETE /{id}/tags/{tagId}` - Remove tag from story
|
||||
- `GET /search` - Search stories (Typesense)
|
||||
- `POST /{id}/reading-progress` - Update reading position
|
||||
- `GET /search` - Search stories (Typesense with faceting)
|
||||
- `GET /search/suggestions` - Get search suggestions
|
||||
- `GET /author/{authorId}` - Stories by author
|
||||
- `GET /series/{seriesId}` - Stories in series
|
||||
@@ -295,6 +299,7 @@ All API endpoints use JSON format with proper HTTP status codes:
|
||||
- **Backend**: Spring Boot 3, Java 21, PostgreSQL, Typesense
|
||||
- **Infrastructure**: Docker, Docker Compose, Nginx
|
||||
- **Security**: JWT authentication, HTML sanitization, CORS
|
||||
- **Search**: Typesense with faceting and full-text search capabilities
|
||||
|
||||
### **Local Development Setup**
|
||||
|
||||
|
||||
1
backend/backend.log
Normal file
1
backend/backend.log
Normal file
@@ -0,0 +1 @@
|
||||
(eval):1: no such file or directory: ./mvnw
|
||||
@@ -84,6 +84,11 @@
|
||||
<artifactId>typesense-java</artifactId>
|
||||
<version>1.3.0</version>
|
||||
</dependency>
|
||||
<dependency>
|
||||
<groupId>com.positiondev.epublib</groupId>
|
||||
<artifactId>epublib-core</artifactId>
|
||||
<version>3.1</version>
|
||||
</dependency>
|
||||
|
||||
<!-- Test dependencies -->
|
||||
<dependency>
|
||||
|
||||
@@ -56,7 +56,10 @@ public class SecurityConfig {
|
||||
@Bean
|
||||
public CorsConfigurationSource corsConfigurationSource() {
|
||||
CorsConfiguration configuration = new CorsConfiguration();
|
||||
configuration.setAllowedOriginPatterns(Arrays.asList(allowedOrigins.split(",")));
|
||||
List<String> origins = Arrays.stream(allowedOrigins.split(","))
|
||||
.map(String::trim)
|
||||
.toList();
|
||||
configuration.setAllowedOriginPatterns(origins);
|
||||
configuration.setAllowedMethods(Arrays.asList("GET", "POST", "PUT", "PATCH", "DELETE", "OPTIONS"));
|
||||
configuration.setAllowedHeaders(List.of("*"));
|
||||
configuration.setAllowCredentials(true);
|
||||
|
||||
@@ -65,10 +65,12 @@ public class AuthorController {
|
||||
|
||||
@PostMapping
|
||||
public ResponseEntity<AuthorDto> createAuthor(@Valid @RequestBody CreateAuthorRequest request) {
|
||||
logger.info("Creating new author: {}", request.getName());
|
||||
Author author = new Author();
|
||||
updateAuthorFromRequest(author, request);
|
||||
|
||||
Author savedAuthor = authorService.create(author);
|
||||
logger.info("Successfully created author: {} (ID: {})", savedAuthor.getName(), savedAuthor.getId());
|
||||
return ResponseEntity.status(HttpStatus.CREATED).body(convertToDto(savedAuthor));
|
||||
}
|
||||
|
||||
@@ -81,13 +83,7 @@ public class AuthorController {
|
||||
@RequestParam(required = false, name = "authorRating") Integer rating,
|
||||
@RequestParam(required = false, name = "avatar") MultipartFile avatarFile) {
|
||||
|
||||
System.out.println("DEBUG: MULTIPART PUT called with:");
|
||||
System.out.println(" - name: " + name);
|
||||
System.out.println(" - notes: " + notes);
|
||||
System.out.println(" - urls: " + urls);
|
||||
System.out.println(" - rating: " + rating);
|
||||
System.out.println(" - avatar: " + (avatarFile != null ? avatarFile.getOriginalFilename() : "null"));
|
||||
|
||||
logger.info("Updating author with multipart data (ID: {})", id);
|
||||
try {
|
||||
Author existingAuthor = authorService.findById(id);
|
||||
|
||||
@@ -104,7 +100,6 @@ public class AuthorController {
|
||||
|
||||
// Handle rating update
|
||||
if (rating != null) {
|
||||
System.out.println("DEBUG: Setting author rating via PUT: " + rating);
|
||||
existingAuthor.setAuthorRating(rating);
|
||||
}
|
||||
|
||||
@@ -115,6 +110,7 @@ public class AuthorController {
|
||||
}
|
||||
|
||||
Author updatedAuthor = authorService.update(id, existingAuthor);
|
||||
logger.info("Successfully updated author: {} via multipart", updatedAuthor.getName());
|
||||
return ResponseEntity.ok(convertToDto(updatedAuthor));
|
||||
|
||||
} catch (Exception e) {
|
||||
@@ -125,31 +121,27 @@ public class AuthorController {
|
||||
@PutMapping(value = "/{id}", consumes = "application/json")
|
||||
public ResponseEntity<AuthorDto> updateAuthorJson(@PathVariable UUID id,
|
||||
@Valid @RequestBody UpdateAuthorRequest request) {
|
||||
System.out.println("DEBUG: JSON PUT called with:");
|
||||
System.out.println(" - name: " + request.getName());
|
||||
System.out.println(" - notes: " + request.getNotes());
|
||||
System.out.println(" - urls: " + request.getUrls());
|
||||
System.out.println(" - rating: " + request.getRating());
|
||||
logger.info("Updating author with JSON data: {} (ID: {})", request.getName(), id);
|
||||
|
||||
Author existingAuthor = authorService.findById(id);
|
||||
updateAuthorFromRequest(existingAuthor, request);
|
||||
|
||||
Author updatedAuthor = authorService.update(id, existingAuthor);
|
||||
logger.info("Successfully updated author: {} via JSON", updatedAuthor.getName());
|
||||
return ResponseEntity.ok(convertToDto(updatedAuthor));
|
||||
}
|
||||
|
||||
@PutMapping("/{id}")
|
||||
public ResponseEntity<String> updateAuthorGeneric(@PathVariable UUID id, HttpServletRequest request) {
|
||||
System.out.println("DEBUG: GENERIC PUT called!");
|
||||
System.out.println(" - Content-Type: " + request.getContentType());
|
||||
System.out.println(" - Method: " + request.getMethod());
|
||||
|
||||
return ResponseEntity.status(415).body("Unsupported Media Type. Expected multipart/form-data or application/json");
|
||||
}
|
||||
|
||||
@DeleteMapping("/{id}")
|
||||
public ResponseEntity<?> deleteAuthor(@PathVariable UUID id) {
|
||||
logger.info("Deleting author with ID: {}", id);
|
||||
authorService.delete(id);
|
||||
logger.info("Successfully deleted author with ID: {}", id);
|
||||
return ResponseEntity.ok(Map.of("message", "Author deleted successfully"));
|
||||
}
|
||||
|
||||
@@ -177,11 +169,8 @@ public class AuthorController {
|
||||
|
||||
@PostMapping("/{id}/rating")
|
||||
public ResponseEntity<AuthorDto> rateAuthor(@PathVariable UUID id, @RequestBody RatingRequest request) {
|
||||
System.out.println("DEBUG: Rating author " + id + " with rating " + request.getRating());
|
||||
Author author = authorService.setRating(id, request.getRating());
|
||||
System.out.println("DEBUG: After setRating, author rating is: " + author.getAuthorRating());
|
||||
AuthorDto dto = convertToDto(author);
|
||||
System.out.println("DEBUG: Final DTO rating is: " + dto.getAuthorRating());
|
||||
return ResponseEntity.ok(dto);
|
||||
}
|
||||
|
||||
@@ -211,9 +200,7 @@ public class AuthorController {
|
||||
@PostMapping("/{id}/test-rating/{rating}")
|
||||
public ResponseEntity<Map<String, Object>> testSetRating(@PathVariable UUID id, @PathVariable Integer rating) {
|
||||
try {
|
||||
System.out.println("DEBUG: Test setting rating " + rating + " for author " + id);
|
||||
Author author = authorService.setRating(id, rating);
|
||||
System.out.println("DEBUG: After test setRating, got: " + author.getAuthorRating());
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
@@ -231,13 +218,11 @@ public class AuthorController {
|
||||
@PostMapping("/{id}/test-put-rating")
|
||||
public ResponseEntity<Map<String, Object>> testPutWithRating(@PathVariable UUID id, @RequestParam Integer rating) {
|
||||
try {
|
||||
System.out.println("DEBUG: Test PUT with rating " + rating + " for author " + id);
|
||||
|
||||
Author existingAuthor = authorService.findById(id);
|
||||
existingAuthor.setAuthorRating(rating);
|
||||
Author updatedAuthor = authorService.update(id, existingAuthor);
|
||||
|
||||
System.out.println("DEBUG: After PUT update, rating is: " + updatedAuthor.getAuthorRating());
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
@@ -389,7 +374,6 @@ public class AuthorController {
|
||||
author.setUrls(updateReq.getUrls());
|
||||
}
|
||||
if (updateReq.getRating() != null) {
|
||||
System.out.println("DEBUG: Setting author rating via JSON: " + updateReq.getRating());
|
||||
author.setAuthorRating(updateReq.getRating());
|
||||
}
|
||||
}
|
||||
@@ -402,9 +386,6 @@ public class AuthorController {
|
||||
dto.setNotes(author.getNotes());
|
||||
dto.setAvatarImagePath(author.getAvatarImagePath());
|
||||
|
||||
// Debug logging for author rating
|
||||
System.out.println("DEBUG: Converting author " + author.getName() +
|
||||
" with rating: " + author.getAuthorRating());
|
||||
|
||||
dto.setAuthorRating(author.getAuthorRating());
|
||||
dto.setUrls(author.getUrls());
|
||||
@@ -415,7 +396,6 @@ public class AuthorController {
|
||||
// Calculate and set average story rating
|
||||
dto.setAverageStoryRating(authorService.calculateAverageStoryRating(author.getId()));
|
||||
|
||||
System.out.println("DEBUG: DTO authorRating set to: " + dto.getAuthorRating());
|
||||
|
||||
return dto;
|
||||
}
|
||||
|
||||
@@ -6,7 +6,10 @@ import com.storycove.entity.CollectionStory;
|
||||
import com.storycove.entity.Story;
|
||||
import com.storycove.entity.Tag;
|
||||
import com.storycove.service.CollectionService;
|
||||
import com.storycove.service.EPUBExportService;
|
||||
import com.storycove.service.ImageService;
|
||||
import com.storycove.service.ReadingTimeService;
|
||||
import com.storycove.service.TypesenseService;
|
||||
import jakarta.validation.Valid;
|
||||
import org.slf4j.Logger;
|
||||
import org.slf4j.LoggerFactory;
|
||||
@@ -28,12 +31,21 @@ public class CollectionController {
|
||||
|
||||
private final CollectionService collectionService;
|
||||
private final ImageService imageService;
|
||||
private final TypesenseService typesenseService;
|
||||
private final ReadingTimeService readingTimeService;
|
||||
private final EPUBExportService epubExportService;
|
||||
|
||||
@Autowired
|
||||
public CollectionController(CollectionService collectionService,
|
||||
ImageService imageService) {
|
||||
ImageService imageService,
|
||||
@Autowired(required = false) TypesenseService typesenseService,
|
||||
ReadingTimeService readingTimeService,
|
||||
EPUBExportService epubExportService) {
|
||||
this.collectionService = collectionService;
|
||||
this.imageService = imageService;
|
||||
this.typesenseService = typesenseService;
|
||||
this.readingTimeService = readingTimeService;
|
||||
this.epubExportService = epubExportService;
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -48,8 +60,6 @@ public class CollectionController {
|
||||
@RequestParam(required = false) List<String> tags,
|
||||
@RequestParam(defaultValue = "false") boolean archived) {
|
||||
|
||||
logger.info("COLLECTIONS: Search request - search='{}', tags={}, archived={}, page={}, limit={}",
|
||||
search, tags, archived, page, limit);
|
||||
|
||||
// MANDATORY: Use Typesense for all search/filter operations
|
||||
SearchResultDto<Collection> results = collectionService.searchCollections(search, tags, archived, page, limit);
|
||||
@@ -86,13 +96,14 @@ public class CollectionController {
|
||||
*/
|
||||
@PostMapping
|
||||
public ResponseEntity<Collection> createCollection(@Valid @RequestBody CreateCollectionRequest request) {
|
||||
logger.info("Creating new collection: {}", request.getName());
|
||||
Collection collection = collectionService.createCollection(
|
||||
request.getName(),
|
||||
request.getDescription(),
|
||||
request.getTagNames(),
|
||||
request.getStoryIds()
|
||||
);
|
||||
|
||||
logger.info("Successfully created collection: {} (ID: {})", collection.getName(), collection.getId());
|
||||
return ResponseEntity.status(HttpStatus.CREATED).body(collection);
|
||||
}
|
||||
|
||||
@@ -107,6 +118,7 @@ public class CollectionController {
|
||||
@RequestParam(required = false) List<UUID> storyIds,
|
||||
@RequestParam(required = false, name = "coverImage") MultipartFile coverImage) {
|
||||
|
||||
logger.info("Creating new collection with image: {}", name);
|
||||
try {
|
||||
// Create collection first
|
||||
Collection collection = collectionService.createCollection(name, description, tags, storyIds);
|
||||
@@ -120,6 +132,7 @@ public class CollectionController {
|
||||
);
|
||||
}
|
||||
|
||||
logger.info("Successfully created collection with image: {} (ID: {})", collection.getName(), collection.getId());
|
||||
return ResponseEntity.status(HttpStatus.CREATED).body(collection);
|
||||
|
||||
} catch (Exception e) {
|
||||
@@ -152,7 +165,9 @@ public class CollectionController {
|
||||
*/
|
||||
@DeleteMapping("/{id}")
|
||||
public ResponseEntity<Map<String, String>> deleteCollection(@PathVariable UUID id) {
|
||||
logger.info("Deleting collection with ID: {}", id);
|
||||
collectionService.deleteCollection(id);
|
||||
logger.info("Successfully deleted collection with ID: {}", id);
|
||||
return ResponseEntity.ok(Map.of("message", "Collection deleted successfully"));
|
||||
}
|
||||
|
||||
@@ -270,6 +285,114 @@ public class CollectionController {
|
||||
return ResponseEntity.ok(Map.of("message", "Cover removed successfully"));
|
||||
}
|
||||
|
||||
/**
|
||||
* POST /api/collections/reindex-typesense - Reindex all collections in Typesense
|
||||
*/
|
||||
@PostMapping("/reindex-typesense")
|
||||
public ResponseEntity<Map<String, Object>> reindexCollectionsTypesense() {
|
||||
try {
|
||||
List<Collection> allCollections = collectionService.findAllWithTags();
|
||||
if (typesenseService != null) {
|
||||
typesenseService.reindexAllCollections(allCollections);
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Successfully reindexed all collections",
|
||||
"count", allCollections.size()
|
||||
));
|
||||
} else {
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", false,
|
||||
"message", "Typesense service not available"
|
||||
));
|
||||
}
|
||||
} catch (Exception e) {
|
||||
logger.error("Failed to reindex collections", e);
|
||||
return ResponseEntity.badRequest().body(Map.of(
|
||||
"success", false,
|
||||
"error", e.getMessage()
|
||||
));
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* GET /api/collections/{id}/epub - Export collection as EPUB
|
||||
*/
|
||||
@GetMapping("/{id}/epub")
|
||||
public ResponseEntity<org.springframework.core.io.Resource> exportCollectionAsEPUB(@PathVariable UUID id) {
|
||||
logger.info("Exporting collection {} to EPUB", id);
|
||||
|
||||
try {
|
||||
Collection collection = collectionService.findById(id);
|
||||
List<Story> stories = collection.getCollectionStories().stream()
|
||||
.sorted((cs1, cs2) -> Integer.compare(cs1.getPosition(), cs2.getPosition()))
|
||||
.map(cs -> cs.getStory())
|
||||
.collect(java.util.stream.Collectors.toList());
|
||||
|
||||
if (stories.isEmpty()) {
|
||||
logger.warn("Collection {} contains no stories for export", id);
|
||||
return ResponseEntity.badRequest()
|
||||
.body(null);
|
||||
}
|
||||
|
||||
EPUBExportRequest request = new EPUBExportRequest();
|
||||
request.setIncludeCoverImage(true);
|
||||
request.setIncludeMetadata(true);
|
||||
request.setIncludeReadingPosition(false); // Collections don't have reading positions
|
||||
|
||||
org.springframework.core.io.Resource resource = epubExportService.exportCollectionAsEPUB(id, request);
|
||||
String filename = epubExportService.getCollectionEPUBFilename(collection);
|
||||
|
||||
logger.info("Successfully exported collection EPUB: {}", filename);
|
||||
|
||||
return ResponseEntity.ok()
|
||||
.header("Content-Disposition", "attachment; filename=\"" + filename + "\"")
|
||||
.header("Content-Type", "application/epub+zip")
|
||||
.body(resource);
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Error exporting collection EPUB: {}", e.getMessage(), e);
|
||||
return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).build();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* POST /api/collections/{id}/epub - Export collection as EPUB with custom options
|
||||
*/
|
||||
@PostMapping("/{id}/epub")
|
||||
public ResponseEntity<org.springframework.core.io.Resource> exportCollectionAsEPUBWithOptions(
|
||||
@PathVariable UUID id,
|
||||
@Valid @RequestBody EPUBExportRequest request) {
|
||||
logger.info("Exporting collection {} to EPUB with custom options", id);
|
||||
|
||||
try {
|
||||
Collection collection = collectionService.findById(id);
|
||||
List<Story> stories = collection.getCollectionStories().stream()
|
||||
.sorted((cs1, cs2) -> Integer.compare(cs1.getPosition(), cs2.getPosition()))
|
||||
.map(cs -> cs.getStory())
|
||||
.collect(java.util.stream.Collectors.toList());
|
||||
|
||||
if (stories.isEmpty()) {
|
||||
logger.warn("Collection {} contains no stories for export", id);
|
||||
return ResponseEntity.badRequest()
|
||||
.body(null);
|
||||
}
|
||||
|
||||
org.springframework.core.io.Resource resource = epubExportService.exportCollectionAsEPUB(id, request);
|
||||
String filename = epubExportService.getCollectionEPUBFilename(collection);
|
||||
|
||||
logger.info("Successfully exported collection EPUB with options: {}", filename);
|
||||
|
||||
return ResponseEntity.ok()
|
||||
.header("Content-Disposition", "attachment; filename=\"" + filename + "\"")
|
||||
.header("Content-Type", "application/epub+zip")
|
||||
.body(resource);
|
||||
|
||||
} catch (Exception e) {
|
||||
logger.error("Error exporting collection EPUB: {}", e.getMessage(), e);
|
||||
return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).build();
|
||||
}
|
||||
}
|
||||
|
||||
// Mapper methods
|
||||
|
||||
private CollectionDto mapToCollectionDto(Collection collection) {
|
||||
@@ -290,6 +413,11 @@ public class CollectionController {
|
||||
.toList());
|
||||
}
|
||||
|
||||
// Map tag names for search results
|
||||
if (collection.getTagNames() != null) {
|
||||
dto.setTagNames(collection.getTagNames());
|
||||
}
|
||||
|
||||
// Map collection stories (lightweight)
|
||||
if (collection.getCollectionStories() != null) {
|
||||
dto.setCollectionStories(collection.getCollectionStories().stream()
|
||||
@@ -300,7 +428,7 @@ public class CollectionController {
|
||||
// Set calculated properties
|
||||
dto.setStoryCount(collection.getStoryCount());
|
||||
dto.setTotalWordCount(collection.getTotalWordCount());
|
||||
dto.setEstimatedReadingTime(collection.getEstimatedReadingTime());
|
||||
dto.setEstimatedReadingTime(readingTimeService.calculateReadingTime(collection.getTotalWordCount()));
|
||||
dto.setAverageStoryRating(collection.getAverageStoryRating());
|
||||
|
||||
return dto;
|
||||
|
||||
@@ -0,0 +1,54 @@
|
||||
package com.storycove.controller;
|
||||
|
||||
import com.storycove.dto.HtmlSanitizationConfigDto;
|
||||
import com.storycove.service.HtmlSanitizationService;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.http.ResponseEntity;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
|
||||
import java.util.Map;
|
||||
|
||||
@RestController
|
||||
@RequestMapping("/api/config")
|
||||
public class ConfigController {
|
||||
|
||||
private final HtmlSanitizationService htmlSanitizationService;
|
||||
|
||||
@Value("${app.reading.speed.default:200}")
|
||||
private int defaultReadingSpeed;
|
||||
|
||||
@Autowired
|
||||
public ConfigController(HtmlSanitizationService htmlSanitizationService) {
|
||||
this.htmlSanitizationService = htmlSanitizationService;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get the HTML sanitization configuration for frontend use
|
||||
* This allows the frontend to use the same sanitization rules as the backend
|
||||
*/
|
||||
@GetMapping("/html-sanitization")
|
||||
public ResponseEntity<HtmlSanitizationConfigDto> getHtmlSanitizationConfig() {
|
||||
HtmlSanitizationConfigDto config = htmlSanitizationService.getConfiguration();
|
||||
return ResponseEntity.ok(config);
|
||||
}
|
||||
|
||||
/**
|
||||
* Get application settings configuration
|
||||
*/
|
||||
@GetMapping("/settings")
|
||||
public ResponseEntity<Map<String, Object>> getSettings() {
|
||||
Map<String, Object> settings = Map.of(
|
||||
"defaultReadingSpeed", defaultReadingSpeed
|
||||
);
|
||||
return ResponseEntity.ok(settings);
|
||||
}
|
||||
|
||||
/**
|
||||
* Get reading speed for calculation purposes
|
||||
*/
|
||||
@GetMapping("/reading-speed")
|
||||
public ResponseEntity<Map<String, Integer>> getReadingSpeed() {
|
||||
return ResponseEntity.ok(Map.of("wordsPerMinute", defaultReadingSpeed));
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,154 @@
|
||||
package com.storycove.controller;
|
||||
|
||||
import com.storycove.service.DatabaseManagementService;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.core.io.Resource;
|
||||
import org.springframework.http.HttpHeaders;
|
||||
import org.springframework.http.MediaType;
|
||||
import org.springframework.http.ResponseEntity;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
import org.springframework.web.multipart.MultipartFile;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.time.LocalDateTime;
|
||||
import java.time.format.DateTimeFormatter;
|
||||
import java.util.Map;
|
||||
|
||||
@RestController
|
||||
@RequestMapping("/api/database")
|
||||
public class DatabaseController {
|
||||
|
||||
@Autowired
|
||||
private DatabaseManagementService databaseManagementService;
|
||||
|
||||
@PostMapping("/backup")
|
||||
public ResponseEntity<Resource> backupDatabase() {
|
||||
try {
|
||||
Resource backup = databaseManagementService.createBackup();
|
||||
|
||||
String timestamp = LocalDateTime.now().format(DateTimeFormatter.ofPattern("yyyy-MM-dd_HH-mm-ss"));
|
||||
String filename = "storycove_backup_" + timestamp + ".sql";
|
||||
|
||||
return ResponseEntity.ok()
|
||||
.header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"" + filename + "\"")
|
||||
.contentType(MediaType.APPLICATION_OCTET_STREAM)
|
||||
.body(backup);
|
||||
} catch (Exception e) {
|
||||
throw new RuntimeException("Failed to create database backup: " + e.getMessage(), e);
|
||||
}
|
||||
}
|
||||
|
||||
@PostMapping("/restore")
|
||||
public ResponseEntity<Map<String, Object>> restoreDatabase(@RequestParam("file") MultipartFile file) {
|
||||
try {
|
||||
if (file.isEmpty()) {
|
||||
return ResponseEntity.badRequest()
|
||||
.body(Map.of("success", false, "message", "No file uploaded"));
|
||||
}
|
||||
|
||||
if (!file.getOriginalFilename().endsWith(".sql")) {
|
||||
return ResponseEntity.badRequest()
|
||||
.body(Map.of("success", false, "message", "Invalid file type. Please upload a .sql file"));
|
||||
}
|
||||
|
||||
databaseManagementService.restoreFromBackup(file.getInputStream());
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Database restored successfully from " + file.getOriginalFilename()
|
||||
));
|
||||
} catch (IOException e) {
|
||||
return ResponseEntity.internalServerError()
|
||||
.body(Map.of("success", false, "message", "Failed to read backup file: " + e.getMessage()));
|
||||
} catch (Exception e) {
|
||||
return ResponseEntity.internalServerError()
|
||||
.body(Map.of("success", false, "message", "Failed to restore database: " + e.getMessage()));
|
||||
}
|
||||
}
|
||||
|
||||
@PostMapping("/clear")
|
||||
public ResponseEntity<Map<String, Object>> clearDatabase() {
|
||||
try {
|
||||
int deletedRecords = databaseManagementService.clearAllData();
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Database cleared successfully",
|
||||
"deletedRecords", deletedRecords
|
||||
));
|
||||
} catch (Exception e) {
|
||||
return ResponseEntity.internalServerError()
|
||||
.body(Map.of("success", false, "message", "Failed to clear database: " + e.getMessage()));
|
||||
}
|
||||
}
|
||||
|
||||
@PostMapping("/backup-complete")
|
||||
public ResponseEntity<Resource> backupComplete() {
|
||||
try {
|
||||
Resource backup = databaseManagementService.createCompleteBackup();
|
||||
|
||||
String timestamp = LocalDateTime.now().format(DateTimeFormatter.ofPattern("yyyy-MM-dd_HH-mm-ss"));
|
||||
String filename = "storycove_complete_backup_" + timestamp + ".zip";
|
||||
|
||||
return ResponseEntity.ok()
|
||||
.header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"" + filename + "\"")
|
||||
.header(HttpHeaders.CONTENT_TYPE, "application/zip")
|
||||
.body(backup);
|
||||
} catch (Exception e) {
|
||||
throw new RuntimeException("Failed to create complete backup: " + e.getMessage(), e);
|
||||
}
|
||||
}
|
||||
|
||||
@PostMapping("/restore-complete")
|
||||
public ResponseEntity<Map<String, Object>> restoreComplete(@RequestParam("file") MultipartFile file) {
|
||||
System.err.println("Complete restore endpoint called with file: " + (file != null ? file.getOriginalFilename() : "null"));
|
||||
try {
|
||||
if (file.isEmpty()) {
|
||||
System.err.println("File is empty - returning bad request");
|
||||
return ResponseEntity.badRequest()
|
||||
.body(Map.of("success", false, "message", "No file uploaded"));
|
||||
}
|
||||
|
||||
if (!file.getOriginalFilename().endsWith(".zip")) {
|
||||
System.err.println("Invalid file type: " + file.getOriginalFilename());
|
||||
return ResponseEntity.badRequest()
|
||||
.body(Map.of("success", false, "message", "Invalid file type. Please upload a .zip file"));
|
||||
}
|
||||
|
||||
System.err.println("File validation passed, calling restore service...");
|
||||
databaseManagementService.restoreFromCompleteBackup(file.getInputStream());
|
||||
System.err.println("Restore service completed successfully");
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Complete backup restored successfully from " + file.getOriginalFilename()
|
||||
));
|
||||
} catch (IOException e) {
|
||||
System.err.println("IOException during restore: " + e.getMessage());
|
||||
e.printStackTrace();
|
||||
return ResponseEntity.internalServerError()
|
||||
.body(Map.of("success", false, "message", "Failed to read backup file: " + e.getMessage()));
|
||||
} catch (Exception e) {
|
||||
System.err.println("Exception during restore: " + e.getMessage());
|
||||
e.printStackTrace();
|
||||
return ResponseEntity.internalServerError()
|
||||
.body(Map.of("success", false, "message", "Failed to restore complete backup: " + e.getMessage()));
|
||||
}
|
||||
}
|
||||
|
||||
@PostMapping("/clear-complete")
|
||||
public ResponseEntity<Map<String, Object>> clearComplete() {
|
||||
try {
|
||||
int deletedRecords = databaseManagementService.clearAllDataAndFiles();
|
||||
|
||||
return ResponseEntity.ok(Map.of(
|
||||
"success", true,
|
||||
"message", "Database and files cleared successfully",
|
||||
"deletedRecords", deletedRecords
|
||||
));
|
||||
} catch (Exception e) {
|
||||
return ResponseEntity.internalServerError()
|
||||
.body(Map.of("success", false, "message", "Failed to clear database and files: " + e.getMessage()));
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,31 +0,0 @@
|
||||
package com.storycove.controller;
|
||||
|
||||
import com.storycove.dto.HtmlSanitizationConfigDto;
|
||||
import com.storycove.service.HtmlSanitizationService;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.http.ResponseEntity;
|
||||
import org.springframework.web.bind.annotation.GetMapping;
|
||||
import org.springframework.web.bind.annotation.RequestMapping;
|
||||
import org.springframework.web.bind.annotation.RestController;
|
||||
|
||||
@RestController
|
||||
@RequestMapping("/api/config")
|
||||
public class HtmlSanitizationController {
|
||||
|
||||
private final HtmlSanitizationService htmlSanitizationService;
|
||||
|
||||
@Autowired
|
||||
public HtmlSanitizationController(HtmlSanitizationService htmlSanitizationService) {
|
||||
this.htmlSanitizationService = htmlSanitizationService;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get the HTML sanitization configuration for frontend use
|
||||
* This allows the frontend to use the same sanitization rules as the backend
|
||||
*/
|
||||
@GetMapping("/html-sanitization")
|
||||
public ResponseEntity<HtmlSanitizationConfigDto> getHtmlSanitizationConfig() {
|
||||
HtmlSanitizationConfigDto config = htmlSanitizationService.getConfiguration();
|
||||
return ResponseEntity.ok(config);
|
||||
}
|
||||
}
|
||||
@@ -41,6 +41,9 @@ public class StoryController {
|
||||
private final ImageService imageService;
|
||||
private final TypesenseService typesenseService;
|
||||
private final CollectionService collectionService;
|
||||
private final ReadingTimeService readingTimeService;
|
||||
private final EPUBImportService epubImportService;
|
||||
private final EPUBExportService epubExportService;
|
||||
|
||||
public StoryController(StoryService storyService,
|
||||
AuthorService authorService,
|
||||
@@ -48,7 +51,10 @@ public class StoryController {
|
||||
HtmlSanitizationService sanitizationService,
|
||||
ImageService imageService,
|
||||
CollectionService collectionService,
|
||||
@Autowired(required = false) TypesenseService typesenseService) {
|
||||
@Autowired(required = false) TypesenseService typesenseService,
|
||||
ReadingTimeService readingTimeService,
|
||||
EPUBImportService epubImportService,
|
||||
EPUBExportService epubExportService) {
|
||||
this.storyService = storyService;
|
||||
this.authorService = authorService;
|
||||
this.seriesService = seriesService;
|
||||
@@ -56,6 +62,9 @@ public class StoryController {
|
||||
this.imageService = imageService;
|
||||
this.collectionService = collectionService;
|
||||
this.typesenseService = typesenseService;
|
||||
this.readingTimeService = readingTimeService;
|
||||
this.epubImportService = epubImportService;
|
||||
this.epubExportService = epubExportService;
|
||||
}
|
||||
|
||||
@GetMapping
|
||||
@@ -81,25 +90,46 @@ public class StoryController {
|
||||
return ResponseEntity.ok(convertToDto(story));
|
||||
}
|
||||
|
||||
@GetMapping("/{id}/read")
|
||||
public ResponseEntity<StoryReadingDto> getStoryForReading(@PathVariable UUID id) {
|
||||
logger.info("Getting story {} for reading", id);
|
||||
Story story = storyService.findById(id);
|
||||
return ResponseEntity.ok(convertToReadingDto(story));
|
||||
}
|
||||
|
||||
@PostMapping
|
||||
public ResponseEntity<StoryDto> createStory(@Valid @RequestBody CreateStoryRequest request) {
|
||||
logger.info("Creating new story: {}", request.getTitle());
|
||||
Story story = new Story();
|
||||
updateStoryFromRequest(story, request);
|
||||
|
||||
Story savedStory = storyService.createWithTagNames(story, request.getTagNames());
|
||||
logger.info("Successfully created story: {} (ID: {})", savedStory.getTitle(), savedStory.getId());
|
||||
return ResponseEntity.status(HttpStatus.CREATED).body(convertToDto(savedStory));
|
||||
}
|
||||
|
||||
@PutMapping("/{id}")
|
||||
public ResponseEntity<StoryDto> updateStory(@PathVariable UUID id,
|
||||
@Valid @RequestBody UpdateStoryRequest request) {
|
||||
logger.info("Updating story: {} (ID: {})", request.getTitle(), id);
|
||||
|
||||
// Handle author creation/lookup at controller level before calling service
|
||||
if (request.getAuthorName() != null && !request.getAuthorName().trim().isEmpty() && request.getAuthorId() == null) {
|
||||
Author author = findOrCreateAuthor(request.getAuthorName().trim());
|
||||
request.setAuthorId(author.getId());
|
||||
request.setAuthorName(null); // Clear author name since we now have the ID
|
||||
}
|
||||
|
||||
Story updatedStory = storyService.updateWithTagNames(id, request);
|
||||
logger.info("Successfully updated story: {}", updatedStory.getTitle());
|
||||
return ResponseEntity.ok(convertToDto(updatedStory));
|
||||
}
|
||||
|
||||
@DeleteMapping("/{id}")
|
||||
public ResponseEntity<?> deleteStory(@PathVariable UUID id) {
|
||||
logger.info("Deleting story with ID: {}", id);
|
||||
storyService.delete(id);
|
||||
logger.info("Successfully deleted story with ID: {}", id);
|
||||
return ResponseEntity.ok(Map.of("message", "Story deleted successfully"));
|
||||
}
|
||||
|
||||
@@ -143,6 +173,20 @@ public class StoryController {
|
||||
return ResponseEntity.ok(convertToDto(story));
|
||||
}
|
||||
|
||||
@PostMapping("/{id}/reading-progress")
|
||||
public ResponseEntity<StoryDto> updateReadingProgress(@PathVariable UUID id, @RequestBody ReadingProgressRequest request) {
|
||||
logger.info("Updating reading progress for story {} to position {}", id, request.getPosition());
|
||||
Story story = storyService.updateReadingProgress(id, request.getPosition());
|
||||
return ResponseEntity.ok(convertToDto(story));
|
||||
}
|
||||
|
||||
@PostMapping("/{id}/reading-status")
|
||||
public ResponseEntity<StoryDto> updateReadingStatus(@PathVariable UUID id, @RequestBody ReadingStatusRequest request) {
|
||||
logger.info("Updating reading status for story {} to {}", id, request.getIsRead() ? "read" : "unread");
|
||||
Story story = storyService.updateReadingStatus(id, request.getIsRead());
|
||||
return ResponseEntity.ok(convertToDto(story));
|
||||
}
|
||||
|
||||
@PostMapping("/reindex")
|
||||
public ResponseEntity<String> manualReindex() {
|
||||
if (typesenseService == null) {
|
||||
@@ -209,7 +253,6 @@ public class StoryController {
|
||||
@RequestParam(required = false) String sortBy,
|
||||
@RequestParam(required = false) String sortDir) {
|
||||
|
||||
logger.info("CONTROLLER DEBUG: Search request - query='{}', tags={}, authors={}", query, tags, authors);
|
||||
|
||||
if (typesenseService != null) {
|
||||
SearchResultDto<StorySearchDto> results = typesenseService.searchStories(
|
||||
@@ -361,9 +404,13 @@ public class StoryController {
|
||||
if (updateReq.getVolume() != null) {
|
||||
story.setVolume(updateReq.getVolume());
|
||||
}
|
||||
// Handle author - either by ID or by name
|
||||
if (updateReq.getAuthorId() != null) {
|
||||
Author author = authorService.findById(updateReq.getAuthorId());
|
||||
story.setAuthor(author);
|
||||
} else if (updateReq.getAuthorName() != null && !updateReq.getAuthorName().trim().isEmpty()) {
|
||||
Author author = findOrCreateAuthor(updateReq.getAuthorName().trim());
|
||||
story.setAuthor(author);
|
||||
}
|
||||
// Handle series - either by ID or by name
|
||||
if (updateReq.getSeriesId() != null) {
|
||||
@@ -385,7 +432,6 @@ public class StoryController {
|
||||
dto.setSummary(story.getSummary());
|
||||
dto.setDescription(story.getDescription());
|
||||
dto.setContentHtml(story.getContentHtml());
|
||||
dto.setContentPlain(story.getContentPlain());
|
||||
dto.setSourceUrl(story.getSourceUrl());
|
||||
dto.setCoverPath(story.getCoverPath());
|
||||
dto.setWordCount(story.getWordCount());
|
||||
@@ -394,6 +440,48 @@ public class StoryController {
|
||||
dto.setCreatedAt(story.getCreatedAt());
|
||||
dto.setUpdatedAt(story.getUpdatedAt());
|
||||
|
||||
// Reading progress fields
|
||||
dto.setIsRead(story.getIsRead());
|
||||
dto.setReadingPosition(story.getReadingPosition());
|
||||
dto.setLastReadAt(story.getLastReadAt());
|
||||
|
||||
if (story.getAuthor() != null) {
|
||||
dto.setAuthorId(story.getAuthor().getId());
|
||||
dto.setAuthorName(story.getAuthor().getName());
|
||||
}
|
||||
|
||||
if (story.getSeries() != null) {
|
||||
dto.setSeriesId(story.getSeries().getId());
|
||||
dto.setSeriesName(story.getSeries().getName());
|
||||
}
|
||||
|
||||
dto.setTags(story.getTags().stream()
|
||||
.map(this::convertTagToDto)
|
||||
.collect(Collectors.toList()));
|
||||
|
||||
return dto;
|
||||
}
|
||||
|
||||
    private StoryReadingDto convertToReadingDto(Story story) {
        StoryReadingDto dto = new StoryReadingDto();
        dto.setId(story.getId());
        dto.setTitle(story.getTitle());
        dto.setSummary(story.getSummary());
        dto.setDescription(story.getDescription());
        dto.setContentHtml(story.getContentHtml());
        dto.setSourceUrl(story.getSourceUrl());
        dto.setCoverPath(story.getCoverPath());
        dto.setWordCount(story.getWordCount());
        dto.setRating(story.getRating());
        dto.setVolume(story.getVolume());
        dto.setCreatedAt(story.getCreatedAt());
        dto.setUpdatedAt(story.getUpdatedAt());

        // Reading progress fields
        dto.setIsRead(story.getIsRead());
        dto.setReadingPosition(story.getReadingPosition());
        dto.setLastReadAt(story.getLastReadAt());

        if (story.getAuthor() != null) {
            dto.setAuthorId(story.getAuthor().getId());
            dto.setAuthorName(story.getAuthor().getName());

@@ -426,6 +514,11 @@ public class StoryController {
        dto.setUpdatedAt(story.getUpdatedAt());
        dto.setPartOfSeries(story.isPartOfSeries());

        // Reading progress fields
        dto.setIsRead(story.getIsRead());
        dto.setReadingPosition(story.getReadingPosition());
        dto.setLastReadAt(story.getLastReadAt());

        if (story.getAuthor() != null) {
            dto.setAuthorId(story.getAuthor().getId());
            dto.setAuthorName(story.getAuthor().getName());

@@ -467,12 +560,151 @@ public class StoryController {
        // to avoid circular references and keep it lightweight
        dto.setStoryCount(collection.getStoryCount());
        dto.setTotalWordCount(collection.getTotalWordCount());
-       dto.setEstimatedReadingTime(collection.getEstimatedReadingTime());
+       dto.setEstimatedReadingTime(readingTimeService.calculateReadingTime(collection.getTotalWordCount()));
        dto.setAverageStoryRating(collection.getAverageStoryRating());

        return dto;
    }

    @GetMapping("/check-duplicate")
    public ResponseEntity<Map<String, Object>> checkDuplicate(
            @RequestParam String title,
            @RequestParam String authorName) {
        try {
            List<Story> duplicates = storyService.findPotentialDuplicates(title, authorName);

            Map<String, Object> response = Map.of(
                    "hasDuplicates", !duplicates.isEmpty(),
                    "count", duplicates.size(),
                    "duplicates", duplicates.stream()
                            .map(story -> Map.of(
                                    "id", story.getId(),
                                    "title", story.getTitle(),
                                    "authorName", story.getAuthor() != null ? story.getAuthor().getName() : "",
                                    "createdAt", story.getCreatedAt()
                            ))
                            .collect(Collectors.toList())
            );

            return ResponseEntity.ok(response);
        } catch (Exception e) {
            logger.error("Error checking for duplicates", e);
            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
                    .body(Map.of("error", "Failed to check for duplicates"));
        }
    }

    // EPUB Import endpoint
    @PostMapping("/epub/import")
    public ResponseEntity<EPUBImportResponse> importEPUB(
            @RequestParam("file") MultipartFile file,
            @RequestParam(required = false) UUID authorId,
            @RequestParam(required = false) String authorName,
            @RequestParam(required = false) UUID seriesId,
            @RequestParam(required = false) String seriesName,
            @RequestParam(required = false) Integer seriesVolume,
            @RequestParam(required = false) List<String> tags,
            @RequestParam(defaultValue = "true") Boolean preserveReadingPosition,
            @RequestParam(defaultValue = "false") Boolean overwriteExisting,
            @RequestParam(defaultValue = "true") Boolean createMissingAuthor,
            @RequestParam(defaultValue = "true") Boolean createMissingSeries) {

        logger.info("Importing EPUB file: {}", file.getOriginalFilename());

        EPUBImportRequest request = new EPUBImportRequest();
        request.setEpubFile(file);
        request.setAuthorId(authorId);
        request.setAuthorName(authorName);
        request.setSeriesId(seriesId);
        request.setSeriesName(seriesName);
        request.setSeriesVolume(seriesVolume);
        request.setTags(tags);
        request.setPreserveReadingPosition(preserveReadingPosition);
        request.setOverwriteExisting(overwriteExisting);
        request.setCreateMissingAuthor(createMissingAuthor);
        request.setCreateMissingSeries(createMissingSeries);

        try {
            EPUBImportResponse response = epubImportService.importEPUB(request);

            if (response.isSuccess()) {
                logger.info("Successfully imported EPUB: {} (Story ID: {})",
                        response.getStoryTitle(), response.getStoryId());
                return ResponseEntity.ok(response);
            } else {
                logger.warn("EPUB import failed: {}", response.getMessage());
                return ResponseEntity.badRequest().body(response);
            }

        } catch (Exception e) {
            logger.error("Error importing EPUB: {}", e.getMessage(), e);
            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
                    .body(EPUBImportResponse.error("Internal server error: " + e.getMessage()));
        }
    }

    // EPUB Export endpoint
    @PostMapping("/epub/export")
    public ResponseEntity<org.springframework.core.io.Resource> exportEPUB(
            @Valid @RequestBody EPUBExportRequest request) {

        logger.info("Exporting story {} to EPUB", request.getStoryId());

        try {
            if (!epubExportService.canExportStory(request.getStoryId())) {
                return ResponseEntity.badRequest().build();
            }

            org.springframework.core.io.Resource resource = epubExportService.exportStoryAsEPUB(request);
            Story story = storyService.findById(request.getStoryId());
            String filename = epubExportService.getEPUBFilename(story);

            logger.info("Successfully exported EPUB: {}", filename);

            return ResponseEntity.ok()
                    .header("Content-Disposition", "attachment; filename=\"" + filename + "\"")
                    .header("Content-Type", "application/epub+zip")
                    .body(resource);

        } catch (Exception e) {
            logger.error("Error exporting EPUB: {}", e.getMessage(), e);
            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).build();
        }
    }

    // EPUB Export by story ID (GET endpoint)
    @GetMapping("/{id}/epub")
    public ResponseEntity<org.springframework.core.io.Resource> exportStoryAsEPUB(@PathVariable UUID id) {
        logger.info("Exporting story {} to EPUB via GET", id);

        EPUBExportRequest request = new EPUBExportRequest(id);
        return exportEPUB(request);
    }

    // Validate EPUB file
    @PostMapping("/epub/validate")
    public ResponseEntity<Map<String, Object>> validateEPUBFile(@RequestParam("file") MultipartFile file) {
        logger.info("Validating EPUB file: {}", file.getOriginalFilename());

        try {
            List<String> errors = epubImportService.validateEPUBFile(file);

            Map<String, Object> response = Map.of(
                    "valid", errors.isEmpty(),
                    "errors", errors,
                    "filename", file.getOriginalFilename(),
                    "size", file.getSize()
            );

            return ResponseEntity.ok(response);

        } catch (Exception e) {
            logger.error("Error validating EPUB file: {}", e.getMessage(), e);
            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
                    .body(Map.of("error", "Failed to validate EPUB file"));
        }
    }

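For the export side, a minimal sketch of downloading the generated file through the GET /{id}/epub endpoint above; the /api/stories prefix, the localhost base URL, and the absence of authentication are assumptions, not taken from this change set:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Path;

public class EpubDownloadClient {
    public static void main(String[] args) throws Exception {
        String storyId = "00000000-0000-0000-0000-000000000000"; // placeholder UUID
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/api/stories/" + storyId + "/epub"))
                .GET()
                .build();
        // The controller sets Content-Type: application/epub+zip and a
        // Content-Disposition filename; here we simply choose a local target path.
        HttpResponse<Path> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofFile(Path.of("story.epub")));
        System.out.println("Saved EPUB to " + response.body());
    }
}
```
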
// Request DTOs
|
||||
public static class CreateStoryRequest {
|
||||
private String title;
|
||||
@@ -520,6 +752,7 @@ public class StoryController {
|
||||
private String sourceUrl;
|
||||
private Integer volume;
|
||||
private UUID authorId;
|
||||
private String authorName;
|
||||
private UUID seriesId;
|
||||
private String seriesName;
|
||||
private List<String> tagNames;
|
||||
@@ -539,6 +772,8 @@ public class StoryController {
|
||||
public void setVolume(Integer volume) { this.volume = volume; }
|
||||
public UUID getAuthorId() { return authorId; }
|
||||
public void setAuthorId(UUID authorId) { this.authorId = authorId; }
|
||||
public String getAuthorName() { return authorName; }
|
||||
public void setAuthorName(String authorName) { this.authorName = authorName; }
|
||||
public UUID getSeriesId() { return seriesId; }
|
||||
public void setSeriesId(UUID seriesId) { this.seriesId = seriesId; }
|
||||
public String getSeriesName() { return seriesName; }
|
||||
|
||||
@@ -132,17 +132,39 @@ public class TagController {
|
||||
return ResponseEntity.ok(stats);
|
||||
}
|
||||
|
||||
@GetMapping("/collections")
|
||||
public ResponseEntity<List<TagDto>> getTagsUsedByCollections() {
|
||||
List<Tag> tags = tagService.findTagsUsedByCollections();
|
||||
List<TagDto> tagDtos = tags.stream()
|
||||
.map(this::convertToDtoWithCollectionCount)
|
||||
.collect(Collectors.toList());
|
||||
|
||||
return ResponseEntity.ok(tagDtos);
|
||||
}
|
||||
|
||||
private TagDto convertToDto(Tag tag) {
|
||||
TagDto dto = new TagDto();
|
||||
dto.setId(tag.getId());
|
||||
dto.setName(tag.getName());
|
||||
dto.setStoryCount(tag.getStories() != null ? tag.getStories().size() : 0);
|
||||
dto.setCollectionCount(tag.getCollections() != null ? tag.getCollections().size() : 0);
|
||||
dto.setCreatedAt(tag.getCreatedAt());
|
||||
// updatedAt field not present in Tag entity per spec
|
||||
|
||||
return dto;
|
||||
}
|
||||
|
||||
private TagDto convertToDtoWithCollectionCount(Tag tag) {
|
||||
TagDto dto = new TagDto();
|
||||
dto.setId(tag.getId());
|
||||
dto.setName(tag.getName());
|
||||
dto.setCollectionCount(tag.getCollections() != null ? tag.getCollections().size() : 0);
|
||||
dto.setCreatedAt(tag.getCreatedAt());
|
||||
// Don't set storyCount for collection-focused endpoint
|
||||
|
||||
return dto;
|
||||
}
|
||||
|
||||
// Request DTOs
|
||||
public static class CreateTagRequest {
|
||||
private String name;
|
||||
|
||||
@@ -16,6 +16,7 @@ public class CollectionDto {
|
||||
private String coverImagePath;
|
||||
private Boolean isArchived;
|
||||
private List<TagDto> tags;
|
||||
private List<String> tagNames; // For search results
|
||||
private List<CollectionStoryDto> collectionStories;
|
||||
private Integer storyCount;
|
||||
private Integer totalWordCount;
|
||||
@@ -83,6 +84,14 @@ public class CollectionDto {
|
||||
this.tags = tags;
|
||||
}
|
||||
|
||||
public List<String> getTagNames() {
|
||||
return tagNames;
|
||||
}
|
||||
|
||||
public void setTagNames(List<String> tagNames) {
|
||||
this.tagNames = tagNames;
|
||||
}
|
||||
|
||||
public List<CollectionStoryDto> getCollectionStories() {
|
||||
return collectionStories;
|
||||
}
|
||||
|
||||
backend/src/main/java/com/storycove/dto/EPUBExportRequest.java (new file, 115 lines)
@@ -0,0 +1,115 @@
|
||||
package com.storycove.dto;
|
||||
|
||||
import jakarta.validation.constraints.NotNull;
|
||||
import java.util.List;
|
||||
import java.util.UUID;
|
||||
|
||||
public class EPUBExportRequest {
|
||||
|
||||
@NotNull(message = "Story ID is required")
|
||||
private UUID storyId;
|
||||
|
||||
private String customTitle;
|
||||
|
||||
private String customAuthor;
|
||||
|
||||
private Boolean includeReadingPosition = true;
|
||||
|
||||
private Boolean includeCoverImage = true;
|
||||
|
||||
private Boolean includeMetadata = true;
|
||||
|
||||
private List<String> customMetadata;
|
||||
|
||||
private String language = "en";
|
||||
|
||||
private Boolean splitByChapters = false;
|
||||
|
||||
private Integer maxWordsPerChapter;
|
||||
|
||||
public EPUBExportRequest() {}
|
||||
|
||||
public EPUBExportRequest(UUID storyId) {
|
||||
this.storyId = storyId;
|
||||
}
|
||||
|
||||
public UUID getStoryId() {
|
||||
return storyId;
|
||||
}
|
||||
|
||||
public void setStoryId(UUID storyId) {
|
||||
this.storyId = storyId;
|
||||
}
|
||||
|
||||
public String getCustomTitle() {
|
||||
return customTitle;
|
||||
}
|
||||
|
||||
public void setCustomTitle(String customTitle) {
|
||||
this.customTitle = customTitle;
|
||||
}
|
||||
|
||||
public String getCustomAuthor() {
|
||||
return customAuthor;
|
||||
}
|
||||
|
||||
public void setCustomAuthor(String customAuthor) {
|
||||
this.customAuthor = customAuthor;
|
||||
}
|
||||
|
||||
public Boolean getIncludeReadingPosition() {
|
||||
return includeReadingPosition;
|
||||
}
|
||||
|
||||
public void setIncludeReadingPosition(Boolean includeReadingPosition) {
|
||||
this.includeReadingPosition = includeReadingPosition;
|
||||
}
|
||||
|
||||
public Boolean getIncludeCoverImage() {
|
||||
return includeCoverImage;
|
||||
}
|
||||
|
||||
public void setIncludeCoverImage(Boolean includeCoverImage) {
|
||||
this.includeCoverImage = includeCoverImage;
|
||||
}
|
||||
|
||||
public Boolean getIncludeMetadata() {
|
||||
return includeMetadata;
|
||||
}
|
||||
|
||||
public void setIncludeMetadata(Boolean includeMetadata) {
|
||||
this.includeMetadata = includeMetadata;
|
||||
}
|
||||
|
||||
public List<String> getCustomMetadata() {
|
||||
return customMetadata;
|
||||
}
|
||||
|
||||
public void setCustomMetadata(List<String> customMetadata) {
|
||||
this.customMetadata = customMetadata;
|
||||
}
|
||||
|
||||
public String getLanguage() {
|
||||
return language;
|
||||
}
|
||||
|
||||
public void setLanguage(String language) {
|
||||
this.language = language;
|
||||
}
|
||||
|
||||
public Boolean getSplitByChapters() {
|
||||
return splitByChapters;
|
||||
}
|
||||
|
||||
public void setSplitByChapters(Boolean splitByChapters) {
|
||||
this.splitByChapters = splitByChapters;
|
||||
}
|
||||
|
||||
public Integer getMaxWordsPerChapter() {
|
||||
return maxWordsPerChapter;
|
||||
}
|
||||
|
||||
public void setMaxWordsPerChapter(Integer maxWordsPerChapter) {
|
||||
this.maxWordsPerChapter = maxWordsPerChapter;
|
||||
}
|
||||
}
|
||||
backend/src/main/java/com/storycove/dto/EPUBImportRequest.java (new file, 133 lines)
@@ -0,0 +1,133 @@
|
||||
package com.storycove.dto;
|
||||
|
||||
import jakarta.validation.constraints.NotNull;
|
||||
import org.springframework.web.multipart.MultipartFile;
|
||||
|
||||
import java.util.List;
|
||||
import java.util.UUID;
|
||||
|
||||
public class EPUBImportRequest {
|
||||
|
||||
@NotNull(message = "EPUB file is required")
|
||||
private MultipartFile epubFile;
|
||||
|
||||
private UUID authorId;
|
||||
|
||||
private String authorName;
|
||||
|
||||
private UUID seriesId;
|
||||
|
||||
private String seriesName;
|
||||
|
||||
private Integer seriesVolume;
|
||||
|
||||
private List<String> tags;
|
||||
|
||||
private Boolean preserveReadingPosition = true;
|
||||
|
||||
private Boolean overwriteExisting = false;
|
||||
|
||||
private Boolean createMissingAuthor = true;
|
||||
|
||||
private Boolean createMissingSeries = true;
|
||||
|
||||
private Boolean extractCover = true;
|
||||
|
||||
public EPUBImportRequest() {}
|
||||
|
||||
public MultipartFile getEpubFile() {
|
||||
return epubFile;
|
||||
}
|
||||
|
||||
public void setEpubFile(MultipartFile epubFile) {
|
||||
this.epubFile = epubFile;
|
||||
}
|
||||
|
||||
public UUID getAuthorId() {
|
||||
return authorId;
|
||||
}
|
||||
|
||||
public void setAuthorId(UUID authorId) {
|
||||
this.authorId = authorId;
|
||||
}
|
||||
|
||||
public String getAuthorName() {
|
||||
return authorName;
|
||||
}
|
||||
|
||||
public void setAuthorName(String authorName) {
|
||||
this.authorName = authorName;
|
||||
}
|
||||
|
||||
public UUID getSeriesId() {
|
||||
return seriesId;
|
||||
}
|
||||
|
||||
public void setSeriesId(UUID seriesId) {
|
||||
this.seriesId = seriesId;
|
||||
}
|
||||
|
||||
public String getSeriesName() {
|
||||
return seriesName;
|
||||
}
|
||||
|
||||
public void setSeriesName(String seriesName) {
|
||||
this.seriesName = seriesName;
|
||||
}
|
||||
|
||||
public Integer getSeriesVolume() {
|
||||
return seriesVolume;
|
||||
}
|
||||
|
||||
public void setSeriesVolume(Integer seriesVolume) {
|
||||
this.seriesVolume = seriesVolume;
|
||||
}
|
||||
|
||||
public List<String> getTags() {
|
||||
return tags;
|
||||
}
|
||||
|
||||
public void setTags(List<String> tags) {
|
||||
this.tags = tags;
|
||||
}
|
||||
|
||||
public Boolean getPreserveReadingPosition() {
|
||||
return preserveReadingPosition;
|
||||
}
|
||||
|
||||
public void setPreserveReadingPosition(Boolean preserveReadingPosition) {
|
||||
this.preserveReadingPosition = preserveReadingPosition;
|
||||
}
|
||||
|
||||
public Boolean getOverwriteExisting() {
|
||||
return overwriteExisting;
|
||||
}
|
||||
|
||||
public void setOverwriteExisting(Boolean overwriteExisting) {
|
||||
this.overwriteExisting = overwriteExisting;
|
||||
}
|
||||
|
||||
public Boolean getCreateMissingAuthor() {
|
||||
return createMissingAuthor;
|
||||
}
|
||||
|
||||
public void setCreateMissingAuthor(Boolean createMissingAuthor) {
|
||||
this.createMissingAuthor = createMissingAuthor;
|
||||
}
|
||||
|
||||
public Boolean getCreateMissingSeries() {
|
||||
return createMissingSeries;
|
||||
}
|
||||
|
||||
public void setCreateMissingSeries(Boolean createMissingSeries) {
|
||||
this.createMissingSeries = createMissingSeries;
|
||||
}
|
||||
|
||||
public Boolean getExtractCover() {
|
||||
return extractCover;
|
||||
}
|
||||
|
||||
public void setExtractCover(Boolean extractCover) {
|
||||
this.extractCover = extractCover;
|
||||
}
|
||||
}
|
||||
backend/src/main/java/com/storycove/dto/EPUBImportResponse.java (new file, 107 lines)
@@ -0,0 +1,107 @@
|
||||
package com.storycove.dto;
|
||||
|
||||
import java.util.List;
|
||||
import java.util.UUID;
|
||||
|
||||
public class EPUBImportResponse {
|
||||
|
||||
private boolean success;
|
||||
private String message;
|
||||
private UUID storyId;
|
||||
private String storyTitle;
|
||||
private Integer totalChapters;
|
||||
private Integer wordCount;
|
||||
private ReadingPositionDto readingPosition;
|
||||
private List<String> warnings;
|
||||
private List<String> errors;
|
||||
|
||||
public EPUBImportResponse() {}
|
||||
|
||||
public EPUBImportResponse(boolean success, String message) {
|
||||
this.success = success;
|
||||
this.message = message;
|
||||
}
|
||||
|
||||
public static EPUBImportResponse success(UUID storyId, String storyTitle) {
|
||||
EPUBImportResponse response = new EPUBImportResponse(true, "EPUB imported successfully");
|
||||
response.setStoryId(storyId);
|
||||
response.setStoryTitle(storyTitle);
|
||||
return response;
|
||||
}
|
||||
|
||||
public static EPUBImportResponse error(String message) {
|
||||
return new EPUBImportResponse(false, message);
|
||||
}
|
||||
|
||||
public boolean isSuccess() {
|
||||
return success;
|
||||
}
|
||||
|
||||
public void setSuccess(boolean success) {
|
||||
this.success = success;
|
||||
}
|
||||
|
||||
public String getMessage() {
|
||||
return message;
|
||||
}
|
||||
|
||||
public void setMessage(String message) {
|
||||
this.message = message;
|
||||
}
|
||||
|
||||
public UUID getStoryId() {
|
||||
return storyId;
|
||||
}
|
||||
|
||||
public void setStoryId(UUID storyId) {
|
||||
this.storyId = storyId;
|
||||
}
|
||||
|
||||
public String getStoryTitle() {
|
||||
return storyTitle;
|
||||
}
|
||||
|
||||
public void setStoryTitle(String storyTitle) {
|
||||
this.storyTitle = storyTitle;
|
||||
}
|
||||
|
||||
public Integer getTotalChapters() {
|
||||
return totalChapters;
|
||||
}
|
||||
|
||||
public void setTotalChapters(Integer totalChapters) {
|
||||
this.totalChapters = totalChapters;
|
||||
}
|
||||
|
||||
public Integer getWordCount() {
|
||||
return wordCount;
|
||||
}
|
||||
|
||||
public void setWordCount(Integer wordCount) {
|
||||
this.wordCount = wordCount;
|
||||
}
|
||||
|
||||
public ReadingPositionDto getReadingPosition() {
|
||||
return readingPosition;
|
||||
}
|
||||
|
||||
public void setReadingPosition(ReadingPositionDto readingPosition) {
|
||||
this.readingPosition = readingPosition;
|
||||
}
|
||||
|
||||
public List<String> getWarnings() {
|
||||
return warnings;
|
||||
}
|
||||
|
||||
public void setWarnings(List<String> warnings) {
|
||||
this.warnings = warnings;
|
||||
}
|
||||
|
||||
public List<String> getErrors() {
|
||||
return errors;
|
||||
}
|
||||
|
||||
public void setErrors(List<String> errors) {
|
||||
this.errors = errors;
|
||||
}
|
||||
}
|
||||
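A hedged sketch of how a client could drive the multipart import endpoint using the DTOs above; the /api/stories/epub/import path and base URL are assumptions, and the form field names mirror the @RequestParam names in the controller:

```java
import org.springframework.core.io.FileSystemResource;
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.MediaType;
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.MultiValueMap;
import org.springframework.web.client.RestTemplate;

public class EpubImportClient {
    public static void main(String[] args) {
        MultiValueMap<String, Object> form = new LinkedMultiValueMap<>();
        form.add("file", new FileSystemResource("/tmp/example.epub")); // placeholder path
        form.add("authorName", "Unknown Author");      // optional, mirrors EPUBImportRequest
        form.add("preserveReadingPosition", "true");   // request params shown in the controller
        form.add("createMissingAuthor", "true");

        HttpHeaders headers = new HttpHeaders();
        headers.setContentType(MediaType.MULTIPART_FORM_DATA);

        RestTemplate rest = new RestTemplate();
        String body = rest.postForObject(
                "http://localhost:8080/api/stories/epub/import", // assumed mapping
                new HttpEntity<>(form, headers),
                String.class); // could also be bound to EPUBImportResponse
        System.out.println(body);
    }
}
```
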
backend/src/main/java/com/storycove/dto/FacetCountDto.java (new file, 31 lines)
@@ -0,0 +1,31 @@
|
||||
package com.storycove.dto;
|
||||
|
||||
public class FacetCountDto {
|
||||
|
||||
private String value;
|
||||
private int count;
|
||||
|
||||
public FacetCountDto() {}
|
||||
|
||||
public FacetCountDto(String value, int count) {
|
||||
this.value = value;
|
||||
this.count = count;
|
||||
}
|
||||
|
||||
// Getters and Setters
|
||||
public String getValue() {
|
||||
return value;
|
||||
}
|
||||
|
||||
public void setValue(String value) {
|
||||
this.value = value;
|
||||
}
|
||||
|
||||
public int getCount() {
|
||||
return count;
|
||||
}
|
||||
|
||||
public void setCount(int count) {
|
||||
this.count = count;
|
||||
}
|
||||
}
|
||||
@@ -8,6 +8,7 @@ public class HtmlSanitizationConfigDto {
|
||||
private Map<String, List<String>> allowedAttributes;
|
||||
private List<String> allowedCssProperties;
|
||||
private Map<String, List<String>> removedAttributes;
|
||||
private Map<String, Map<String, List<String>>> allowedProtocols;
|
||||
private String description;
|
||||
|
||||
public HtmlSanitizationConfigDto() {}
|
||||
@@ -44,6 +45,14 @@ public class HtmlSanitizationConfigDto {
|
||||
this.removedAttributes = removedAttributes;
|
||||
}
|
||||
|
||||
public Map<String, Map<String, List<String>>> getAllowedProtocols() {
|
||||
return allowedProtocols;
|
||||
}
|
||||
|
||||
public void setAllowedProtocols(Map<String, Map<String, List<String>>> allowedProtocols) {
|
||||
this.allowedProtocols = allowedProtocols;
|
||||
}
|
||||
|
||||
public String getDescription() {
|
||||
return description;
|
||||
}
|
||||
|
||||
backend/src/main/java/com/storycove/dto/ReadingPositionDto.java (new file, 124 lines)
@@ -0,0 +1,124 @@
|
||||
package com.storycove.dto;
|
||||
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.UUID;
|
||||
|
||||
public class ReadingPositionDto {
|
||||
|
||||
private UUID id;
|
||||
private UUID storyId;
|
||||
private Integer chapterIndex;
|
||||
private String chapterTitle;
|
||||
private Integer wordPosition;
|
||||
private Integer characterPosition;
|
||||
private Double percentageComplete;
|
||||
private String epubCfi;
|
||||
private String contextBefore;
|
||||
private String contextAfter;
|
||||
private LocalDateTime createdAt;
|
||||
private LocalDateTime updatedAt;
|
||||
|
||||
public ReadingPositionDto() {}
|
||||
|
||||
public ReadingPositionDto(UUID storyId, Integer chapterIndex, Integer wordPosition) {
|
||||
this.storyId = storyId;
|
||||
this.chapterIndex = chapterIndex;
|
||||
this.wordPosition = wordPosition;
|
||||
}
|
||||
|
||||
public UUID getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public void setId(UUID id) {
|
||||
this.id = id;
|
||||
}
|
||||
|
||||
public UUID getStoryId() {
|
||||
return storyId;
|
||||
}
|
||||
|
||||
public void setStoryId(UUID storyId) {
|
||||
this.storyId = storyId;
|
||||
}
|
||||
|
||||
public Integer getChapterIndex() {
|
||||
return chapterIndex;
|
||||
}
|
||||
|
||||
public void setChapterIndex(Integer chapterIndex) {
|
||||
this.chapterIndex = chapterIndex;
|
||||
}
|
||||
|
||||
public String getChapterTitle() {
|
||||
return chapterTitle;
|
||||
}
|
||||
|
||||
public void setChapterTitle(String chapterTitle) {
|
||||
this.chapterTitle = chapterTitle;
|
||||
}
|
||||
|
||||
public Integer getWordPosition() {
|
||||
return wordPosition;
|
||||
}
|
||||
|
||||
public void setWordPosition(Integer wordPosition) {
|
||||
this.wordPosition = wordPosition;
|
||||
}
|
||||
|
||||
public Integer getCharacterPosition() {
|
||||
return characterPosition;
|
||||
}
|
||||
|
||||
public void setCharacterPosition(Integer characterPosition) {
|
||||
this.characterPosition = characterPosition;
|
||||
}
|
||||
|
||||
public Double getPercentageComplete() {
|
||||
return percentageComplete;
|
||||
}
|
||||
|
||||
public void setPercentageComplete(Double percentageComplete) {
|
||||
this.percentageComplete = percentageComplete;
|
||||
}
|
||||
|
||||
public String getEpubCfi() {
|
||||
return epubCfi;
|
||||
}
|
||||
|
||||
public void setEpubCfi(String epubCfi) {
|
||||
this.epubCfi = epubCfi;
|
||||
}
|
||||
|
||||
public String getContextBefore() {
|
||||
return contextBefore;
|
||||
}
|
||||
|
||||
public void setContextBefore(String contextBefore) {
|
||||
this.contextBefore = contextBefore;
|
||||
}
|
||||
|
||||
public String getContextAfter() {
|
||||
return contextAfter;
|
||||
}
|
||||
|
||||
public void setContextAfter(String contextAfter) {
|
||||
this.contextAfter = contextAfter;
|
||||
}
|
||||
|
||||
public LocalDateTime getCreatedAt() {
|
||||
return createdAt;
|
||||
}
|
||||
|
||||
public void setCreatedAt(LocalDateTime createdAt) {
|
||||
this.createdAt = createdAt;
|
||||
}
|
||||
|
||||
public LocalDateTime getUpdatedAt() {
|
||||
return updatedAt;
|
||||
}
|
||||
|
||||
public void setUpdatedAt(LocalDateTime updatedAt) {
|
||||
this.updatedAt = updatedAt;
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,23 @@
|
||||
package com.storycove.dto;
|
||||
|
||||
import jakarta.validation.constraints.Min;
|
||||
|
||||
public class ReadingProgressRequest {
|
||||
|
||||
@Min(value = 0, message = "Reading position must be non-negative")
|
||||
private Integer position;
|
||||
|
||||
public ReadingProgressRequest() {}
|
||||
|
||||
public ReadingProgressRequest(Integer position) {
|
||||
this.position = position;
|
||||
}
|
||||
|
||||
public Integer getPosition() {
|
||||
return position;
|
||||
}
|
||||
|
||||
public void setPosition(Integer position) {
|
||||
this.position = position;
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,23 @@
|
||||
package com.storycove.dto;
|
||||
|
||||
import jakarta.validation.constraints.NotNull;
|
||||
|
||||
public class ReadingStatusRequest {
|
||||
|
||||
@NotNull(message = "Reading status is required")
|
||||
private Boolean isRead;
|
||||
|
||||
public ReadingStatusRequest() {}
|
||||
|
||||
public ReadingStatusRequest(Boolean isRead) {
|
||||
this.isRead = isRead;
|
||||
}
|
||||
|
||||
public Boolean getIsRead() {
|
||||
return isRead;
|
||||
}
|
||||
|
||||
public void setIsRead(Boolean isRead) {
|
||||
this.isRead = isRead;
|
||||
}
|
||||
}
|
||||
@@ -1,6 +1,7 @@
|
||||
package com.storycove.dto;
|
||||
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
|
||||
public class SearchResultDto<T> {
|
||||
|
||||
@@ -10,6 +11,7 @@ public class SearchResultDto<T> {
|
||||
private int perPage;
|
||||
private String query;
|
||||
private long searchTimeMs;
|
||||
private Map<String, List<FacetCountDto>> facets;
|
||||
|
||||
public SearchResultDto() {}
|
||||
|
||||
@@ -22,6 +24,16 @@ public class SearchResultDto<T> {
|
||||
this.searchTimeMs = searchTimeMs;
|
||||
}
|
||||
|
||||
public SearchResultDto(List<T> results, long totalHits, int page, int perPage, String query, long searchTimeMs, Map<String, List<FacetCountDto>> facets) {
|
||||
this.results = results;
|
||||
this.totalHits = totalHits;
|
||||
this.page = page;
|
||||
this.perPage = perPage;
|
||||
this.query = query;
|
||||
this.searchTimeMs = searchTimeMs;
|
||||
this.facets = facets;
|
||||
}
|
||||
|
||||
// Getters and Setters
|
||||
public List<T> getResults() {
|
||||
return results;
|
||||
@@ -70,4 +82,12 @@ public class SearchResultDto<T> {
|
||||
public void setSearchTimeMs(long searchTimeMs) {
|
||||
this.searchTimeMs = searchTimeMs;
|
||||
}
|
||||
|
||||
public Map<String, List<FacetCountDto>> getFacets() {
|
||||
return facets;
|
||||
}
|
||||
|
||||
public void setFacets(Map<String, List<FacetCountDto>> facets) {
|
||||
this.facets = facets;
|
||||
}
|
||||
}
|
||||
@@ -21,13 +21,18 @@ public class StoryDto {
    private String description;

    private String contentHtml;
-   private String contentPlain;
+   // contentPlain removed for performance - use StoryReadingDto when content is needed
    private String sourceUrl;
    private String coverPath;
    private Integer wordCount;
    private Integer rating;
    private Integer volume;

+   // Reading progress fields
+   private Boolean isRead;
+   private Integer readingPosition;
+   private LocalDateTime lastReadAt;
+
    // Related entities as simple references
    private UUID authorId;
    private String authorName;
@@ -85,13 +90,6 @@ public class StoryDto {
|
||||
this.contentHtml = contentHtml;
|
||||
}
|
||||
|
||||
public String getContentPlain() {
|
||||
return contentPlain;
|
||||
}
|
||||
|
||||
public void setContentPlain(String contentPlain) {
|
||||
this.contentPlain = contentPlain;
|
||||
}
|
||||
|
||||
public String getSourceUrl() {
|
||||
return sourceUrl;
|
||||
@@ -133,6 +131,30 @@ public class StoryDto {
|
||||
this.volume = volume;
|
||||
}
|
||||
|
||||
public Boolean getIsRead() {
|
||||
return isRead;
|
||||
}
|
||||
|
||||
public void setIsRead(Boolean isRead) {
|
||||
this.isRead = isRead;
|
||||
}
|
||||
|
||||
public Integer getReadingPosition() {
|
||||
return readingPosition;
|
||||
}
|
||||
|
||||
public void setReadingPosition(Integer readingPosition) {
|
||||
this.readingPosition = readingPosition;
|
||||
}
|
||||
|
||||
public LocalDateTime getLastReadAt() {
|
||||
return lastReadAt;
|
||||
}
|
||||
|
||||
public void setLastReadAt(LocalDateTime lastReadAt) {
|
||||
this.lastReadAt = lastReadAt;
|
||||
}
|
||||
|
||||
public UUID getAuthorId() {
|
||||
return authorId;
|
||||
}
|
||||
|
||||
backend/src/main/java/com/storycove/dto/StoryReadingDto.java (new file, 202 lines)
@@ -0,0 +1,202 @@
|
||||
package com.storycove.dto;
|
||||
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.List;
|
||||
import java.util.UUID;
|
||||
|
||||
/**
|
||||
* Story DTO specifically for reading view.
|
||||
* Contains contentHtml but excludes contentPlain for performance.
|
||||
*/
|
||||
public class StoryReadingDto {
|
||||
|
||||
private UUID id;
|
||||
private String title;
|
||||
private String summary;
|
||||
private String description;
|
||||
private String contentHtml; // For reading - includes HTML
|
||||
// contentPlain excluded for performance
|
||||
private String sourceUrl;
|
||||
private String coverPath;
|
||||
private Integer wordCount;
|
||||
private Integer rating;
|
||||
private Integer volume;
|
||||
|
||||
// Reading progress fields
|
||||
private Boolean isRead;
|
||||
private Integer readingPosition;
|
||||
private LocalDateTime lastReadAt;
|
||||
|
||||
// Related entities as simple references
|
||||
private UUID authorId;
|
||||
private String authorName;
|
||||
private UUID seriesId;
|
||||
private String seriesName;
|
||||
private List<TagDto> tags;
|
||||
|
||||
private LocalDateTime createdAt;
|
||||
private LocalDateTime updatedAt;
|
||||
|
||||
public StoryReadingDto() {}
|
||||
|
||||
// Getters and Setters
|
||||
public UUID getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public void setId(UUID id) {
|
||||
this.id = id;
|
||||
}
|
||||
|
||||
public String getTitle() {
|
||||
return title;
|
||||
}
|
||||
|
||||
public void setTitle(String title) {
|
||||
this.title = title;
|
||||
}
|
||||
|
||||
public String getSummary() {
|
||||
return summary;
|
||||
}
|
||||
|
||||
public void setSummary(String summary) {
|
||||
this.summary = summary;
|
||||
}
|
||||
|
||||
public String getDescription() {
|
||||
return description;
|
||||
}
|
||||
|
||||
public void setDescription(String description) {
|
||||
this.description = description;
|
||||
}
|
||||
|
||||
public String getContentHtml() {
|
||||
return contentHtml;
|
||||
}
|
||||
|
||||
public void setContentHtml(String contentHtml) {
|
||||
this.contentHtml = contentHtml;
|
||||
}
|
||||
|
||||
public String getSourceUrl() {
|
||||
return sourceUrl;
|
||||
}
|
||||
|
||||
public void setSourceUrl(String sourceUrl) {
|
||||
this.sourceUrl = sourceUrl;
|
||||
}
|
||||
|
||||
public String getCoverPath() {
|
||||
return coverPath;
|
||||
}
|
||||
|
||||
public void setCoverPath(String coverPath) {
|
||||
this.coverPath = coverPath;
|
||||
}
|
||||
|
||||
public Integer getWordCount() {
|
||||
return wordCount;
|
||||
}
|
||||
|
||||
public void setWordCount(Integer wordCount) {
|
||||
this.wordCount = wordCount;
|
||||
}
|
||||
|
||||
public Integer getRating() {
|
||||
return rating;
|
||||
}
|
||||
|
||||
public void setRating(Integer rating) {
|
||||
this.rating = rating;
|
||||
}
|
||||
|
||||
public Integer getVolume() {
|
||||
return volume;
|
||||
}
|
||||
|
||||
public void setVolume(Integer volume) {
|
||||
this.volume = volume;
|
||||
}
|
||||
|
||||
public Boolean getIsRead() {
|
||||
return isRead;
|
||||
}
|
||||
|
||||
public void setIsRead(Boolean isRead) {
|
||||
this.isRead = isRead;
|
||||
}
|
||||
|
||||
public Integer getReadingPosition() {
|
||||
return readingPosition;
|
||||
}
|
||||
|
||||
public void setReadingPosition(Integer readingPosition) {
|
||||
this.readingPosition = readingPosition;
|
||||
}
|
||||
|
||||
public LocalDateTime getLastReadAt() {
|
||||
return lastReadAt;
|
||||
}
|
||||
|
||||
public void setLastReadAt(LocalDateTime lastReadAt) {
|
||||
this.lastReadAt = lastReadAt;
|
||||
}
|
||||
|
||||
public UUID getAuthorId() {
|
||||
return authorId;
|
||||
}
|
||||
|
||||
public void setAuthorId(UUID authorId) {
|
||||
this.authorId = authorId;
|
||||
}
|
||||
|
||||
public String getAuthorName() {
|
||||
return authorName;
|
||||
}
|
||||
|
||||
public void setAuthorName(String authorName) {
|
||||
this.authorName = authorName;
|
||||
}
|
||||
|
||||
public UUID getSeriesId() {
|
||||
return seriesId;
|
||||
}
|
||||
|
||||
public void setSeriesId(UUID seriesId) {
|
||||
this.seriesId = seriesId;
|
||||
}
|
||||
|
||||
public String getSeriesName() {
|
||||
return seriesName;
|
||||
}
|
||||
|
||||
public void setSeriesName(String seriesName) {
|
||||
this.seriesName = seriesName;
|
||||
}
|
||||
|
||||
public List<TagDto> getTags() {
|
||||
return tags;
|
||||
}
|
||||
|
||||
public void setTags(List<TagDto> tags) {
|
||||
this.tags = tags;
|
||||
}
|
||||
|
||||
public LocalDateTime getCreatedAt() {
|
||||
return createdAt;
|
||||
}
|
||||
|
||||
public void setCreatedAt(LocalDateTime createdAt) {
|
||||
this.createdAt = createdAt;
|
||||
}
|
||||
|
||||
public LocalDateTime getUpdatedAt() {
|
||||
return updatedAt;
|
||||
}
|
||||
|
||||
public void setUpdatedAt(LocalDateTime updatedAt) {
|
||||
this.updatedAt = updatedAt;
|
||||
}
|
||||
}
|
||||
@@ -16,6 +16,10 @@ public class StorySearchDto {
|
||||
private Integer rating;
|
||||
private Integer volume;
|
||||
|
||||
// Reading status
|
||||
private Boolean isRead;
|
||||
private LocalDateTime lastReadAt;
|
||||
|
||||
// Author info
|
||||
private UUID authorId;
|
||||
private String authorName;
|
||||
@@ -109,6 +113,22 @@ public class StorySearchDto {
|
||||
this.volume = volume;
|
||||
}
|
||||
|
||||
public Boolean getIsRead() {
|
||||
return isRead;
|
||||
}
|
||||
|
||||
public void setIsRead(Boolean isRead) {
|
||||
this.isRead = isRead;
|
||||
}
|
||||
|
||||
public LocalDateTime getLastReadAt() {
|
||||
return lastReadAt;
|
||||
}
|
||||
|
||||
public void setLastReadAt(LocalDateTime lastReadAt) {
|
||||
this.lastReadAt = lastReadAt;
|
||||
}
|
||||
|
||||
public UUID getAuthorId() {
|
||||
return authorId;
|
||||
}
|
||||
|
||||
@@ -20,6 +20,11 @@ public class StorySummaryDto {
|
||||
private Integer rating;
|
||||
private Integer volume;
|
||||
|
||||
// Reading progress fields
|
||||
private Boolean isRead;
|
||||
private Integer readingPosition;
|
||||
private LocalDateTime lastReadAt;
|
||||
|
||||
// Related entities as simple references
|
||||
private UUID authorId;
|
||||
private String authorName;
|
||||
@@ -106,6 +111,30 @@ public class StorySummaryDto {
|
||||
this.volume = volume;
|
||||
}
|
||||
|
||||
public Boolean getIsRead() {
|
||||
return isRead;
|
||||
}
|
||||
|
||||
public void setIsRead(Boolean isRead) {
|
||||
this.isRead = isRead;
|
||||
}
|
||||
|
||||
public Integer getReadingPosition() {
|
||||
return readingPosition;
|
||||
}
|
||||
|
||||
public void setReadingPosition(Integer readingPosition) {
|
||||
this.readingPosition = readingPosition;
|
||||
}
|
||||
|
||||
public LocalDateTime getLastReadAt() {
|
||||
return lastReadAt;
|
||||
}
|
||||
|
||||
public void setLastReadAt(LocalDateTime lastReadAt) {
|
||||
this.lastReadAt = lastReadAt;
|
||||
}
|
||||
|
||||
public UUID getAuthorId() {
|
||||
return authorId;
|
||||
}
|
||||
|
||||
@@ -15,6 +15,7 @@ public class TagDto {
|
||||
private String name;
|
||||
|
||||
private Integer storyCount;
|
||||
private Integer collectionCount;
|
||||
private LocalDateTime createdAt;
|
||||
private LocalDateTime updatedAt;
|
||||
|
||||
@@ -49,6 +50,14 @@ public class TagDto {
|
||||
this.storyCount = storyCount;
|
||||
}
|
||||
|
||||
public Integer getCollectionCount() {
|
||||
return collectionCount;
|
||||
}
|
||||
|
||||
public void setCollectionCount(Integer collectionCount) {
|
||||
this.collectionCount = collectionCount;
|
||||
}
|
||||
|
||||
public LocalDateTime getCreatedAt() {
|
||||
return createdAt;
|
||||
}
|
||||
|
||||
@@ -52,6 +52,10 @@ public class Collection {
|
||||
)
|
||||
private Set<Tag> tags = new HashSet<>();
|
||||
|
||||
// Transient field for search results - tag names only to avoid lazy loading issues
|
||||
@Transient
|
||||
private List<String> tagNames;
|
||||
|
||||
@CreationTimestamp
|
||||
@Column(name = "created_at", nullable = false, updatable = false)
|
||||
private LocalDateTime createdAt;
|
||||
@@ -192,6 +196,14 @@ public class Collection {
|
||||
this.tags = tags;
|
||||
}
|
||||
|
||||
public List<String> getTagNames() {
|
||||
return tagNames;
|
||||
}
|
||||
|
||||
public void setTagNames(List<String> tagNames) {
|
||||
this.tagNames = tagNames;
|
||||
}
|
||||
|
||||
public LocalDateTime getCreatedAt() {
|
||||
return createdAt;
|
||||
}
|
||||
|
||||
backend/src/main/java/com/storycove/entity/ReadingPosition.java (new file, 230 lines)
@@ -0,0 +1,230 @@
|
||||
package com.storycove.entity;
|
||||
|
||||
import jakarta.persistence.*;
|
||||
import jakarta.validation.constraints.NotNull;
|
||||
import org.hibernate.annotations.CreationTimestamp;
|
||||
import org.hibernate.annotations.UpdateTimestamp;
|
||||
import com.fasterxml.jackson.annotation.JsonBackReference;
|
||||
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.UUID;
|
||||
|
||||
@Entity
|
||||
@Table(name = "reading_positions", indexes = {
|
||||
@Index(name = "idx_reading_position_story", columnList = "story_id")
|
||||
})
|
||||
public class ReadingPosition {
|
||||
|
||||
@Id
|
||||
@GeneratedValue(strategy = GenerationType.UUID)
|
||||
private UUID id;
|
||||
|
||||
@NotNull
|
||||
@ManyToOne(fetch = FetchType.LAZY)
|
||||
@JoinColumn(name = "story_id", nullable = false)
|
||||
@JsonBackReference("story-reading-positions")
|
||||
private Story story;
|
||||
|
||||
@Column(name = "chapter_index")
|
||||
private Integer chapterIndex;
|
||||
|
||||
@Column(name = "chapter_title")
|
||||
private String chapterTitle;
|
||||
|
||||
@Column(name = "word_position")
|
||||
private Integer wordPosition;
|
||||
|
||||
@Column(name = "character_position")
|
||||
private Integer characterPosition;
|
||||
|
||||
@Column(name = "percentage_complete")
|
||||
private Double percentageComplete;
|
||||
|
||||
@Column(name = "epub_cfi", columnDefinition = "TEXT")
|
||||
private String epubCfi;
|
||||
|
||||
@Column(name = "context_before", length = 500)
|
||||
private String contextBefore;
|
||||
|
||||
@Column(name = "context_after", length = 500)
|
||||
private String contextAfter;
|
||||
|
||||
@CreationTimestamp
|
||||
@Column(name = "created_at", nullable = false, updatable = false)
|
||||
private LocalDateTime createdAt;
|
||||
|
||||
@UpdateTimestamp
|
||||
@Column(name = "updated_at", nullable = false)
|
||||
private LocalDateTime updatedAt;
|
||||
|
||||
public ReadingPosition() {}
|
||||
|
||||
public ReadingPosition(Story story) {
|
||||
this.story = story;
|
||||
this.chapterIndex = 0;
|
||||
this.wordPosition = 0;
|
||||
this.characterPosition = 0;
|
||||
this.percentageComplete = 0.0;
|
||||
}
|
||||
|
||||
public ReadingPosition(Story story, Integer chapterIndex, Integer wordPosition) {
|
||||
this.story = story;
|
||||
this.chapterIndex = chapterIndex;
|
||||
this.wordPosition = wordPosition;
|
||||
this.characterPosition = 0;
|
||||
this.percentageComplete = 0.0;
|
||||
}
|
||||
|
||||
public void updatePosition(Integer chapterIndex, Integer wordPosition, Integer characterPosition) {
|
||||
this.chapterIndex = chapterIndex;
|
||||
this.wordPosition = wordPosition;
|
||||
this.characterPosition = characterPosition;
|
||||
calculatePercentageComplete();
|
||||
}
|
||||
|
||||
public void updatePositionWithCfi(String epubCfi, Integer chapterIndex, Integer wordPosition) {
|
||||
this.epubCfi = epubCfi;
|
||||
this.chapterIndex = chapterIndex;
|
||||
this.wordPosition = wordPosition;
|
||||
calculatePercentageComplete();
|
||||
}
|
||||
|
||||
private void calculatePercentageComplete() {
|
||||
if (story != null && story.getWordCount() != null && story.getWordCount() > 0) {
|
||||
int totalWords = story.getWordCount();
|
||||
int currentPosition = (chapterIndex != null ? chapterIndex * 1000 : 0) +
|
||||
(wordPosition != null ? wordPosition : 0);
|
||||
this.percentageComplete = Math.min(100.0, (double) currentPosition / totalWords * 100);
|
||||
}
|
||||
}
|
||||
|
||||
public boolean isAtBeginning() {
|
||||
return (chapterIndex == null || chapterIndex == 0) &&
|
||||
(wordPosition == null || wordPosition == 0);
|
||||
}
|
||||
|
||||
public boolean isCompleted() {
|
||||
return percentageComplete != null && percentageComplete >= 95.0;
|
||||
}
|
||||
|
||||
// Getters and Setters
|
||||
public UUID getId() {
|
||||
return id;
|
||||
}
|
||||
|
||||
public void setId(UUID id) {
|
||||
this.id = id;
|
||||
}
|
||||
|
||||
public Story getStory() {
|
||||
return story;
|
||||
}
|
||||
|
||||
public void setStory(Story story) {
|
||||
this.story = story;
|
||||
}
|
||||
|
||||
public Integer getChapterIndex() {
|
||||
return chapterIndex;
|
||||
}
|
||||
|
||||
public void setChapterIndex(Integer chapterIndex) {
|
||||
this.chapterIndex = chapterIndex;
|
||||
}
|
||||
|
||||
public String getChapterTitle() {
|
||||
return chapterTitle;
|
||||
}
|
||||
|
||||
public void setChapterTitle(String chapterTitle) {
|
||||
this.chapterTitle = chapterTitle;
|
||||
}
|
||||
|
||||
public Integer getWordPosition() {
|
||||
return wordPosition;
|
||||
}
|
||||
|
||||
public void setWordPosition(Integer wordPosition) {
|
||||
this.wordPosition = wordPosition;
|
||||
}
|
||||
|
||||
public Integer getCharacterPosition() {
|
||||
return characterPosition;
|
||||
}
|
||||
|
||||
public void setCharacterPosition(Integer characterPosition) {
|
||||
this.characterPosition = characterPosition;
|
||||
}
|
||||
|
||||
public Double getPercentageComplete() {
|
||||
return percentageComplete;
|
||||
}
|
||||
|
||||
public void setPercentageComplete(Double percentageComplete) {
|
||||
this.percentageComplete = percentageComplete;
|
||||
}
|
||||
|
||||
public String getEpubCfi() {
|
||||
return epubCfi;
|
||||
}
|
||||
|
||||
public void setEpubCfi(String epubCfi) {
|
||||
this.epubCfi = epubCfi;
|
||||
}
|
||||
|
||||
public String getContextBefore() {
|
||||
return contextBefore;
|
||||
}
|
||||
|
||||
public void setContextBefore(String contextBefore) {
|
||||
this.contextBefore = contextBefore;
|
||||
}
|
||||
|
||||
public String getContextAfter() {
|
||||
return contextAfter;
|
||||
}
|
||||
|
||||
public void setContextAfter(String contextAfter) {
|
||||
this.contextAfter = contextAfter;
|
||||
}
|
||||
|
||||
public LocalDateTime getCreatedAt() {
|
||||
return createdAt;
|
||||
}
|
||||
|
||||
public void setCreatedAt(LocalDateTime createdAt) {
|
||||
this.createdAt = createdAt;
|
||||
}
|
||||
|
||||
public LocalDateTime getUpdatedAt() {
|
||||
return updatedAt;
|
||||
}
|
||||
|
||||
public void setUpdatedAt(LocalDateTime updatedAt) {
|
||||
this.updatedAt = updatedAt;
|
||||
}
|
||||
|
||||
@Override
|
||||
public boolean equals(Object o) {
|
||||
if (this == o) return true;
|
||||
if (!(o instanceof ReadingPosition)) return false;
|
||||
ReadingPosition that = (ReadingPosition) o;
|
||||
return id != null && id.equals(that.id);
|
||||
}
|
||||
|
||||
@Override
|
||||
public int hashCode() {
|
||||
return getClass().hashCode();
|
||||
}
|
||||
|
||||
@Override
|
||||
public String toString() {
|
||||
return "ReadingPosition{" +
|
||||
"id=" + id +
|
||||
", storyId=" + (story != null ? story.getId() : null) +
|
||||
", chapterIndex=" + chapterIndex +
|
||||
", wordPosition=" + wordPosition +
|
||||
", percentageComplete=" + percentageComplete +
|
||||
'}';
|
||||
}
|
||||
}
|
||||
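To illustrate the percentage heuristic in ReadingPosition above (calculatePercentageComplete weights each chapter as 1,000 words), a standalone sketch; it assumes Story has a public no-arg constructor and a setWordCount setter, which the surrounding entity code implies but this diff does not show:

```java
import com.storycove.entity.ReadingPosition;
import com.storycove.entity.Story;

public class ReadingPositionDemo {
    public static void main(String[] args) {
        Story story = new Story();          // assumed no-arg constructor (JPA entity)
        story.setWordCount(10_000);         // assumed setter for the wordCount field

        ReadingPosition position = new ReadingPosition(story);
        // chapterIndex contributes 1000 "words" per chapter in calculatePercentageComplete,
        // so chapter 2 + word 500 counts as 2500 of 10000 words.
        position.updatePosition(2, 500, 0);
        System.out.println(position.getPercentageComplete()); // ~25.0
        System.out.println(position.isCompleted());           // false (below the 95% threshold)
    }
}
```
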
@@ -55,6 +55,15 @@ public class Story {
    @Column(name = "volume")
    private Integer volume;

    @Column(name = "is_read")
    private Boolean isRead = false;

    @Column(name = "reading_position")
    private Integer readingPosition = 0;

    @Column(name = "last_read_at")
    private LocalDateTime lastReadAt;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "author_id")
    @JsonBackReference("author-stories")
@@ -212,6 +221,30 @@ public class Story {
|
||||
this.volume = volume;
|
||||
}
|
||||
|
||||
public Boolean getIsRead() {
|
||||
return isRead;
|
||||
}
|
||||
|
||||
public void setIsRead(Boolean isRead) {
|
||||
this.isRead = isRead;
|
||||
}
|
||||
|
||||
public Integer getReadingPosition() {
|
||||
return readingPosition;
|
||||
}
|
||||
|
||||
public void setReadingPosition(Integer readingPosition) {
|
||||
this.readingPosition = readingPosition;
|
||||
}
|
||||
|
||||
public LocalDateTime getLastReadAt() {
|
||||
return lastReadAt;
|
||||
}
|
||||
|
||||
public void setLastReadAt(LocalDateTime lastReadAt) {
|
||||
this.lastReadAt = lastReadAt;
|
||||
}
|
||||
|
||||
public Author getAuthor() {
|
||||
return author;
|
||||
}
|
||||
@@ -252,6 +285,37 @@ public class Story {
        this.updatedAt = updatedAt;
    }

    /**
     * Updates the reading progress and timestamp
     */
    public void updateReadingProgress(Integer position) {
        this.readingPosition = position;
        this.lastReadAt = LocalDateTime.now();
    }

    /**
     * Marks the story as read and updates the reading position to the end
     */
    public void markAsRead() {
        this.isRead = true;
        this.lastReadAt = LocalDateTime.now();
        // Set reading position to the end of content if available
        if (contentPlain != null) {
            this.readingPosition = contentPlain.length();
        } else if (contentHtml != null) {
            this.readingPosition = contentHtml.length();
        }
    }

    /**
     * Marks the story as unread and resets reading position
     */
    public void markAsUnread() {
        this.isRead = false;
        this.readingPosition = 0;
        this.lastReadAt = null;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
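A small illustration of the reading-progress helpers added above; it assumes Story exposes a no-arg constructor and a setContentPlain setter, which are not shown in this hunk:

```java
import com.storycove.entity.Story;

public class StoryReadingDemo {
    public static void main(String[] args) {
        Story story = new Story();                      // assumed no-arg constructor
        story.setContentPlain("Once upon a time...");   // assumed setter; 19 characters

        story.updateReadingProgress(10);
        System.out.println(story.getReadingPosition()); // 10
        System.out.println(story.getIsRead());          // false (default)

        story.markAsRead();
        System.out.println(story.getReadingPosition()); // 19, the end of contentPlain
        System.out.println(story.getIsRead());          // true

        story.markAsUnread();
        System.out.println(story.getReadingPosition()); // 0
        System.out.println(story.getLastReadAt());      // null
    }
}
```
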
@@ -272,6 +336,8 @@ public class Story {
|
||||
", title='" + title + '\'' +
|
||||
", wordCount=" + wordCount +
|
||||
", rating=" + rating +
|
||||
", isRead=" + isRead +
|
||||
", readingPosition=" + readingPosition +
|
||||
'}';
|
||||
}
|
||||
}
|
||||
@@ -29,6 +29,10 @@ public class Tag {
|
||||
@JsonBackReference("story-tags")
|
||||
private Set<Story> stories = new HashSet<>();
|
||||
|
||||
@ManyToMany(mappedBy = "tags")
|
||||
@JsonBackReference("collection-tags")
|
||||
private Set<Collection> collections = new HashSet<>();
|
||||
|
||||
@CreationTimestamp
|
||||
@Column(name = "created_at", nullable = false, updatable = false)
|
||||
private LocalDateTime createdAt;
|
||||
@@ -67,6 +71,14 @@ public class Tag {
|
||||
this.stories = stories;
|
||||
}
|
||||
|
||||
public Set<Collection> getCollections() {
|
||||
return collections;
|
||||
}
|
||||
|
||||
public void setCollections(Set<Collection> collections) {
|
||||
this.collections = collections;
|
||||
}
|
||||
|
||||
public LocalDateTime getCreatedAt() {
|
||||
return createdAt;
|
||||
}
|
||||
|
||||
@@ -4,6 +4,7 @@ import com.storycove.entity.Author;
|
||||
import org.springframework.data.domain.Page;
|
||||
import org.springframework.data.domain.Pageable;
|
||||
import org.springframework.data.jpa.repository.JpaRepository;
|
||||
import org.springframework.data.jpa.repository.Modifying;
|
||||
import org.springframework.data.jpa.repository.Query;
|
||||
import org.springframework.data.repository.query.Param;
|
||||
import org.springframework.stereotype.Repository;
|
||||
@@ -52,4 +53,5 @@ public interface AuthorRepository extends JpaRepository<Author, UUID> {
|
||||
|
||||
@Query(value = "SELECT author_rating FROM authors WHERE id = :id", nativeQuery = true)
|
||||
Integer findAuthorRatingById(@Param("id") UUID id);
|
||||
|
||||
}
|
||||
@@ -2,6 +2,7 @@ package com.storycove.repository;
|
||||
|
||||
import com.storycove.entity.Collection;
|
||||
import org.springframework.data.jpa.repository.JpaRepository;
|
||||
import org.springframework.data.jpa.repository.Modifying;
|
||||
import org.springframework.data.jpa.repository.Query;
|
||||
import org.springframework.data.repository.query.Param;
|
||||
import org.springframework.stereotype.Repository;
|
||||
@@ -45,4 +46,11 @@ public interface CollectionRepository extends JpaRepository<Collection, UUID> {
|
||||
*/
|
||||
@Query("SELECT c FROM Collection c WHERE c.isArchived = false ORDER BY c.updatedAt DESC")
|
||||
List<Collection> findAllActiveCollections();
|
||||
|
||||
/**
|
||||
* Find all collections with tags for reindexing operations
|
||||
*/
|
||||
@Query("SELECT c FROM Collection c LEFT JOIN FETCH c.tags ORDER BY c.updatedAt DESC")
|
||||
List<Collection> findAllWithTags();
|
||||
|
||||
}
|
||||
backend/src/main/java/com/storycove/repository/ReadingPositionRepository.java (new file, 57 lines)
@@ -0,0 +1,57 @@
package com.storycove.repository;

import com.storycove.entity.ReadingPosition;
import com.storycove.entity.Story;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.query.Param;
import org.springframework.stereotype.Repository;

import java.time.LocalDateTime;
import java.util.List;
import java.util.Optional;
import java.util.UUID;

@Repository
public interface ReadingPositionRepository extends JpaRepository<ReadingPosition, UUID> {

    Optional<ReadingPosition> findByStoryId(UUID storyId);

    Optional<ReadingPosition> findByStory(Story story);

    List<ReadingPosition> findByStoryIdIn(List<UUID> storyIds);

    @Query("SELECT rp FROM ReadingPosition rp WHERE rp.story.id = :storyId ORDER BY rp.updatedAt DESC")
    List<ReadingPosition> findByStoryIdOrderByUpdatedAtDesc(@Param("storyId") UUID storyId);

    @Query("SELECT rp FROM ReadingPosition rp WHERE rp.percentageComplete >= :minPercentage")
    List<ReadingPosition> findByMinimumPercentageComplete(@Param("minPercentage") Double minPercentage);

    @Query("SELECT rp FROM ReadingPosition rp WHERE rp.percentageComplete >= 95.0")
    List<ReadingPosition> findCompletedReadings();

    @Query("SELECT rp FROM ReadingPosition rp WHERE rp.percentageComplete > 0 AND rp.percentageComplete < 95.0")
    List<ReadingPosition> findInProgressReadings();

    @Query("SELECT rp FROM ReadingPosition rp WHERE rp.updatedAt >= :since ORDER BY rp.updatedAt DESC")
    List<ReadingPosition> findRecentlyUpdated(@Param("since") LocalDateTime since);

    @Query("SELECT rp FROM ReadingPosition rp ORDER BY rp.updatedAt DESC")
    List<ReadingPosition> findAllOrderByUpdatedAtDesc();

    @Query("SELECT COUNT(rp) FROM ReadingPosition rp WHERE rp.percentageComplete >= 95.0")
    long countCompletedReadings();

    @Query("SELECT COUNT(rp) FROM ReadingPosition rp WHERE rp.percentageComplete > 0 AND rp.percentageComplete < 95.0")
    long countInProgressReadings();

    @Query("SELECT AVG(rp.percentageComplete) FROM ReadingPosition rp WHERE rp.percentageComplete > 0")
    Double findAverageReadingProgress();

    @Query("SELECT rp FROM ReadingPosition rp WHERE rp.epubCfi IS NOT NULL")
    List<ReadingPosition> findPositionsWithEpubCfi();

    boolean existsByStoryId(UUID storyId);

    void deleteByStoryId(UUID storyId);
}
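One plausible way a service could use this repository to upsert a position per story; the class and method names below are illustrative and not taken from the codebase:

```java
import com.storycove.entity.ReadingPosition;
import com.storycove.entity.Story;
import com.storycove.repository.ReadingPositionRepository;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class ReadingPositionUpserter {

    private final ReadingPositionRepository readingPositionRepository;

    public ReadingPositionUpserter(ReadingPositionRepository readingPositionRepository) {
        this.readingPositionRepository = readingPositionRepository;
    }

    @Transactional
    public ReadingPosition savePosition(Story story, int chapterIndex, int wordPosition) {
        // Reuse the existing row for this story if there is one, otherwise start fresh.
        ReadingPosition position = readingPositionRepository
                .findByStoryId(story.getId())
                .orElseGet(() -> new ReadingPosition(story));
        position.updatePosition(chapterIndex, wordPosition, 0);
        return readingPositionRepository.save(position);
    }
}
```
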
@@ -7,6 +7,7 @@ import com.storycove.entity.Tag;
|
||||
import org.springframework.data.domain.Page;
|
||||
import org.springframework.data.domain.Pageable;
|
||||
import org.springframework.data.jpa.repository.JpaRepository;
|
||||
import org.springframework.data.jpa.repository.Modifying;
|
||||
import org.springframework.data.jpa.repository.Query;
|
||||
import org.springframework.data.repository.query.Param;
|
||||
import org.springframework.stereotype.Repository;
|
||||
@@ -114,4 +115,8 @@ public interface StoryRepository extends JpaRepository<Story, UUID> {
|
||||
"LEFT JOIN FETCH s.series " +
|
||||
"LEFT JOIN FETCH s.tags")
|
||||
List<Story> findAllWithAssociations();
|
||||
|
||||
@Query("SELECT s FROM Story s WHERE UPPER(s.title) = UPPER(:title) AND UPPER(s.author.name) = UPPER(:authorName)")
|
||||
List<Story> findByTitleAndAuthorNameIgnoreCase(@Param("title") String title, @Param("authorName") String authorName);
|
||||
|
||||
}
|
||||
@@ -54,4 +54,7 @@ public interface TagRepository extends JpaRepository<Tag, UUID> {
|
||||
|
||||
@Query("SELECT COUNT(t) FROM Tag t WHERE SIZE(t.stories) > 0")
|
||||
long countUsedTags();
|
||||
|
||||
@Query("SELECT t FROM Tag t WHERE SIZE(t.collections) > 0 ORDER BY SIZE(t.collections) DESC, t.name ASC")
|
||||
List<Tag> findTagsUsedByCollections();
|
||||
}
|
||||
@@ -31,7 +31,7 @@ public class AuthorService {
    private final TypesenseService typesenseService;

    @Autowired
-   public AuthorService(AuthorRepository authorRepository, TypesenseService typesenseService) {
+   public AuthorService(AuthorRepository authorRepository, @Autowired(required = false) TypesenseService typesenseService) {
        this.authorRepository = authorRepository;
        this.typesenseService = typesenseService;
    }
@@ -133,10 +133,12 @@ public class AuthorService {
        Author savedAuthor = authorRepository.save(author);

        // Index in Typesense
-       try {
-           typesenseService.indexAuthor(savedAuthor);
-       } catch (Exception e) {
-           logger.warn("Failed to index author in Typesense: " + savedAuthor.getName(), e);
+       if (typesenseService != null) {
+           try {
+               typesenseService.indexAuthor(savedAuthor);
+           } catch (Exception e) {
+               logger.warn("Failed to index author in Typesense: " + savedAuthor.getName(), e);
+           }
        }

        return savedAuthor;
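The same guard (skip indexing when no TypesenseService bean is present, and never let an indexing failure break the save) is repeated across the AuthorService hunks that follow. A hypothetical helper, not part of this change set, that would centralize the pattern:

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.util.function.Consumer;

public final class SearchIndexing {

    private static final Logger logger = LoggerFactory.getLogger(SearchIndexing.class);

    private SearchIndexing() {}

    /** Runs the indexing operation only when the optional search service is present. */
    public static <S> void ifPresent(S service, String description, Consumer<S> operation) {
        if (service == null) {
            return; // search is an optional dependency; silently skip indexing
        }
        try {
            operation.accept(service);
        } catch (Exception e) {
            logger.warn("Failed to {} in Typesense", description, e);
        }
    }
}

// Example usage after a save:
//   SearchIndexing.ifPresent(typesenseService, "index author " + savedAuthor.getName(),
//           s -> s.indexAuthor(savedAuthor));
```
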
@@ -155,10 +157,12 @@ public class AuthorService {
|
||||
Author savedAuthor = authorRepository.save(existingAuthor);
|
||||
|
||||
// Update in Typesense
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense: " + savedAuthor.getName(), e);
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
|
||||
return savedAuthor;
|
||||
@@ -175,10 +179,12 @@ public class AuthorService {
|
||||
authorRepository.delete(author);
|
||||
|
||||
// Remove from Typesense
|
||||
try {
|
||||
typesenseService.deleteAuthor(id.toString());
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to delete author from Typesense: " + author.getName(), e);
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.deleteAuthor(id.toString());
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to delete author from Typesense: " + author.getName(), e);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -188,10 +194,12 @@ public class AuthorService {
|
||||
Author savedAuthor = authorRepository.save(author);
|
||||
|
||||
// Update in Typesense
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after adding URL: " + savedAuthor.getName(), e);
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after adding URL: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
|
||||
return savedAuthor;
|
||||
@@ -203,10 +211,12 @@ public class AuthorService {
|
||||
Author savedAuthor = authorRepository.save(author);
|
||||
|
||||
// Update in Typesense
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after removing URL: " + savedAuthor.getName(), e);
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after removing URL: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
|
||||
return savedAuthor;
|
||||
@@ -242,10 +252,12 @@ public class AuthorService {
|
||||
refreshedAuthor.getAuthorRating(), refreshedAuthor.getName());
|
||||
|
||||
// Update in Typesense
|
||||
try {
|
||||
typesenseService.updateAuthor(refreshedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after rating: " + refreshedAuthor.getName(), e);
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(refreshedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after rating: " + refreshedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
|
||||
return refreshedAuthor;
|
||||
@@ -290,10 +302,12 @@ public class AuthorService {
|
||||
Author savedAuthor = authorRepository.save(author);
|
||||
|
||||
// Update in Typesense
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after setting avatar: " + savedAuthor.getName(), e);
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after setting avatar: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
|
||||
return savedAuthor;
|
||||
@@ -305,10 +319,12 @@ public class AuthorService {
|
||||
Author savedAuthor = authorRepository.save(author);
|
||||
|
||||
// Update in Typesense
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after removing avatar: " + savedAuthor.getName(), e);
|
||||
if (typesenseService != null) {
|
||||
try {
|
||||
typesenseService.updateAuthor(savedAuthor);
|
||||
} catch (Exception e) {
|
||||
logger.warn("Failed to update author in Typesense after removing avatar: " + savedAuthor.getName(), e);
|
||||
}
|
||||
}
|
||||
|
||||
return savedAuthor;
|
||||
|
||||
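Every Typesense call in `AuthorService` now repeats the same guard-and-log pattern. A small helper could express it once; the sketch below is an illustration of that idea, not code from the changeset, and the helper name is made up.

```java
// Illustrative helper (not in the changeset): wraps the repeated
// "if (typesenseService != null) { try { ... } catch (Exception e) { logger.warn(...); } }" pattern.
private void withTypesense(java.util.function.Consumer<TypesenseService> action, String failureMessage) {
    if (typesenseService == null) {
        return; // search indexing disabled
    }
    try {
        action.accept(typesenseService);
    } catch (Exception e) {
        logger.warn(failureMessage, e);
    }
}

// Usage at the call sites shown above:
// withTypesense(ts -> ts.indexAuthor(savedAuthor),
//         "Failed to index author in Typesense: " + savedAuthor.getName());
```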
**`CollectionSearchResult.java`**

@@ -10,6 +10,7 @@ public class CollectionSearchResult extends Collection {
 
     private Integer storedStoryCount;
     private Integer storedTotalWordCount;
+    private int wordsPerMinute = 200; // Default, can be overridden
 
     public CollectionSearchResult(Collection collection) {
         this.setId(collection.getId());
@@ -20,6 +21,7 @@ public class CollectionSearchResult extends Collection {
         this.setCreatedAt(collection.getCreatedAt());
         this.setUpdatedAt(collection.getUpdatedAt());
         this.setCoverImagePath(collection.getCoverImagePath());
+        this.setTagNames(collection.getTagNames()); // Copy tag names for search results
         // Note: don't copy collectionStories or tags to avoid lazy loading issues
     }
 
@@ -31,6 +33,10 @@ public class CollectionSearchResult extends Collection {
         this.storedTotalWordCount = totalWordCount;
     }
 
+    public void setWordsPerMinute(int wordsPerMinute) {
+        this.wordsPerMinute = wordsPerMinute;
+    }
+
     @Override
     public int getStoryCount() {
         return storedStoryCount != null ? storedStoryCount : 0;
@@ -43,8 +49,7 @@ public class CollectionSearchResult extends Collection {
 
     @Override
     public int getEstimatedReadingTime() {
-        // Assuming 200 words per minute reading speed
-        return Math.max(1, getTotalWordCount() / 200);
+        return Math.max(1, getTotalWordCount() / wordsPerMinute);
     }
 
     @Override
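With the new `wordsPerMinute` field, `getEstimatedReadingTime()` follows a configurable reading speed instead of the hard-coded 200. A sketch of how a caller might populate it; the user-settings lookup is assumed, not part of this changeset.

```java
// Illustrative only: wire a per-user reading speed into a search result.
// userSettings.getWordsPerMinute() is an assumed lookup, not part of this changeset.
CollectionSearchResult result = new CollectionSearchResult(collection);
result.setWordsPerMinute(userSettings.getWordsPerMinute());
int minutes = result.getEstimatedReadingTime(); // Math.max(1, totalWordCount / wordsPerMinute)
```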
**`CollectionService.java`**

@@ -1,6 +1,8 @@
 package com.storycove.service;
 
 import com.storycove.dto.SearchResultDto;
+import com.storycove.dto.StoryReadingDto;
+import com.storycove.dto.TagDto;
 import com.storycove.entity.Collection;
 import com.storycove.entity.CollectionStory;
 import com.storycove.entity.Story;
@@ -34,18 +36,21 @@ public class CollectionService {
     private final StoryRepository storyRepository;
     private final TagRepository tagRepository;
     private final TypesenseService typesenseService;
+    private final ReadingTimeService readingTimeService;
 
     @Autowired
     public CollectionService(CollectionRepository collectionRepository,
                              CollectionStoryRepository collectionStoryRepository,
                              StoryRepository storyRepository,
                              TagRepository tagRepository,
-                             @Autowired(required = false) TypesenseService typesenseService) {
+                             @Autowired(required = false) TypesenseService typesenseService,
+                             ReadingTimeService readingTimeService) {
         this.collectionRepository = collectionRepository;
         this.collectionStoryRepository = collectionStoryRepository;
         this.storyRepository = storyRepository;
         this.tagRepository = tagRepository;
         this.typesenseService = typesenseService;
+        this.readingTimeService = readingTimeService;
     }
 
     /**
@@ -78,6 +83,13 @@ public class CollectionService {
                 .orElseThrow(() -> new ResourceNotFoundException("Collection not found with id: " + id));
     }
 
+    /**
+     * Find all collections with tags for reindexing
+     */
+    public List<Collection> findAllWithTags() {
+        return collectionRepository.findAllWithTags();
+    }
+
     /**
      * Create a new collection with optional initial stories
      */
@@ -326,7 +338,7 @@ public class CollectionService {
         );
 
         return Map.of(
-            "story", story,
+            "story", convertToReadingDto(story),
             "collection", collectionContext
         );
     }
@@ -344,7 +356,7 @@ public class CollectionService {
         int totalWordCount = collectionStories.stream()
                 .mapToInt(cs -> cs.getStory().getWordCount() != null ? cs.getStory().getWordCount() : 0)
                 .sum();
-        int estimatedReadingTime = Math.max(1, totalWordCount / 200); // 200 words per minute
+        int estimatedReadingTime = readingTimeService.calculateReadingTime(totalWordCount);
 
         double averageStoryRating = collectionStories.stream()
                 .filter(cs -> cs.getStory().getRating() != null)
@@ -420,4 +432,49 @@ public class CollectionService {
     public List<Collection> findAllForIndexing() {
         return collectionRepository.findAllActiveCollections();
     }
+
+    private StoryReadingDto convertToReadingDto(Story story) {
+        StoryReadingDto dto = new StoryReadingDto();
+        dto.setId(story.getId());
+        dto.setTitle(story.getTitle());
+        dto.setSummary(story.getSummary());
+        dto.setDescription(story.getDescription());
+        dto.setContentHtml(story.getContentHtml());
+        dto.setSourceUrl(story.getSourceUrl());
+        dto.setCoverPath(story.getCoverPath());
+        dto.setWordCount(story.getWordCount());
+        dto.setRating(story.getRating());
+        dto.setVolume(story.getVolume());
+        dto.setCreatedAt(story.getCreatedAt());
+        dto.setUpdatedAt(story.getUpdatedAt());
+
+        // Reading progress fields
+        dto.setIsRead(story.getIsRead());
+        dto.setReadingPosition(story.getReadingPosition());
+        dto.setLastReadAt(story.getLastReadAt());
+
+        if (story.getAuthor() != null) {
+            dto.setAuthorId(story.getAuthor().getId());
+            dto.setAuthorName(story.getAuthor().getName());
+        }
+
+        if (story.getSeries() != null) {
+            dto.setSeriesId(story.getSeries().getId());
+            dto.setSeriesName(story.getSeries().getName());
+        }
+
+        dto.setTags(story.getTags().stream()
+                .map(this::convertTagToDto)
+                .collect(Collectors.toList()));
+
+        return dto;
+    }
+
+    private TagDto convertTagToDto(Tag tag) {
+        TagDto dto = new TagDto();
+        dto.setId(tag.getId());
+        dto.setName(tag.getName());
+        dto.setStoryCount(tag.getStories().size());
+        return dto;
+    }
 }
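`CollectionService` now delegates reading-time math to a `ReadingTimeService`, which is not part of this diff. A minimal implementation consistent with how it is called here might look like the sketch below; the class body and the configuration property name are assumptions for illustration only.

```java
// Assumed shape of ReadingTimeService: only calculateReadingTime(int) is implied by the diff.
// The @Value property name is illustrative; 200 wpm matches the old inline default.
package com.storycove.service;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;

@Service
public class ReadingTimeService {

    @Value("${storycove.reading.words-per-minute:200}")
    private int wordsPerMinute;

    public int calculateReadingTime(int totalWordCount) {
        // Never report less than one minute, mirroring the previous Math.max(1, ...) behaviour.
        return Math.max(1, totalWordCount / wordsPerMinute);
    }
}
```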
@@ -0,0 +1,658 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.fasterxml.jackson.databind.ObjectMapper;
|
||||
import com.storycove.entity.*;
|
||||
import com.storycove.repository.*;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.core.io.ByteArrayResource;
|
||||
import org.springframework.core.io.Resource;
|
||||
import org.springframework.stereotype.Service;
|
||||
import org.springframework.transaction.annotation.Transactional;
|
||||
|
||||
import javax.sql.DataSource;
|
||||
import java.io.*;
|
||||
import java.nio.charset.StandardCharsets;
|
||||
import java.nio.file.*;
|
||||
import java.sql.*;
|
||||
import java.time.LocalDateTime;
|
||||
import java.time.format.DateTimeFormatter;
|
||||
import java.util.*;
|
||||
import java.util.zip.ZipEntry;
|
||||
import java.util.zip.ZipInputStream;
|
||||
import java.util.zip.ZipOutputStream;
|
||||
|
||||
@Service
|
||||
public class DatabaseManagementService {
|
||||
|
||||
@Autowired
|
||||
private DataSource dataSource;
|
||||
|
||||
@Autowired
|
||||
private StoryRepository storyRepository;
|
||||
|
||||
@Autowired
|
||||
private AuthorRepository authorRepository;
|
||||
|
||||
@Autowired
|
||||
private SeriesRepository seriesRepository;
|
||||
|
||||
@Autowired
|
||||
private TagRepository tagRepository;
|
||||
|
||||
@Autowired
|
||||
private CollectionRepository collectionRepository;
|
||||
|
||||
@Autowired
|
||||
private TypesenseService typesenseService;
|
||||
|
||||
@Autowired
|
||||
private ReadingPositionRepository readingPositionRepository;
|
||||
|
||||
@Value("${storycove.images.upload-dir:/app/images}")
|
||||
private String uploadDir;
|
||||
|
||||
/**
|
||||
* Create a comprehensive backup including database and files in ZIP format
|
||||
*/
|
||||
public Resource createCompleteBackup() throws SQLException, IOException {
|
||||
Path tempZip = Files.createTempFile("storycove-backup", ".zip");
|
||||
|
||||
try (ZipOutputStream zipOut = new ZipOutputStream(Files.newOutputStream(tempZip))) {
|
||||
// 1. Add database dump
|
||||
addDatabaseDumpToZip(zipOut);
|
||||
|
||||
// 2. Add all image files
|
||||
addFilesToZip(zipOut);
|
||||
|
||||
// 3. Add metadata
|
||||
addMetadataToZip(zipOut);
|
||||
}
|
||||
|
||||
// Return the ZIP file as a resource
|
||||
byte[] zipData = Files.readAllBytes(tempZip);
|
||||
Files.deleteIfExists(tempZip);
|
||||
|
||||
return new ByteArrayResource(zipData);
|
||||
}
|
||||
|
||||
/**
|
||||
* Restore from complete backup (ZIP format)
|
||||
*/
|
||||
public void restoreFromCompleteBackup(InputStream backupStream) throws IOException, SQLException {
|
||||
System.err.println("Starting complete backup restore...");
|
||||
Path tempDir = Files.createTempDirectory("storycove-restore");
|
||||
System.err.println("Created temp directory: " + tempDir);
|
||||
|
||||
try {
|
||||
// 1. Extract ZIP to temp directory
|
||||
System.err.println("Extracting ZIP archive...");
|
||||
extractZipArchive(backupStream, tempDir);
|
||||
System.err.println("ZIP extraction completed.");
|
||||
|
||||
// 2. Validate backup structure
|
||||
System.err.println("Validating backup structure...");
|
||||
validateBackupStructure(tempDir);
|
||||
System.err.println("Backup structure validation completed.");
|
||||
|
||||
// 3. Clear existing data and files
|
||||
System.err.println("Clearing existing data and files...");
|
||||
clearAllDataAndFiles();
|
||||
System.err.println("Clear operation completed.");
|
||||
|
||||
// 4. Restore database
|
||||
Path databaseFile = tempDir.resolve("database.sql");
|
||||
if (Files.exists(databaseFile)) {
|
||||
System.err.println("Restoring database from SQL file...");
|
||||
try (InputStream sqlStream = Files.newInputStream(databaseFile)) {
|
||||
restoreFromBackup(sqlStream);
|
||||
}
|
||||
System.err.println("Database restore completed.");
|
||||
} else {
|
||||
System.err.println("Warning: No database.sql file found in backup.");
|
||||
}
|
||||
|
||||
// 5. Restore files
|
||||
Path filesDir = tempDir.resolve("files");
|
||||
if (Files.exists(filesDir)) {
|
||||
System.err.println("Restoring files...");
|
||||
restoreFiles(filesDir);
|
||||
System.err.println("File restore completed.");
|
||||
} else {
|
||||
System.err.println("No files directory found in backup - skipping file restore.");
|
||||
}
|
||||
|
||||
System.err.println("Complete backup restore finished successfully.");
|
||||
|
||||
} catch (Exception e) {
|
||||
System.err.println("Error during complete backup restore: " + e.getMessage());
|
||||
e.printStackTrace();
|
||||
throw e;
|
||||
} finally {
|
||||
// Clean up temp directory
|
||||
System.err.println("Cleaning up temp directory: " + tempDir);
|
||||
deleteDirectory(tempDir);
|
||||
System.err.println("Cleanup completed.");
|
||||
}
|
||||
}
|
||||
|
||||
public Resource createBackup() throws SQLException, IOException {
|
||||
StringBuilder sqlDump = new StringBuilder();
|
||||
|
||||
try (Connection connection = dataSource.getConnection()) {
|
||||
// Add header
|
||||
sqlDump.append("-- StoryCove Database Backup\n");
|
||||
sqlDump.append("-- Generated at: ").append(new java.util.Date()).append("\n\n");
|
||||
|
||||
// Disable foreign key checks during restore (PostgreSQL syntax)
|
||||
sqlDump.append("SET session_replication_role = replica;\n\n");
|
||||
|
||||
// List of tables in dependency order (parents first for insertion)
|
||||
List<String> insertTables = Arrays.asList(
|
||||
"authors", "series", "tags", "collections",
|
||||
"stories", "story_tags", "author_urls", "collection_stories"
|
||||
);
|
||||
|
||||
// TRUNCATE in reverse order (children first)
|
||||
List<String> truncateTables = Arrays.asList(
|
||||
"collection_stories", "author_urls", "story_tags",
|
||||
"stories", "collections", "tags", "series", "authors"
|
||||
);
|
||||
|
||||
// Generate TRUNCATE statements for each table (assuming tables already exist)
|
||||
for (String tableName : truncateTables) {
|
||||
sqlDump.append("-- Truncate Table: ").append(tableName).append("\n");
|
||||
sqlDump.append("TRUNCATE TABLE \"").append(tableName).append("\" CASCADE;\n");
|
||||
}
|
||||
sqlDump.append("\n");
|
||||
|
||||
// Generate INSERT statements in dependency order
|
||||
for (String tableName : insertTables) {
|
||||
sqlDump.append("-- Data for Table: ").append(tableName).append("\n");
|
||||
|
||||
// Get table data
|
||||
try (PreparedStatement stmt = connection.prepareStatement("SELECT * FROM \"" + tableName + "\"");
|
||||
ResultSet rs = stmt.executeQuery()) {
|
||||
|
||||
ResultSetMetaData metaData = rs.getMetaData();
|
||||
int columnCount = metaData.getColumnCount();
|
||||
|
||||
// Build column names for INSERT statement
|
||||
StringBuilder columnNames = new StringBuilder();
|
||||
for (int i = 1; i <= columnCount; i++) {
|
||||
if (i > 1) columnNames.append(", ");
|
||||
columnNames.append("\"").append(metaData.getColumnName(i)).append("\"");
|
||||
}
|
||||
|
||||
while (rs.next()) {
|
||||
sqlDump.append("INSERT INTO \"").append(tableName).append("\" (")
|
||||
.append(columnNames).append(") VALUES (");
|
||||
|
||||
for (int i = 1; i <= columnCount; i++) {
|
||||
if (i > 1) sqlDump.append(", ");
|
||||
|
||||
Object value = rs.getObject(i);
|
||||
sqlDump.append(formatSqlValue(value));
|
||||
}
|
||||
|
||||
sqlDump.append(");\n");
|
||||
}
|
||||
}
|
||||
|
||||
sqlDump.append("\n");
|
||||
}
|
||||
|
||||
// Re-enable foreign key checks (PostgreSQL syntax)
|
||||
sqlDump.append("SET session_replication_role = DEFAULT;\n");
|
||||
}
|
||||
|
||||
byte[] backupData = sqlDump.toString().getBytes(StandardCharsets.UTF_8);
|
||||
return new ByteArrayResource(backupData);
|
||||
}
|
||||
|
||||
@Transactional
|
||||
public void restoreFromBackup(InputStream backupStream) throws IOException, SQLException {
|
||||
// Read the SQL file
|
||||
StringBuilder sqlContent = new StringBuilder();
|
||||
try (BufferedReader reader = new BufferedReader(new InputStreamReader(backupStream, StandardCharsets.UTF_8))) {
|
||||
String line;
|
||||
while ((line = reader.readLine()) != null) {
|
||||
// Skip comments and empty lines
|
||||
if (!line.trim().startsWith("--") && !line.trim().isEmpty()) {
|
||||
sqlContent.append(line).append("\n");
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Execute the SQL statements
|
||||
try (Connection connection = dataSource.getConnection()) {
|
||||
connection.setAutoCommit(false);
|
||||
|
||||
try {
|
||||
// Parse SQL statements properly (handle semicolons inside string literals)
|
||||
List<String> statements = parseStatements(sqlContent.toString());
|
||||
|
||||
int successCount = 0;
|
||||
for (String statement : statements) {
|
||||
String trimmedStatement = statement.trim();
|
||||
if (!trimmedStatement.isEmpty()) {
|
||||
try (PreparedStatement stmt = connection.prepareStatement(trimmedStatement)) {
|
||||
stmt.executeUpdate();
|
||||
successCount++;
|
||||
} catch (SQLException e) {
|
||||
// Log detailed error information for failed statements
|
||||
System.err.println("ERROR: Failed to execute SQL statement #" + (successCount + 1));
|
||||
System.err.println("Error: " + e.getMessage());
|
||||
System.err.println("SQL State: " + e.getSQLState());
|
||||
System.err.println("Error Code: " + e.getErrorCode());
|
||||
|
||||
// Show the problematic statement (first 500 chars)
|
||||
String statementPreview = trimmedStatement.length() > 500 ?
|
||||
trimmedStatement.substring(0, 500) + "..." : trimmedStatement;
|
||||
System.err.println("Statement: " + statementPreview);
|
||||
|
||||
throw e; // Re-throw to trigger rollback
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
connection.commit();
|
||||
System.err.println("Restore completed successfully. Executed " + successCount + " SQL statements.");
|
||||
|
||||
// Reindex search after successful restore
|
||||
try {
|
||||
System.err.println("Starting Typesense reindex after successful restore...");
|
||||
typesenseService.recreateStoriesCollection();
|
||||
typesenseService.recreateAuthorsCollection();
|
||||
// Note: Collections collection will be recreated when needed by the service
|
||||
System.err.println("Typesense reindex completed successfully.");
|
||||
} catch (Exception e) {
|
||||
// Log the error but don't fail the restore
|
||||
System.err.println("Warning: Failed to reindex Typesense after restore: " + e.getMessage());
|
||||
e.printStackTrace();
|
||||
}
|
||||
|
||||
} catch (SQLException e) {
|
||||
connection.rollback();
|
||||
throw e;
|
||||
} finally {
|
||||
connection.setAutoCommit(true);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@Transactional
|
||||
public int clearAllData() {
|
||||
int totalDeleted = 0;
|
||||
|
||||
try {
|
||||
// Count entities before deletion
|
||||
int collectionCount = (int) collectionRepository.count();
|
||||
int storyCount = (int) storyRepository.count();
|
||||
int authorCount = (int) authorRepository.count();
|
||||
int seriesCount = (int) seriesRepository.count();
|
||||
int tagCount = (int) tagRepository.count();
|
||||
|
||||
// Clean up reading positions first (to avoid foreign key constraint violations)
|
||||
readingPositionRepository.deleteAll();
|
||||
|
||||
// Delete main entities (cascade will handle junction tables)
|
||||
collectionRepository.deleteAll();
|
||||
storyRepository.deleteAll();
|
||||
authorRepository.deleteAll();
|
||||
seriesRepository.deleteAll();
|
||||
tagRepository.deleteAll();
|
||||
|
||||
totalDeleted = collectionCount + storyCount + authorCount + seriesCount + tagCount;
|
||||
|
||||
// Note: Search indexes will need to be manually recreated after clearing
|
||||
// Use the settings page to recreate Typesense collections after clearing the database
|
||||
|
||||
} catch (Exception e) {
|
||||
throw new RuntimeException("Failed to clear database: " + e.getMessage(), e);
|
||||
}
|
||||
|
||||
return totalDeleted;
|
||||
}
|
||||
|
||||
/**
|
||||
* Parses SQL content into individual statements, properly handling semicolons inside string literals
|
||||
*/
|
||||
private List<String> parseStatements(String sql) {
|
||||
List<String> statements = new ArrayList<>();
|
||||
StringBuilder currentStatement = new StringBuilder();
|
||||
boolean inString = false;
|
||||
|
||||
for (int i = 0; i < sql.length(); i++) {
|
||||
char c = sql.charAt(i);
|
||||
|
||||
if (c == '\'' && !inString) {
|
||||
// Start of string literal
|
||||
inString = true;
|
||||
currentStatement.append(c);
|
||||
} else if (c == '\'' && inString) {
|
||||
// Potential end of string literal
|
||||
currentStatement.append(c);
|
||||
|
||||
// Check if this is an escaped quote (doubled single quote)
|
||||
if (i + 1 < sql.length() && sql.charAt(i + 1) == '\'') {
|
||||
// This is an escaped quote, skip the next quote
|
||||
i++;
|
||||
currentStatement.append('\'');
|
||||
} else {
|
||||
// End of string literal
|
||||
inString = false;
|
||||
}
|
||||
} else if (c == ';' && !inString) {
|
||||
// Statement terminator outside of string literal
|
||||
String statement = currentStatement.toString().trim();
|
||||
if (!statement.isEmpty()) {
|
||||
statements.add(statement);
|
||||
}
|
||||
currentStatement = new StringBuilder();
|
||||
} else {
|
||||
currentStatement.append(c);
|
||||
}
|
||||
}
|
||||
|
||||
// Add final statement if any
|
||||
String finalStatement = currentStatement.toString().trim();
|
||||
if (!finalStatement.isEmpty()) {
|
||||
statements.add(finalStatement);
|
||||
}
|
||||
|
||||
return statements;
|
||||
}
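`parseStatements` exists because a naive `split(";")` would break INSERT statements whose string values themselves contain semicolons or escaped quotes. A brief illustration of the intended behaviour, assuming it is invoked from within the class (the method is private); the sample dump text is made up.

```java
// Illustration of why the parser tracks string literals: the semicolon inside the
// title value must not terminate the statement, and doubled quotes stay escaped.
String dump =
        "INSERT INTO \"stories\" (\"title\") VALUES ('One; Two');\n" +
        "INSERT INTO \"authors\" (\"name\") VALUES ('O''Brien');";

List<String> statements = parseStatements(dump);
// statements.size() == 2; the first statement still contains "One; Two",
// and 'O''Brien' is preserved with its doubled single quote intact.
```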
|
||||
|
||||
/**
|
||||
* Formats a database value for SQL insertion, handling proper escaping
|
||||
*/
|
||||
private String formatSqlValue(Object value) {
|
||||
if (value == null) {
|
||||
return "NULL";
|
||||
}
|
||||
|
||||
if (value instanceof Boolean) {
|
||||
return ((Boolean) value) ? "true" : "false";
|
||||
}
|
||||
|
||||
if (value instanceof Number) {
|
||||
return value.toString();
|
||||
}
|
||||
|
||||
// Handle all other types as strings (String, UUID, Timestamp, CLOB, TEXT, etc.)
|
||||
String stringValue;
|
||||
|
||||
// Special handling for CLOB types
|
||||
if (value instanceof Clob) {
|
||||
Clob clob = (Clob) value;
|
||||
try {
|
||||
stringValue = clob.getSubString(1, (int) clob.length());
|
||||
} catch (SQLException e) {
|
||||
stringValue = value.toString();
|
||||
}
|
||||
} else {
|
||||
stringValue = value.toString();
|
||||
}
|
||||
|
||||
// Escape single quotes by replacing ' with '' and wrap in quotes
|
||||
String escapedValue = stringValue.replace("'", "''");
|
||||
|
||||
return "'" + escapedValue + "'";
|
||||
}
|
||||
|
||||
/**
|
||||
* Clear all data AND files (for complete restore)
|
||||
*/
|
||||
@Transactional
|
||||
public int clearAllDataAndFiles() {
|
||||
// First clear the database
|
||||
int totalDeleted = clearAllData();
|
||||
|
||||
// Then clear all uploaded files
|
||||
clearAllFiles();
|
||||
|
||||
// Clear search indexes
|
||||
clearSearchIndexes();
|
||||
|
||||
return totalDeleted;
|
||||
}
|
||||
|
||||
/**
|
||||
* Clear all uploaded files
|
||||
*/
|
||||
private void clearAllFiles() {
|
||||
Path imagesPath = Paths.get(uploadDir);
|
||||
|
||||
if (Files.exists(imagesPath)) {
|
||||
try {
|
||||
Files.walk(imagesPath)
|
||||
.filter(Files::isRegularFile)
|
||||
.forEach(filePath -> {
|
||||
try {
|
||||
Files.deleteIfExists(filePath);
|
||||
} catch (IOException e) {
|
||||
System.err.println("Warning: Failed to delete file: " + filePath + " - " + e.getMessage());
|
||||
}
|
||||
});
|
||||
} catch (IOException e) {
|
||||
System.err.println("Warning: Failed to clear files directory: " + e.getMessage());
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Clear search indexes
|
||||
*/
|
||||
private void clearSearchIndexes() {
|
||||
try {
|
||||
System.err.println("Clearing search indexes after complete clear...");
|
||||
typesenseService.recreateStoriesCollection();
|
||||
typesenseService.recreateAuthorsCollection();
|
||||
// Note: Collections collection will be recreated when needed by the service
|
||||
System.err.println("Search indexes cleared successfully.");
|
||||
} catch (Exception e) {
|
||||
// Log the error but don't fail the clear operation
|
||||
System.err.println("Warning: Failed to clear search indexes: " + e.getMessage());
|
||||
e.printStackTrace();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Add database dump to ZIP archive
|
||||
*/
|
||||
private void addDatabaseDumpToZip(ZipOutputStream zipOut) throws SQLException, IOException {
|
||||
Resource sqlBackup = createBackup();
|
||||
|
||||
ZipEntry sqlEntry = new ZipEntry("database.sql");
|
||||
zipOut.putNextEntry(sqlEntry);
|
||||
|
||||
try (InputStream sqlStream = sqlBackup.getInputStream()) {
|
||||
byte[] buffer = new byte[8192];
|
||||
int bytesRead;
|
||||
while ((bytesRead = sqlStream.read(buffer)) != -1) {
|
||||
zipOut.write(buffer, 0, bytesRead);
|
||||
}
|
||||
}
|
||||
|
||||
zipOut.closeEntry();
|
||||
}
|
||||
|
||||
/**
|
||||
* Add all files to ZIP archive
|
||||
*/
|
||||
private void addFilesToZip(ZipOutputStream zipOut) throws IOException {
|
||||
Path imagesPath = Paths.get(uploadDir);
|
||||
|
||||
if (!Files.exists(imagesPath)) {
|
||||
return;
|
||||
}
|
||||
|
||||
Files.walk(imagesPath)
|
||||
.filter(Files::isRegularFile)
|
||||
.forEach(filePath -> {
|
||||
try {
|
||||
Path relativePath = imagesPath.relativize(filePath);
|
||||
String zipEntryName = "files/" + relativePath.toString().replace('\\', '/');
|
||||
|
||||
ZipEntry entry = new ZipEntry(zipEntryName);
|
||||
zipOut.putNextEntry(entry);
|
||||
Files.copy(filePath, zipOut);
|
||||
zipOut.closeEntry();
|
||||
} catch (IOException e) {
|
||||
throw new RuntimeException("Failed to add file to backup: " + filePath, e);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Add metadata to ZIP archive
|
||||
*/
|
||||
private void addMetadataToZip(ZipOutputStream zipOut) throws IOException, SQLException {
|
||||
Map<String, Object> metadata = new HashMap<>();
|
||||
metadata.put("version", "1.0");
|
||||
metadata.put("format", "storycove-complete-backup");
|
||||
metadata.put("timestamp", LocalDateTime.now().format(DateTimeFormatter.ISO_LOCAL_DATE_TIME));
|
||||
metadata.put("generator", "StoryCove Database Management Service");
|
||||
|
||||
// Add statistics
|
||||
Map<String, Object> stats = new HashMap<>();
|
||||
try (Connection connection = dataSource.getConnection()) {
|
||||
stats.put("stories", getTableCount(connection, "stories"));
|
||||
stats.put("authors", getTableCount(connection, "authors"));
|
||||
stats.put("collections", getTableCount(connection, "collections"));
|
||||
stats.put("tags", getTableCount(connection, "tags"));
|
||||
stats.put("series", getTableCount(connection, "series"));
|
||||
}
|
||||
metadata.put("statistics", stats);
|
||||
|
||||
// Count files
|
||||
Path imagesPath = Paths.get(uploadDir);
|
||||
int fileCount = 0;
|
||||
if (Files.exists(imagesPath)) {
|
||||
fileCount = (int) Files.walk(imagesPath).filter(Files::isRegularFile).count();
|
||||
}
|
||||
metadata.put("fileCount", fileCount);
|
||||
|
||||
ObjectMapper mapper = new ObjectMapper();
|
||||
String metadataJson = mapper.writeValueAsString(metadata);
|
||||
|
||||
ZipEntry metadataEntry = new ZipEntry("metadata.json");
|
||||
zipOut.putNextEntry(metadataEntry);
|
||||
zipOut.write(metadataJson.getBytes(StandardCharsets.UTF_8));
|
||||
zipOut.closeEntry();
|
||||
}
|
||||
|
||||
/**
|
||||
* Extract ZIP archive to directory
|
||||
*/
|
||||
private void extractZipArchive(InputStream zipStream, Path targetDir) throws IOException {
|
||||
try (ZipInputStream zis = new ZipInputStream(zipStream)) {
|
||||
ZipEntry entry;
|
||||
while ((entry = zis.getNextEntry()) != null) {
|
||||
Path entryPath = targetDir.resolve(entry.getName());
|
||||
|
||||
// Security check: ensure the entry path is within the target directory
|
||||
if (!entryPath.normalize().startsWith(targetDir.normalize())) {
|
||||
throw new IOException("ZIP entry is outside of target directory: " + entry.getName());
|
||||
}
|
||||
|
||||
if (entry.isDirectory()) {
|
||||
Files.createDirectories(entryPath);
|
||||
} else {
|
||||
Files.createDirectories(entryPath.getParent());
|
||||
Files.copy(zis, entryPath, StandardCopyOption.REPLACE_EXISTING);
|
||||
}
|
||||
|
||||
zis.closeEntry();
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Validate backup structure
|
||||
*/
|
||||
private void validateBackupStructure(Path backupDir) throws IOException {
|
||||
Path metadataFile = backupDir.resolve("metadata.json");
|
||||
Path databaseFile = backupDir.resolve("database.sql");
|
||||
|
||||
if (!Files.exists(metadataFile)) {
|
||||
throw new IOException("Invalid backup: metadata.json not found");
|
||||
}
|
||||
|
||||
if (!Files.exists(databaseFile)) {
|
||||
throw new IOException("Invalid backup: database.sql not found");
|
||||
}
|
||||
|
||||
// Validate metadata
|
||||
try {
|
||||
ObjectMapper mapper = new ObjectMapper();
|
||||
Map<String, Object> metadata = mapper.readValue(Files.newInputStream(metadataFile), Map.class);
|
||||
|
||||
String format = (String) metadata.get("format");
|
||||
if (!"storycove-complete-backup".equals(format)) {
|
||||
throw new IOException("Invalid backup format: " + format);
|
||||
}
|
||||
|
||||
String version = (String) metadata.get("version");
|
||||
if (!"1.0".equals(version)) {
|
||||
throw new IOException("Unsupported backup version: " + version);
|
||||
}
|
||||
|
||||
} catch (Exception e) {
|
||||
throw new IOException("Failed to validate backup metadata: " + e.getMessage(), e);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Restore files from backup
|
||||
*/
|
||||
private void restoreFiles(Path filesDir) throws IOException {
|
||||
Path targetDir = Paths.get(uploadDir);
|
||||
Files.createDirectories(targetDir);
|
||||
|
||||
Files.walk(filesDir)
|
||||
.filter(Files::isRegularFile)
|
||||
.forEach(sourceFile -> {
|
||||
try {
|
||||
Path relativePath = filesDir.relativize(sourceFile);
|
||||
Path targetFile = targetDir.resolve(relativePath);
|
||||
|
||||
Files.createDirectories(targetFile.getParent());
|
||||
Files.copy(sourceFile, targetFile, StandardCopyOption.REPLACE_EXISTING);
|
||||
} catch (IOException e) {
|
||||
throw new RuntimeException("Failed to restore file: " + sourceFile, e);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Delete directory recursively
|
||||
*/
|
||||
private void deleteDirectory(Path directory) throws IOException {
|
||||
if (Files.exists(directory)) {
|
||||
Files.walk(directory)
|
||||
.sorted(Comparator.reverseOrder())
|
||||
.forEach(path -> {
|
||||
try {
|
||||
Files.delete(path);
|
||||
} catch (IOException e) {
|
||||
System.err.println("Warning: Failed to delete temp file: " + path);
|
||||
}
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Get count of records in a table
|
||||
*/
|
||||
private int getTableCount(Connection connection, String tableName) throws SQLException {
|
||||
try (PreparedStatement stmt = connection.prepareStatement("SELECT COUNT(*) FROM \"" + tableName + "\"");
|
||||
ResultSet rs = stmt.executeQuery()) {
|
||||
if (rs.next()) {
|
||||
return rs.getInt(1);
|
||||
}
|
||||
return 0;
|
||||
}
|
||||
}
|
||||
}
|
||||
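Putting the pieces above together, `createCompleteBackup()` returns a ZIP containing `database.sql`, a `files/` tree mirroring the image upload directory, and `metadata.json`. A hedged sketch of saving that archive from calling code; the service variable and target path are illustrative, not part of the changeset.

```java
// Illustrative caller (not part of the changeset): persist the complete backup to disk.
Resource backup = databaseManagementService.createCompleteBackup();
Path target = Paths.get("/tmp/storycove-backup.zip"); // illustrative path
try (InputStream in = backup.getInputStream()) {
    Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
}
// The archive contains database.sql, files/** and metadata.json as built above.
```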
@@ -0,0 +1,585 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.dto.EPUBExportRequest;
|
||||
import com.storycove.entity.Collection;
|
||||
import com.storycove.entity.ReadingPosition;
|
||||
import com.storycove.entity.Story;
|
||||
import com.storycove.repository.ReadingPositionRepository;
|
||||
import com.storycove.service.exception.ResourceNotFoundException;
|
||||
|
||||
import nl.siegmann.epublib.domain.*;
|
||||
import nl.siegmann.epublib.epub.EpubWriter;
|
||||
|
||||
import org.jsoup.Jsoup;
|
||||
import org.jsoup.nodes.Document;
|
||||
import org.jsoup.nodes.Element;
|
||||
import org.jsoup.select.Elements;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.core.io.ByteArrayResource;
|
||||
import org.springframework.core.io.Resource;
|
||||
import org.springframework.stereotype.Service;
|
||||
import org.springframework.transaction.annotation.Transactional;
|
||||
|
||||
import java.io.ByteArrayOutputStream;
|
||||
import java.io.FileInputStream;
|
||||
import java.io.IOException;
|
||||
import java.nio.file.Files;
|
||||
import java.nio.file.Path;
|
||||
import java.nio.file.Paths;
|
||||
import java.time.LocalDateTime;
|
||||
import java.time.format.DateTimeFormatter;
|
||||
import java.util.ArrayList;
|
||||
import java.util.List;
|
||||
import java.util.Optional;
|
||||
import java.util.UUID;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
@Service
|
||||
@Transactional
|
||||
public class EPUBExportService {
|
||||
|
||||
private final StoryService storyService;
|
||||
private final ReadingPositionRepository readingPositionRepository;
|
||||
private final CollectionService collectionService;
|
||||
|
||||
@Autowired
|
||||
public EPUBExportService(StoryService storyService,
|
||||
ReadingPositionRepository readingPositionRepository,
|
||||
CollectionService collectionService) {
|
||||
this.storyService = storyService;
|
||||
this.readingPositionRepository = readingPositionRepository;
|
||||
this.collectionService = collectionService;
|
||||
}
|
||||
|
||||
public Resource exportStoryAsEPUB(EPUBExportRequest request) throws IOException {
|
||||
Story story = storyService.findById(request.getStoryId());
|
||||
|
||||
Book book = createEPUBBook(story, request);
|
||||
|
||||
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
|
||||
EpubWriter epubWriter = new EpubWriter();
|
||||
epubWriter.write(book, outputStream);
|
||||
|
||||
return new ByteArrayResource(outputStream.toByteArray());
|
||||
}
|
||||
|
||||
public Resource exportCollectionAsEPUB(UUID collectionId, EPUBExportRequest request) throws IOException {
|
||||
Collection collection = collectionService.findById(collectionId);
|
||||
List<Story> stories = collection.getCollectionStories().stream()
|
||||
.sorted((cs1, cs2) -> Integer.compare(cs1.getPosition(), cs2.getPosition()))
|
||||
.map(cs -> cs.getStory())
|
||||
.collect(Collectors.toList());
|
||||
|
||||
if (stories.isEmpty()) {
|
||||
throw new ResourceNotFoundException("Collection contains no stories to export");
|
||||
}
|
||||
|
||||
Book book = createCollectionEPUBBook(collection, stories, request);
|
||||
|
||||
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
|
||||
EpubWriter epubWriter = new EpubWriter();
|
||||
epubWriter.write(book, outputStream);
|
||||
|
||||
return new ByteArrayResource(outputStream.toByteArray());
|
||||
}
|
||||
|
||||
private Book createEPUBBook(Story story, EPUBExportRequest request) throws IOException {
|
||||
Book book = new Book();
|
||||
|
||||
setupMetadata(book, story, request);
|
||||
|
||||
addCoverImage(book, story, request);
|
||||
|
||||
addContent(book, story, request);
|
||||
|
||||
addReadingPosition(book, story, request);
|
||||
|
||||
return book;
|
||||
}
|
||||
|
||||
private Book createCollectionEPUBBook(Collection collection, List<Story> stories, EPUBExportRequest request) throws IOException {
|
||||
Book book = new Book();
|
||||
|
||||
setupCollectionMetadata(book, collection, stories, request);
|
||||
|
||||
addCollectionCoverImage(book, collection, request);
|
||||
|
||||
addCollectionContent(book, stories, request);
|
||||
|
||||
return book;
|
||||
}
|
||||
|
||||
private void setupMetadata(Book book, Story story, EPUBExportRequest request) {
|
||||
Metadata metadata = book.getMetadata();
|
||||
|
||||
String title = request.getCustomTitle() != null ?
|
||||
request.getCustomTitle() : story.getTitle();
|
||||
metadata.addTitle(title);
|
||||
|
||||
String authorName = request.getCustomAuthor() != null ?
|
||||
request.getCustomAuthor() :
|
||||
(story.getAuthor() != null ? story.getAuthor().getName() : "Unknown Author");
|
||||
metadata.addAuthor(new Author(authorName));
|
||||
|
||||
metadata.setLanguage(request.getLanguage() != null ? request.getLanguage() : "en");
|
||||
|
||||
metadata.addIdentifier(new Identifier("storycove", story.getId().toString()));
|
||||
|
||||
if (story.getDescription() != null) {
|
||||
metadata.addDescription(story.getDescription());
|
||||
}
|
||||
|
||||
if (request.getIncludeMetadata()) {
|
||||
metadata.addDate(new Date(java.util.Date.from(
|
||||
story.getCreatedAt().atZone(java.time.ZoneId.systemDefault()).toInstant()
|
||||
), Date.Event.CREATION));
|
||||
|
||||
if (story.getSeries() != null) {
|
||||
// Add series and metadata info to description instead of using addMeta
|
||||
StringBuilder description = new StringBuilder();
|
||||
if (story.getDescription() != null) {
|
||||
description.append(story.getDescription()).append("\n\n");
|
||||
}
|
||||
|
||||
description.append("Series: ").append(story.getSeries().getName());
|
||||
if (story.getVolume() != null) {
|
||||
description.append(" (Volume ").append(story.getVolume()).append(")");
|
||||
}
|
||||
description.append("\n");
|
||||
|
||||
if (story.getWordCount() != null) {
|
||||
description.append("Word Count: ").append(story.getWordCount()).append("\n");
|
||||
}
|
||||
|
||||
if (story.getRating() != null) {
|
||||
description.append("Rating: ").append(story.getRating()).append("/5\n");
|
||||
}
|
||||
|
||||
if (!story.getTags().isEmpty()) {
|
||||
String tags = story.getTags().stream()
|
||||
.map(tag -> tag.getName())
|
||||
.reduce((a, b) -> a + ", " + b)
|
||||
.orElse("");
|
||||
description.append("Tags: ").append(tags).append("\n");
|
||||
}
|
||||
|
||||
description.append("\nGenerated by StoryCove on ")
|
||||
.append(LocalDateTime.now().format(DateTimeFormatter.ISO_LOCAL_DATE_TIME));
|
||||
|
||||
metadata.addDescription(description.toString());
|
||||
}
|
||||
}
|
||||
|
||||
if (request.getCustomMetadata() != null && !request.getCustomMetadata().isEmpty()) {
|
||||
// Add custom metadata to description since addMeta doesn't exist
|
||||
StringBuilder customDesc = new StringBuilder();
|
||||
for (String customMeta : request.getCustomMetadata()) {
|
||||
String[] parts = customMeta.split(":", 2);
|
||||
if (parts.length == 2) {
|
||||
customDesc.append(parts[0].trim()).append(": ").append(parts[1].trim()).append("\n");
|
||||
}
|
||||
}
|
||||
if (customDesc.length() > 0) {
|
||||
String existingDesc = metadata.getDescriptions().isEmpty() ? "" : metadata.getDescriptions().get(0);
|
||||
metadata.addDescription(existingDesc + "\n" + customDesc.toString());
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private void addCoverImage(Book book, Story story, EPUBExportRequest request) {
|
||||
if (!request.getIncludeCoverImage() || story.getCoverPath() == null) {
|
||||
return;
|
||||
}
|
||||
|
||||
try {
|
||||
Path coverPath = Paths.get(story.getCoverPath());
|
||||
if (Files.exists(coverPath)) {
|
||||
byte[] coverImageData = Files.readAllBytes(coverPath);
|
||||
String mimeType = Files.probeContentType(coverPath);
|
||||
if (mimeType == null) {
|
||||
mimeType = "image/jpeg";
|
||||
}
|
||||
|
||||
nl.siegmann.epublib.domain.Resource coverResource =
|
||||
new nl.siegmann.epublib.domain.Resource(coverImageData, "cover.jpg");
|
||||
|
||||
book.setCoverImage(coverResource);
|
||||
}
|
||||
} catch (IOException e) {
|
||||
// Skip cover image on error
|
||||
}
|
||||
}
|
||||
|
||||
private void addContent(Book book, Story story, EPUBExportRequest request) {
|
||||
String content = story.getContentHtml();
|
||||
if (content == null) {
|
||||
content = story.getContentPlain() != null ?
|
||||
"<p>" + story.getContentPlain().replace("\n", "</p><p>") + "</p>" :
|
||||
"<p>No content available</p>";
|
||||
}
|
||||
|
||||
if (request.getSplitByChapters()) {
|
||||
addChapterizedContent(book, content, request);
|
||||
} else {
|
||||
addSingleChapterContent(book, content, story);
|
||||
}
|
||||
}
|
||||
|
||||
private void addSingleChapterContent(Book book, String content, Story story) {
|
||||
String html = createChapterHTML(story.getTitle(), content);
|
||||
|
||||
nl.siegmann.epublib.domain.Resource chapterResource =
|
||||
new nl.siegmann.epublib.domain.Resource(html.getBytes(), "chapter.html");
|
||||
|
||||
book.addSection(story.getTitle(), chapterResource);
|
||||
}
|
||||
|
||||
private void addChapterizedContent(Book book, String content, EPUBExportRequest request) {
|
||||
Document doc = Jsoup.parse(content);
|
||||
Elements chapters = doc.select("div.chapter, h1, h2, h3");
|
||||
|
||||
if (chapters.isEmpty()) {
|
||||
List<String> paragraphs = splitByWords(content,
|
||||
request.getMaxWordsPerChapter() != null ? request.getMaxWordsPerChapter() : 2000);
|
||||
|
||||
for (int i = 0; i < paragraphs.size(); i++) {
|
||||
String chapterTitle = "Chapter " + (i + 1);
|
||||
String html = createChapterHTML(chapterTitle, paragraphs.get(i));
|
||||
|
||||
nl.siegmann.epublib.domain.Resource chapterResource =
|
||||
new nl.siegmann.epublib.domain.Resource(html.getBytes(), "chapter" + (i + 1) + ".html");
|
||||
|
||||
book.addSection(chapterTitle, chapterResource);
|
||||
}
|
||||
} else {
|
||||
for (int i = 0; i < chapters.size(); i++) {
|
||||
Element chapter = chapters.get(i);
|
||||
String chapterTitle = chapter.text();
|
||||
if (chapterTitle.trim().isEmpty()) {
|
||||
chapterTitle = "Chapter " + (i + 1);
|
||||
}
|
||||
|
||||
String chapterContent = chapter.html();
|
||||
String html = createChapterHTML(chapterTitle, chapterContent);
|
||||
|
||||
nl.siegmann.epublib.domain.Resource chapterResource =
|
||||
new nl.siegmann.epublib.domain.Resource(html.getBytes(), "chapter" + (i + 1) + ".html");
|
||||
|
||||
book.addSection(chapterTitle, chapterResource);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private List<String> splitByWords(String content, int maxWordsPerChapter) {
|
||||
String[] words = content.split("\\s+");
|
||||
List<String> chapters = new ArrayList<>();
|
||||
StringBuilder currentChapter = new StringBuilder();
|
||||
int wordCount = 0;
|
||||
|
||||
for (String word : words) {
|
||||
currentChapter.append(word).append(" ");
|
||||
wordCount++;
|
||||
|
||||
if (wordCount >= maxWordsPerChapter) {
|
||||
chapters.add(currentChapter.toString().trim());
|
||||
currentChapter = new StringBuilder();
|
||||
wordCount = 0;
|
||||
}
|
||||
}
|
||||
|
||||
if (currentChapter.length() > 0) {
|
||||
chapters.add(currentChapter.toString().trim());
|
||||
}
|
||||
|
||||
return chapters;
|
||||
}
|
||||
|
||||
private String createChapterHTML(String title, String content) {
|
||||
return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>" +
|
||||
"<!DOCTYPE html PUBLIC \"-//W3C//DTD XHTML 1.1//EN\" " +
|
||||
"\"http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd\">" +
|
||||
"<html xmlns=\"http://www.w3.org/1999/xhtml\">" +
|
||||
"<head>" +
|
||||
"<title>" + escapeHtml(title) + "</title>" +
|
||||
"<style type=\"text/css\">" +
|
||||
"body { font-family: serif; margin: 1em; }" +
|
||||
"h1 { text-align: center; }" +
|
||||
"p { text-indent: 1em; margin: 0.5em 0; }" +
|
||||
"</style>" +
|
||||
"</head>" +
|
||||
"<body>" +
|
||||
"<h1>" + escapeHtml(title) + "</h1>" +
|
||||
fixHtmlForXhtml(content) +
|
||||
"</body>" +
|
||||
"</html>";
|
||||
}
|
||||
|
||||
private void addReadingPosition(Book book, Story story, EPUBExportRequest request) {
|
||||
if (!request.getIncludeReadingPosition()) {
|
||||
return;
|
||||
}
|
||||
|
||||
Optional<ReadingPosition> positionOpt = readingPositionRepository.findByStoryId(story.getId());
|
||||
if (positionOpt.isPresent()) {
|
||||
ReadingPosition position = positionOpt.get();
|
||||
Metadata metadata = book.getMetadata();
|
||||
|
||||
// Add reading position to description since addMeta doesn't exist
|
||||
StringBuilder positionDesc = new StringBuilder();
|
||||
if (position.getEpubCfi() != null) {
|
||||
positionDesc.append("EPUB CFI: ").append(position.getEpubCfi()).append("\n");
|
||||
}
|
||||
|
||||
if (position.getChapterIndex() != null && position.getWordPosition() != null) {
|
||||
positionDesc.append("Reading Position: Chapter ")
|
||||
.append(position.getChapterIndex())
|
||||
.append(", Word ").append(position.getWordPosition()).append("\n");
|
||||
}
|
||||
|
||||
if (position.getPercentageComplete() != null) {
|
||||
positionDesc.append("Reading Progress: ")
|
||||
.append(String.format("%.1f%%", position.getPercentageComplete())).append("\n");
|
||||
}
|
||||
|
||||
positionDesc.append("Last Read: ")
|
||||
.append(position.getUpdatedAt().format(DateTimeFormatter.ISO_LOCAL_DATE_TIME));
|
||||
|
||||
String existingDesc = metadata.getDescriptions().isEmpty() ? "" : metadata.getDescriptions().get(0);
|
||||
metadata.addDescription(existingDesc + "\n\n--- Reading Position ---\n" + positionDesc.toString());
|
||||
}
|
||||
}
|
||||
|
||||
private String fixHtmlForXhtml(String html) {
|
||||
if (html == null) return "";
|
||||
|
||||
// Fix common XHTML validation issues
|
||||
String fixed = html
|
||||
// Fix self-closing tags to be XHTML compliant
|
||||
.replaceAll("<br>", "<br />")
|
||||
.replaceAll("<hr>", "<hr />")
|
||||
.replaceAll("<img([^>]*)>", "<img$1 />")
|
||||
.replaceAll("<input([^>]*)>", "<input$1 />")
|
||||
.replaceAll("<area([^>]*)>", "<area$1 />")
|
||||
.replaceAll("<base([^>]*)>", "<base$1 />")
|
||||
.replaceAll("<col([^>]*)>", "<col$1 />")
|
||||
.replaceAll("<embed([^>]*)>", "<embed$1 />")
|
||||
.replaceAll("<link([^>]*)>", "<link$1 />")
|
||||
.replaceAll("<meta([^>]*)>", "<meta$1 />")
|
||||
.replaceAll("<param([^>]*)>", "<param$1 />")
|
||||
.replaceAll("<source([^>]*)>", "<source$1 />")
|
||||
.replaceAll("<track([^>]*)>", "<track$1 />")
|
||||
.replaceAll("<wbr([^>]*)>", "<wbr$1 />");
|
||||
|
||||
return fixed;
|
||||
}
|
||||
|
||||
    private String escapeHtml(String text) {
        if (text == null) return "";
        return text.replace("&", "&amp;")
                   .replace("<", "&lt;")
                   .replace(">", "&gt;")
                   .replace("\"", "&quot;")
                   .replace("'", "&#39;");
    }
|
||||
|
||||
public String getEPUBFilename(Story story) {
|
||||
StringBuilder filename = new StringBuilder();
|
||||
|
||||
if (story.getAuthor() != null) {
|
||||
filename.append(sanitizeFilename(story.getAuthor().getName()))
|
||||
.append(" - ");
|
||||
}
|
||||
|
||||
filename.append(sanitizeFilename(story.getTitle()));
|
||||
|
||||
if (story.getSeries() != null && story.getVolume() != null) {
|
||||
filename.append(" (")
|
||||
.append(sanitizeFilename(story.getSeries().getName()))
|
||||
.append(" ")
|
||||
.append(story.getVolume())
|
||||
.append(")");
|
||||
}
|
||||
|
||||
filename.append(".epub");
|
||||
|
||||
return filename.toString();
|
||||
}
|
||||
|
||||
private String sanitizeFilename(String filename) {
|
||||
if (filename == null) return "unknown";
|
||||
return filename.replaceAll("[^a-zA-Z0-9._\\- ]", "")
|
||||
.trim()
|
||||
.replaceAll("\\s+", "_");
|
||||
}
|
||||
|
||||
private void setupCollectionMetadata(Book book, Collection collection, List<Story> stories, EPUBExportRequest request) {
|
||||
Metadata metadata = book.getMetadata();
|
||||
|
||||
String title = request.getCustomTitle() != null ?
|
||||
request.getCustomTitle() : collection.getName();
|
||||
metadata.addTitle(title);
|
||||
|
||||
// Use collection creator as author, or combine story authors
|
||||
String authorName = "Collection";
|
||||
if (stories.size() == 1) {
|
||||
Story story = stories.get(0);
|
||||
authorName = story.getAuthor() != null ? story.getAuthor().getName() : "Unknown Author";
|
||||
} else {
|
||||
// For multiple stories, use "Various Authors" or collection name
|
||||
authorName = "Various Authors";
|
||||
}
|
||||
|
||||
if (request.getCustomAuthor() != null) {
|
||||
authorName = request.getCustomAuthor();
|
||||
}
|
||||
|
||||
metadata.addAuthor(new Author(authorName));
|
||||
metadata.setLanguage(request.getLanguage() != null ? request.getLanguage() : "en");
|
||||
metadata.addIdentifier(new Identifier("storycove-collection", collection.getId().toString()));
|
||||
|
||||
// Create description from collection description and story list
|
||||
StringBuilder description = new StringBuilder();
|
||||
if (collection.getDescription() != null && !collection.getDescription().trim().isEmpty()) {
|
||||
description.append(collection.getDescription()).append("\n\n");
|
||||
}
|
||||
|
||||
description.append("This collection contains ").append(stories.size()).append(" stories:\n");
|
||||
for (int i = 0; i < stories.size() && i < 10; i++) {
|
||||
Story story = stories.get(i);
|
||||
description.append((i + 1)).append(". ").append(story.getTitle());
|
||||
if (story.getAuthor() != null) {
|
||||
description.append(" by ").append(story.getAuthor().getName());
|
||||
}
|
||||
description.append("\n");
|
||||
}
|
||||
if (stories.size() > 10) {
|
||||
description.append("... and ").append(stories.size() - 10).append(" more stories.");
|
||||
}
|
||||
|
||||
metadata.addDescription(description.toString());
|
||||
|
||||
if (request.getIncludeMetadata()) {
|
||||
metadata.addDate(new Date(java.util.Date.from(
|
||||
collection.getCreatedAt().atZone(java.time.ZoneId.systemDefault()).toInstant()
|
||||
), Date.Event.CREATION));
|
||||
|
||||
// Add collection statistics to description
|
||||
int totalWordCount = stories.stream().mapToInt(s -> s.getWordCount() != null ? s.getWordCount() : 0).sum();
|
||||
description.append("\n\nTotal Word Count: ").append(totalWordCount);
|
||||
description.append("\nGenerated by StoryCove on ")
|
||||
.append(LocalDateTime.now().format(DateTimeFormatter.ISO_LOCAL_DATE_TIME));
|
||||
|
||||
metadata.addDescription(description.toString());
|
||||
}
|
||||
}
|
||||
|
||||
private void addCollectionCoverImage(Book book, Collection collection, EPUBExportRequest request) {
|
||||
if (!request.getIncludeCoverImage()) {
|
||||
return;
|
||||
}
|
||||
|
||||
try {
|
||||
// Try to use collection cover first
|
||||
if (collection.getCoverImagePath() != null) {
|
||||
Path coverPath = Paths.get(collection.getCoverImagePath());
|
||||
if (Files.exists(coverPath)) {
|
||||
byte[] coverImageData = Files.readAllBytes(coverPath);
|
||||
String mimeType = Files.probeContentType(coverPath);
|
||||
if (mimeType == null) {
|
||||
mimeType = "image/jpeg";
|
||||
}
|
||||
|
||||
nl.siegmann.epublib.domain.Resource coverResource =
|
||||
new nl.siegmann.epublib.domain.Resource(coverImageData, "collection-cover.jpg");
|
||||
|
||||
book.setCoverImage(coverResource);
|
||||
return;
|
||||
}
|
||||
}
|
||||
|
||||
// TODO: Could generate a composite cover from story covers
|
||||
// For now, skip cover if collection doesn't have one
|
||||
|
||||
} catch (IOException e) {
|
||||
// Skip cover image on error
|
||||
}
|
||||
}
|
||||
|
||||
private void addCollectionContent(Book book, List<Story> stories, EPUBExportRequest request) {
|
||||
// Create table of contents chapter
|
||||
StringBuilder tocContent = new StringBuilder();
|
||||
tocContent.append("<h1>Table of Contents</h1>\n<ul>\n");
|
||||
|
||||
for (int i = 0; i < stories.size(); i++) {
|
||||
Story story = stories.get(i);
|
||||
tocContent.append("<li><a href=\"#story").append(i + 1).append("\">")
|
||||
.append(escapeHtml(story.getTitle()));
|
||||
if (story.getAuthor() != null) {
|
||||
tocContent.append(" by ").append(escapeHtml(story.getAuthor().getName()));
|
||||
}
|
||||
tocContent.append("</a></li>\n");
|
||||
}
|
||||
|
||||
tocContent.append("</ul>\n");
|
||||
|
||||
String tocHtml = createChapterHTML("Table of Contents", tocContent.toString());
|
||||
nl.siegmann.epublib.domain.Resource tocResource =
|
||||
new nl.siegmann.epublib.domain.Resource(tocHtml.getBytes(), "toc.html");
|
||||
book.addSection("Table of Contents", tocResource);
|
||||
|
||||
// Add each story as a chapter
|
||||
for (int i = 0; i < stories.size(); i++) {
|
||||
Story story = stories.get(i);
|
||||
String storyContent = story.getContentHtml();
|
||||
|
||||
if (storyContent == null) {
|
||||
storyContent = story.getContentPlain() != null ?
|
||||
"<p>" + story.getContentPlain().replace("\n", "</p><p>") + "</p>" :
|
||||
"<p>No content available</p>";
|
||||
}
|
||||
|
||||
// Add story metadata header
|
||||
StringBuilder storyHtml = new StringBuilder();
|
||||
storyHtml.append("<div id=\"story").append(i + 1).append("\">\n");
|
||||
storyHtml.append("<h1>").append(escapeHtml(story.getTitle())).append("</h1>\n");
|
||||
if (story.getAuthor() != null) {
|
||||
storyHtml.append("<p><em>by ").append(escapeHtml(story.getAuthor().getName())).append("</em></p>\n");
|
||||
}
|
||||
if (story.getDescription() != null && !story.getDescription().trim().isEmpty()) {
|
||||
storyHtml.append("<div class=\"summary\">\n")
|
||||
.append("<p>").append(escapeHtml(story.getDescription())).append("</p>\n")
|
||||
.append("</div>\n");
|
||||
}
|
||||
storyHtml.append("<hr />\n");
|
||||
storyHtml.append(fixHtmlForXhtml(storyContent));
|
||||
storyHtml.append("</div>\n");
|
||||
|
||||
String chapterTitle = story.getTitle();
|
||||
if (story.getAuthor() != null) {
|
||||
chapterTitle += " by " + story.getAuthor().getName();
|
||||
}
|
||||
|
||||
String html = createChapterHTML(chapterTitle, storyHtml.toString());
|
||||
nl.siegmann.epublib.domain.Resource storyResource =
|
||||
new nl.siegmann.epublib.domain.Resource(html.getBytes(), "story" + (i + 1) + ".html");
|
||||
|
||||
book.addSection(chapterTitle, storyResource);
|
||||
}
|
||||
}
|
||||
|
||||
public boolean canExportStory(UUID storyId) {
|
||||
try {
|
||||
Story story = storyService.findById(storyId);
|
||||
return story.getContentHtml() != null || story.getContentPlain() != null;
|
||||
} catch (ResourceNotFoundException e) {
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
public String getCollectionEPUBFilename(Collection collection) {
|
||||
StringBuilder filename = new StringBuilder();
|
||||
filename.append(sanitizeFilename(collection.getName()));
|
||||
filename.append("_collection.epub");
|
||||
return filename.toString();
|
||||
}
|
||||
}
|
||||
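A typical way to expose `exportStoryAsEPUB` over HTTP is to stream the returned `Resource` with a filename taken from `getEPUBFilename`. The controller below is a sketch only; the route, class name, and the `setStoryId` setter are assumptions, not the endpoints actually added by this PR.

```java
// Illustrative controller sketch (route and class name are assumptions, not the PR's API).
@RestController
@RequestMapping("/api/stories")
public class EpubExportControllerSketch {

    private final EPUBExportService epubExportService;
    private final StoryService storyService;

    public EpubExportControllerSketch(EPUBExportService epubExportService, StoryService storyService) {
        this.epubExportService = epubExportService;
        this.storyService = storyService;
    }

    @PostMapping("/{id}/export/epub")
    public ResponseEntity<Resource> export(@PathVariable UUID id,
                                           @RequestBody EPUBExportRequest request) throws IOException {
        request.setStoryId(id); // assumed setter matching the getter the service reads
        Resource epub = epubExportService.exportStoryAsEPUB(request);
        String filename = epubExportService.getEPUBFilename(storyService.findById(id));
        return ResponseEntity.ok()
                .header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"" + filename + "\"")
                .contentType(MediaType.parseMediaType("application/epub+zip"))
                .body(epub);
    }
}
```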
@@ -0,0 +1,522 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.dto.EPUBImportRequest;
|
||||
import com.storycove.dto.EPUBImportResponse;
|
||||
import com.storycove.dto.ReadingPositionDto;
|
||||
import com.storycove.entity.*;
|
||||
import com.storycove.repository.ReadingPositionRepository;
|
||||
import com.storycove.service.exception.InvalidFileException;
|
||||
import com.storycove.service.exception.ResourceNotFoundException;
|
||||
|
||||
import nl.siegmann.epublib.domain.Book;
|
||||
import nl.siegmann.epublib.domain.Metadata;
|
||||
import nl.siegmann.epublib.domain.Resource;
|
||||
import nl.siegmann.epublib.domain.SpineReference;
|
||||
import nl.siegmann.epublib.epub.EpubReader;
|
||||
|
||||
import org.jsoup.Jsoup;
|
||||
import org.jsoup.nodes.Document;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.stereotype.Service;
|
||||
import org.springframework.transaction.annotation.Transactional;
|
||||
import org.springframework.web.multipart.MultipartFile;
|
||||
|
||||
import java.io.IOException;
|
||||
import java.io.InputStream;
|
||||
import java.util.ArrayList;
|
||||
import java.util.List;
|
||||
import java.util.Optional;
|
||||
import java.util.UUID;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
@Service
|
||||
@Transactional
|
||||
public class EPUBImportService {
|
||||
|
||||
private final StoryService storyService;
|
||||
private final AuthorService authorService;
|
||||
private final SeriesService seriesService;
|
||||
private final TagService tagService;
|
||||
private final ReadingPositionRepository readingPositionRepository;
|
||||
private final HtmlSanitizationService sanitizationService;
|
||||
private final ImageService imageService;
|
||||
|
||||
@Autowired
|
||||
public EPUBImportService(StoryService storyService,
|
||||
AuthorService authorService,
|
||||
SeriesService seriesService,
|
||||
TagService tagService,
|
||||
ReadingPositionRepository readingPositionRepository,
|
||||
HtmlSanitizationService sanitizationService,
|
||||
ImageService imageService) {
|
||||
this.storyService = storyService;
|
||||
this.authorService = authorService;
|
||||
this.seriesService = seriesService;
|
||||
this.tagService = tagService;
|
||||
this.readingPositionRepository = readingPositionRepository;
|
||||
this.sanitizationService = sanitizationService;
|
||||
this.imageService = imageService;
|
||||
}
|
||||
|
||||
public EPUBImportResponse importEPUB(EPUBImportRequest request) {
|
||||
try {
|
||||
MultipartFile epubFile = request.getEpubFile();
|
||||
|
||||
if (epubFile == null || epubFile.isEmpty()) {
|
||||
return EPUBImportResponse.error("EPUB file is required");
|
||||
}
|
||||
|
||||
if (!isValidEPUBFile(epubFile)) {
|
||||
return EPUBImportResponse.error("Invalid EPUB file format");
|
||||
}
|
||||
|
||||
Book book = parseEPUBFile(epubFile);
|
||||
|
||||
Story story = createStoryFromEPUB(book, request);
|
||||
|
||||
Story savedStory = storyService.create(story);
|
||||
|
||||
EPUBImportResponse response = EPUBImportResponse.success(savedStory.getId(), savedStory.getTitle());
|
||||
response.setWordCount(savedStory.getWordCount());
|
||||
response.setTotalChapters(book.getSpine().size());
|
||||
|
||||
if (request.getPreserveReadingPosition() != null && request.getPreserveReadingPosition()) {
|
||||
ReadingPosition readingPosition = extractReadingPosition(book, savedStory);
|
||||
if (readingPosition != null) {
|
||||
ReadingPosition savedPosition = readingPositionRepository.save(readingPosition);
|
||||
response.setReadingPosition(convertToDto(savedPosition));
|
||||
}
|
||||
}
|
||||
|
||||
return response;
|
||||
|
||||
} catch (Exception e) {
|
||||
return EPUBImportResponse.error("Failed to import EPUB: " + e.getMessage());
|
||||
}
|
||||
}
|
||||
|
||||
private boolean isValidEPUBFile(MultipartFile file) {
|
||||
String filename = file.getOriginalFilename();
|
||||
if (filename == null || !filename.toLowerCase().endsWith(".epub")) {
|
||||
return false;
|
||||
}
|
||||
|
||||
String contentType = file.getContentType();
|
||||
return "application/epub+zip".equals(contentType) ||
|
||||
"application/zip".equals(contentType) ||
|
||||
contentType == null;
|
||||
}
|
||||
|
||||
private Book parseEPUBFile(MultipartFile epubFile) throws IOException {
|
||||
try (InputStream inputStream = epubFile.getInputStream()) {
|
||||
EpubReader epubReader = new EpubReader();
|
||||
return epubReader.readEpub(inputStream);
|
||||
} catch (Exception e) {
|
||||
throw new InvalidFileException("Failed to parse EPUB file: " + e.getMessage());
|
||||
}
|
||||
}
|
||||
|
||||
private Story createStoryFromEPUB(Book book, EPUBImportRequest request) {
|
||||
Metadata metadata = book.getMetadata();
|
||||
|
||||
String title = extractTitle(metadata);
|
||||
String authorName = extractAuthorName(metadata, request);
|
||||
String description = extractDescription(metadata);
|
||||
String content = extractContent(book);
|
||||
|
||||
Story story = new Story();
|
||||
story.setTitle(title);
|
||||
story.setDescription(description);
|
||||
story.setContentHtml(sanitizationService.sanitize(content));
|
||||
|
||||
// Extract and process cover image
|
||||
if (request.getExtractCover() == null || request.getExtractCover()) {
|
||||
String coverPath = extractAndSaveCoverImage(book);
|
||||
if (coverPath != null) {
|
||||
story.setCoverPath(coverPath);
|
||||
}
|
||||
}
|
||||
|
||||
if (request.getAuthorId() != null) {
|
||||
try {
|
||||
Author author = authorService.findById(request.getAuthorId());
|
||||
story.setAuthor(author);
|
||||
} catch (ResourceNotFoundException e) {
|
||||
if (request.getCreateMissingAuthor()) {
|
||||
Author newAuthor = createAuthor(authorName);
|
||||
story.setAuthor(newAuthor);
|
||||
}
|
||||
}
|
||||
} else if (authorName != null && request.getCreateMissingAuthor()) {
|
||||
Author author = findOrCreateAuthor(authorName);
|
||||
story.setAuthor(author);
|
||||
}
|
||||
|
||||
if (request.getSeriesId() != null && request.getSeriesVolume() != null) {
|
||||
try {
|
||||
Series series = seriesService.findById(request.getSeriesId());
|
||||
story.setSeries(series);
|
||||
story.setVolume(request.getSeriesVolume());
|
||||
} catch (ResourceNotFoundException e) {
|
||||
if (request.getCreateMissingSeries() && request.getSeriesName() != null) {
|
||||
Series newSeries = createSeries(request.getSeriesName());
|
||||
story.setSeries(newSeries);
|
||||
story.setVolume(request.getSeriesVolume());
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Handle tags from request or extract from EPUB metadata
|
||||
List<String> allTags = new ArrayList<>();
|
||||
if (request.getTags() != null && !request.getTags().isEmpty()) {
|
||||
allTags.addAll(request.getTags());
|
||||
}
|
||||
|
||||
// Extract subjects/keywords from EPUB metadata
|
||||
List<String> epubTags = extractTags(metadata);
|
||||
if (epubTags != null && !epubTags.isEmpty()) {
|
||||
allTags.addAll(epubTags);
|
||||
}
|
||||
|
||||
// Remove duplicates and create tags
|
||||
allTags.stream()
|
||||
.distinct()
|
||||
.forEach(tagName -> {
|
||||
Tag tag = tagService.findOrCreate(tagName.trim());
|
||||
story.addTag(tag);
|
||||
});
|
||||
|
||||
// Extract additional metadata for potential future use
|
||||
extractAdditionalMetadata(metadata, story);
|
||||
|
||||
return story;
|
||||
}
|
||||
|
||||
private String extractTitle(Metadata metadata) {
|
||||
List<String> titles = metadata.getTitles();
|
||||
if (titles != null && !titles.isEmpty()) {
|
||||
return titles.get(0);
|
||||
}
|
||||
return "Untitled EPUB";
|
||||
}
|
||||
|
||||
private String extractAuthorName(Metadata metadata, EPUBImportRequest request) {
|
||||
if (request.getAuthorName() != null && !request.getAuthorName().trim().isEmpty()) {
|
||||
return request.getAuthorName().trim();
|
||||
}
|
||||
|
||||
if (metadata.getAuthors() != null && !metadata.getAuthors().isEmpty()) {
|
||||
return metadata.getAuthors().get(0).getFirstname() + " " + metadata.getAuthors().get(0).getLastname();
|
||||
}
|
||||
|
||||
return "Unknown Author";
|
||||
}
|
||||
|
||||
private String extractDescription(Metadata metadata) {
|
||||
List<String> descriptions = metadata.getDescriptions();
|
||||
if (descriptions != null && !descriptions.isEmpty()) {
|
||||
return descriptions.get(0);
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
private List<String> extractTags(Metadata metadata) {
|
||||
List<String> tags = new ArrayList<>();
|
||||
|
||||
// Extract subjects (main source of tags in EPUB)
|
||||
List<String> subjects = metadata.getSubjects();
|
||||
if (subjects != null && !subjects.isEmpty()) {
|
||||
tags.addAll(subjects);
|
||||
}
|
||||
|
||||
// Extract keywords from meta tags
|
||||
String keywords = metadata.getMetaAttribute("keywords");
|
||||
if (keywords != null && !keywords.trim().isEmpty()) {
|
||||
String[] keywordArray = keywords.split("[,;]");
|
||||
for (String keyword : keywordArray) {
|
||||
String trimmed = keyword.trim();
|
||||
if (!trimmed.isEmpty()) {
|
||||
tags.add(trimmed);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Extract genre information
|
||||
String genre = metadata.getMetaAttribute("genre");
|
||||
if (genre != null && !genre.trim().isEmpty()) {
|
||||
tags.add(genre.trim());
|
||||
}
|
||||
|
||||
return tags;
|
||||
}
|
||||
|
||||
private void extractAdditionalMetadata(Metadata metadata, Story story) {
|
||||
// Extract language (could be useful for future i18n)
|
||||
String language = metadata.getLanguage();
|
||||
if (language != null && !language.trim().isEmpty()) {
|
||||
// Store as metadata in story description if needed
|
||||
// For now, we'll just log it for potential future use
|
||||
System.out.println("EPUB Language: " + language);
|
||||
}
|
||||
|
||||
// Extract publisher information
|
||||
List<String> publishers = metadata.getPublishers();
|
||||
if (publishers != null && !publishers.isEmpty()) {
|
||||
String publisher = publishers.get(0);
|
||||
// Could append to description or store separately in future
|
||||
System.out.println("EPUB Publisher: " + publisher);
|
||||
}
|
||||
|
||||
// Extract publication date
|
||||
List<nl.siegmann.epublib.domain.Date> dates = metadata.getDates();
|
||||
if (dates != null && !dates.isEmpty()) {
|
||||
for (nl.siegmann.epublib.domain.Date date : dates) {
|
||||
System.out.println("EPUB Date (" + date.getEvent() + "): " + date.getValue());
|
||||
}
|
||||
}
|
||||
|
||||
// Extract ISBN or other identifiers
|
||||
List<nl.siegmann.epublib.domain.Identifier> identifiers = metadata.getIdentifiers();
|
||||
if (identifiers != null && !identifiers.isEmpty()) {
|
||||
for (nl.siegmann.epublib.domain.Identifier identifier : identifiers) {
|
||||
System.out.println("EPUB Identifier (" + identifier.getScheme() + "): " + identifier.getValue());
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private String extractContent(Book book) {
|
||||
StringBuilder contentBuilder = new StringBuilder();
|
||||
|
||||
List<SpineReference> spine = book.getSpine().getSpineReferences();
|
||||
for (SpineReference spineRef : spine) {
|
||||
try {
|
||||
Resource resource = spineRef.getResource();
|
||||
if (resource != null && resource.getData() != null) {
|
||||
String html = new String(resource.getData(), "UTF-8");
|
||||
|
||||
Document doc = Jsoup.parse(html);
|
||||
doc.select("script, style").remove();
|
||||
|
||||
String chapterContent = doc.body() != null ? doc.body().html() : doc.html();
|
||||
|
||||
contentBuilder.append("<div class=\"chapter\">")
|
||||
.append(chapterContent)
|
||||
.append("</div>");
|
||||
}
|
||||
} catch (Exception e) {
|
||||
// Skip this chapter on error
|
||||
continue;
|
||||
}
|
||||
}
|
||||
|
||||
return contentBuilder.toString();
|
||||
}
|
||||
|
||||
private Author findOrCreateAuthor(String authorName) {
|
||||
Optional<Author> existingAuthor = authorService.findByNameOptional(authorName);
|
||||
if (existingAuthor.isPresent()) {
|
||||
return existingAuthor.get();
|
||||
}
|
||||
return createAuthor(authorName);
|
||||
}
|
||||
|
||||
private Author createAuthor(String authorName) {
|
||||
Author author = new Author();
|
||||
author.setName(authorName);
|
||||
return authorService.create(author);
|
||||
}
|
||||
|
||||
private Series createSeries(String seriesName) {
|
||||
Series series = new Series();
|
||||
series.setName(seriesName);
|
||||
return seriesService.create(series);
|
||||
}
|
||||
|
||||
private ReadingPosition extractReadingPosition(Book book, Story story) {
|
||||
try {
|
||||
Metadata metadata = book.getMetadata();
|
||||
|
||||
String positionMeta = metadata.getMetaAttribute("reading-position");
|
||||
String cfiMeta = metadata.getMetaAttribute("epub-cfi");
|
||||
|
||||
ReadingPosition position = new ReadingPosition(story);
|
||||
|
||||
if (cfiMeta != null) {
|
||||
position.setEpubCfi(cfiMeta);
|
||||
}
|
||||
|
||||
if (positionMeta != null) {
|
||||
try {
|
||||
String[] parts = positionMeta.split(":");
|
||||
if (parts.length >= 2) {
|
||||
position.setChapterIndex(Integer.parseInt(parts[0]));
|
||||
position.setWordPosition(Integer.parseInt(parts[1]));
|
||||
}
|
||||
} catch (NumberFormatException e) {
|
||||
// Ignore invalid position format
|
||||
}
|
||||
}
|
||||
|
||||
return position;
|
||||
|
||||
} catch (Exception e) {
|
||||
// Return null if no reading position found
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
private String extractAndSaveCoverImage(Book book) {
|
||||
try {
|
||||
Resource coverResource = book.getCoverImage();
|
||||
if (coverResource != null && coverResource.getData() != null) {
|
||||
// Create a temporary MultipartFile from the EPUB cover data
|
||||
byte[] imageData = coverResource.getData();
|
||||
String mediaType = coverResource.getMediaType() != null ?
|
||||
coverResource.getMediaType().toString() : "image/jpeg";
|
||||
|
||||
// Determine file extension from media type
|
||||
String extension = getExtensionFromMediaType(mediaType);
|
||||
String filename = "epub_cover_" + System.currentTimeMillis() + "." + extension;
|
||||
|
||||
// Create a custom MultipartFile implementation for the cover image
|
||||
MultipartFile coverFile = new EPUBCoverMultipartFile(imageData, filename, mediaType);
|
||||
|
||||
// Use ImageService to process and save the cover
|
||||
return imageService.uploadImage(coverFile, ImageService.ImageType.COVER);
|
||||
}
|
||||
} catch (Exception e) {
|
||||
// Log error but don't fail the import
|
||||
System.err.println("Failed to extract cover image: " + e.getMessage());
|
||||
}
|
||||
return null;
|
||||
}
|
||||
|
||||
private String getExtensionFromMediaType(String mediaType) {
|
||||
switch (mediaType.toLowerCase()) {
|
||||
case "image/jpeg":
|
||||
case "image/jpg":
|
||||
return "jpg";
|
||||
case "image/png":
|
||||
return "png";
|
||||
case "image/gif":
|
||||
return "gif";
|
||||
case "image/webp":
|
||||
return "webp";
|
||||
default:
|
||||
return "jpg"; // Default fallback
|
||||
}
|
||||
}
|
||||
|
||||
private ReadingPositionDto convertToDto(ReadingPosition position) {
|
||||
if (position == null) return null;
|
||||
|
||||
ReadingPositionDto dto = new ReadingPositionDto();
|
||||
dto.setId(position.getId());
|
||||
dto.setStoryId(position.getStory().getId());
|
||||
dto.setChapterIndex(position.getChapterIndex());
|
||||
dto.setChapterTitle(position.getChapterTitle());
|
||||
dto.setWordPosition(position.getWordPosition());
|
||||
dto.setCharacterPosition(position.getCharacterPosition());
|
||||
dto.setPercentageComplete(position.getPercentageComplete());
|
||||
dto.setEpubCfi(position.getEpubCfi());
|
||||
dto.setContextBefore(position.getContextBefore());
|
||||
dto.setContextAfter(position.getContextAfter());
|
||||
dto.setCreatedAt(position.getCreatedAt());
|
||||
dto.setUpdatedAt(position.getUpdatedAt());
|
||||
|
||||
return dto;
|
||||
}
|
||||
|
||||
public List<String> validateEPUBFile(MultipartFile file) {
|
||||
List<String> errors = new ArrayList<>();
|
||||
|
||||
if (file == null || file.isEmpty()) {
|
||||
errors.add("EPUB file is required");
|
||||
return errors;
|
||||
}
|
||||
|
||||
if (!isValidEPUBFile(file)) {
|
||||
errors.add("Invalid EPUB file format. Only .epub files are supported");
|
||||
}
|
||||
|
||||
if (file.getSize() > 100 * 1024 * 1024) { // 100MB limit
|
||||
errors.add("EPUB file size exceeds 100MB limit");
|
||||
}
|
||||
|
||||
try {
|
||||
Book book = parseEPUBFile(file);
|
||||
if (book.getMetadata() == null) {
|
||||
errors.add("EPUB file contains no metadata");
|
||||
}
|
||||
if (book.getSpine() == null || book.getSpine().isEmpty()) {
|
||||
errors.add("EPUB file contains no readable content");
|
||||
}
|
||||
} catch (Exception e) {
|
||||
errors.add("Failed to parse EPUB file: " + e.getMessage());
|
||||
}
|
||||
|
||||
return errors;
|
||||
}
|
||||
|
||||
/**
|
||||
* Custom MultipartFile implementation for EPUB cover images
|
||||
*/
|
||||
private static class EPUBCoverMultipartFile implements MultipartFile {
|
||||
private final byte[] data;
|
||||
private final String filename;
|
||||
private final String contentType;
|
||||
|
||||
public EPUBCoverMultipartFile(byte[] data, String filename, String contentType) {
|
||||
this.data = data;
|
||||
this.filename = filename;
|
||||
this.contentType = contentType;
|
||||
}
|
||||
|
||||
@Override
|
||||
public String getName() {
|
||||
return "coverImage";
|
||||
}
|
||||
|
||||
@Override
|
||||
public String getOriginalFilename() {
|
||||
return filename;
|
||||
}
|
||||
|
||||
@Override
|
||||
public String getContentType() {
|
||||
return contentType;
|
||||
}
|
||||
|
||||
@Override
|
||||
public boolean isEmpty() {
|
||||
return data == null || data.length == 0;
|
||||
}
|
||||
|
||||
@Override
|
||||
public long getSize() {
|
||||
return data != null ? data.length : 0;
|
||||
}
|
||||
|
||||
@Override
|
||||
public byte[] getBytes() {
|
||||
return data;
|
||||
}
|
||||
|
||||
@Override
|
||||
public InputStream getInputStream() {
|
||||
return new java.io.ByteArrayInputStream(data);
|
||||
}
|
||||
|
||||
@Override
|
||||
public void transferTo(java.io.File dest) throws IOException {
|
||||
try (java.io.FileOutputStream fos = new java.io.FileOutputStream(dest)) {
|
||||
fos.write(data);
|
||||
}
|
||||
}
|
||||
|
||||
@Override
|
||||
public void transferTo(java.nio.file.Path dest) throws IOException {
|
||||
java.nio.file.Files.write(dest, data);
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -83,7 +83,26 @@ public class HtmlSanitizationService {
|
||||
}
|
||||
}
|
||||
|
||||
// Remove specific attributes (like href from links for security)
|
||||
// Configure allowed protocols for specific attributes (e.g., href)
|
||||
if (config.getAllowedProtocols() != null) {
|
||||
for (Map.Entry<String, Map<String, List<String>>> tagEntry : config.getAllowedProtocols().entrySet()) {
|
||||
String tag = tagEntry.getKey();
|
||||
Map<String, List<String>> attributeProtocols = tagEntry.getValue();
|
||||
|
||||
if (attributeProtocols != null) {
|
||||
for (Map.Entry<String, List<String>> attrEntry : attributeProtocols.entrySet()) {
|
||||
String attribute = attrEntry.getKey();
|
||||
List<String> protocols = attrEntry.getValue();
|
||||
|
||||
if (protocols != null) {
|
||||
allowlist.addProtocols(tag, attribute, protocols.toArray(new String[0]));
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Remove specific attributes if needed (deprecated in favor of protocol control)
|
||||
if (config.getRemovedAttributes() != null) {
|
||||
for (Map.Entry<String, List<String>> entry : config.getRemovedAttributes().entrySet()) {
|
||||
String tag = entry.getKey();
|
||||
|
||||
@@ -20,11 +20,11 @@ import java.util.UUID;
|
||||
public class ImageService {
|
||||
|
||||
private static final Set<String> ALLOWED_CONTENT_TYPES = Set.of(
|
||||
"image/jpeg", "image/jpg", "image/png", "image/webp"
|
||||
"image/jpeg", "image/jpg", "image/png"
|
||||
);
|
||||
|
||||
private static final Set<String> ALLOWED_EXTENSIONS = Set.of(
|
||||
"jpg", "jpeg", "png", "webp"
|
||||
"jpg", "jpeg", "png"
|
||||
);
|
||||
|
||||
@Value("${storycove.images.upload-dir:/app/images}")
|
||||
|
||||
@@ -0,0 +1,28 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.stereotype.Service;
|
||||
|
||||
@Service
|
||||
public class ReadingTimeService {
|
||||
|
||||
@Value("${app.reading.speed.default:200}")
|
||||
private int defaultWordsPerMinute;
|
||||
|
||||
/**
|
||||
* Calculate estimated reading time in minutes for the given word count
|
||||
* @param wordCount the number of words to read
|
||||
* @return estimated reading time in minutes (minimum 1 minute)
|
||||
*/
|
||||
public int calculateReadingTime(int wordCount) {
|
||||
return Math.max(1, wordCount / defaultWordsPerMinute);
|
||||
}
|
||||
|
||||
/**
|
||||
* Get the current words per minute setting
|
||||
* @return words per minute reading speed
|
||||
*/
|
||||
public int getWordsPerMinute() {
|
||||
return defaultWordsPerMinute;
|
||||
}
|
||||
}
|
||||
@@ -4,6 +4,7 @@ import com.storycove.entity.Author;
|
||||
import com.storycove.entity.Series;
|
||||
import com.storycove.entity.Story;
|
||||
import com.storycove.entity.Tag;
|
||||
import com.storycove.repository.ReadingPositionRepository;
|
||||
import com.storycove.repository.StoryRepository;
|
||||
import com.storycove.repository.TagRepository;
|
||||
import com.storycove.service.exception.DuplicateResourceException;
|
||||
@@ -18,6 +19,7 @@ import org.springframework.transaction.annotation.Transactional;
|
||||
import org.springframework.validation.annotation.Validated;
|
||||
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.ArrayList;
|
||||
import java.util.HashSet;
|
||||
import java.util.List;
|
||||
import java.util.Optional;
|
||||
@@ -31,6 +33,7 @@ public class StoryService {
|
||||
|
||||
private final StoryRepository storyRepository;
|
||||
private final TagRepository tagRepository;
|
||||
private final ReadingPositionRepository readingPositionRepository;
|
||||
private final AuthorService authorService;
|
||||
private final TagService tagService;
|
||||
private final SeriesService seriesService;
|
||||
@@ -40,6 +43,7 @@ public class StoryService {
|
||||
@Autowired
|
||||
public StoryService(StoryRepository storyRepository,
|
||||
TagRepository tagRepository,
|
||||
ReadingPositionRepository readingPositionRepository,
|
||||
AuthorService authorService,
|
||||
TagService tagService,
|
||||
SeriesService seriesService,
|
||||
@@ -47,6 +51,7 @@ public class StoryService {
|
||||
@Autowired(required = false) TypesenseService typesenseService) {
|
||||
this.storyRepository = storyRepository;
|
||||
this.tagRepository = tagRepository;
|
||||
this.readingPositionRepository = readingPositionRepository;
|
||||
this.authorService = authorService;
|
||||
this.tagService = tagService;
|
||||
this.seriesService = seriesService;
|
||||
@@ -271,6 +276,45 @@ public class StoryService {
|
||||
return savedStory;
|
||||
}
|
||||
|
||||
@Transactional
|
||||
public Story updateReadingProgress(UUID id, Integer position) {
|
||||
if (position != null && position < 0) {
|
||||
throw new IllegalArgumentException("Reading position must be non-negative");
|
||||
}
|
||||
|
||||
Story story = findById(id);
|
||||
story.updateReadingProgress(position);
|
||||
Story savedStory = storyRepository.save(story);
|
||||
|
||||
// Update Typesense index with new reading progress
|
||||
if (typesenseService != null) {
|
||||
typesenseService.updateStory(savedStory);
|
||||
}
|
||||
|
||||
return savedStory;
|
||||
}
|
||||
|
||||
@Transactional
|
||||
public Story updateReadingStatus(UUID id, Boolean isRead) {
|
||||
Story story = findById(id);
|
||||
|
||||
if (Boolean.TRUE.equals(isRead)) {
|
||||
story.markAsRead();
|
||||
} else {
|
||||
story.setIsRead(false);
|
||||
story.setLastReadAt(LocalDateTime.now());
|
||||
}
|
||||
|
||||
Story savedStory = storyRepository.save(story);
|
||||
|
||||
// Update Typesense index with new reading status
|
||||
if (typesenseService != null) {
|
||||
typesenseService.updateStory(savedStory);
|
||||
}
|
||||
|
||||
return savedStory;
|
||||
}
|
||||
|
||||
@Transactional(readOnly = true)
|
||||
public List<Story> findBySeriesOrderByVolume(UUID seriesId) {
|
||||
return storyRepository.findBySeriesOrderByVolume(seriesId);
|
||||
@@ -393,13 +437,17 @@ public class StoryService {
|
||||
public void delete(UUID id) {
|
||||
Story story = findById(id);
|
||||
|
||||
// Clean up reading positions first (to avoid foreign key constraint violations)
|
||||
readingPositionRepository.deleteByStoryId(id);
|
||||
|
||||
// Remove from series if part of one
|
||||
if (story.getSeries() != null) {
|
||||
story.getSeries().removeStory(story);
|
||||
}
|
||||
|
||||
// Remove tags (this will update tag usage counts)
|
||||
story.getTags().forEach(tag -> story.removeTag(tag));
|
||||
// Create a copy to avoid ConcurrentModificationException
|
||||
new ArrayList<>(story.getTags()).forEach(tag -> story.removeTag(tag));
|
||||
|
||||
// Delete from Typesense first (if available)
|
||||
if (typesenseService != null) {
|
||||
@@ -562,6 +610,7 @@ public class StoryService {
|
||||
if (updateReq.getVolume() != null) {
|
||||
story.setVolume(updateReq.getVolume());
|
||||
}
|
||||
// Handle author - either by ID or by name
|
||||
if (updateReq.getAuthorId() != null) {
|
||||
Author author = authorService.findById(updateReq.getAuthorId());
|
||||
story.setAuthor(author);
|
||||
@@ -593,4 +642,12 @@ public class StoryService {
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@Transactional(readOnly = true)
|
||||
public List<Story> findPotentialDuplicates(String title, String authorName) {
|
||||
if (title == null || title.trim().isEmpty() || authorName == null || authorName.trim().isEmpty()) {
|
||||
return List.of();
|
||||
}
|
||||
return storyRepository.findByTitleAndAuthorNameIgnoreCase(title.trim(), authorName.trim());
|
||||
}
|
||||
}
|
||||
@@ -192,6 +192,11 @@ public class TagService {
|
||||
return tagRepository.countUsedTags();
|
||||
}
|
||||
|
||||
@Transactional(readOnly = true)
|
||||
public List<Tag> findTagsUsedByCollections() {
|
||||
return tagRepository.findTagsUsedByCollections();
|
||||
}
|
||||
|
||||
private void validateTagForCreate(Tag tag) {
|
||||
if (existsByName(tag.getName())) {
|
||||
throw new DuplicateResourceException("Tag", tag.getName());
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.dto.AuthorSearchDto;
|
||||
import com.storycove.dto.FacetCountDto;
|
||||
import com.storycove.dto.SearchResultDto;
|
||||
import com.storycove.dto.StorySearchDto;
|
||||
import com.storycove.entity.Author;
|
||||
@@ -32,12 +33,15 @@ public class TypesenseService {
|
||||
|
||||
private final Client typesenseClient;
|
||||
private final CollectionStoryRepository collectionStoryRepository;
|
||||
private final ReadingTimeService readingTimeService;
|
||||
|
||||
@Autowired
|
||||
public TypesenseService(Client typesenseClient,
|
||||
@Autowired(required = false) CollectionStoryRepository collectionStoryRepository) {
|
||||
@Autowired(required = false) CollectionStoryRepository collectionStoryRepository,
|
||||
ReadingTimeService readingTimeService) {
|
||||
this.typesenseClient = typesenseClient;
|
||||
this.collectionStoryRepository = collectionStoryRepository;
|
||||
this.readingTimeService = readingTimeService;
|
||||
}
|
||||
|
||||
@PostConstruct
|
||||
@@ -65,19 +69,20 @@ public class TypesenseService {
|
||||
private void createStoriesCollection() throws Exception {
|
||||
List<Field> fields = Arrays.asList(
|
||||
new Field().name("id").type("string").facet(false),
|
||||
new Field().name("title").type("string").facet(false),
|
||||
new Field().name("title").type("string").facet(false).sort(true),
|
||||
new Field().name("summary").type("string").facet(false).optional(true),
|
||||
new Field().name("description").type("string").facet(false),
|
||||
new Field().name("contentPlain").type("string").facet(false),
|
||||
new Field().name("authorId").type("string").facet(true),
|
||||
new Field().name("authorName").type("string").facet(true),
|
||||
new Field().name("authorName").type("string").facet(true).sort(true),
|
||||
new Field().name("seriesId").type("string").facet(true).optional(true),
|
||||
new Field().name("seriesName").type("string").facet(true).optional(true),
|
||||
new Field().name("tagNames").type("string[]").facet(true).optional(true),
|
||||
new Field().name("rating").type("int32").facet(true).optional(true),
|
||||
new Field().name("wordCount").type("int32").facet(true).optional(true),
|
||||
new Field().name("volume").type("int32").facet(true).optional(true),
|
||||
new Field().name("createdAt").type("int64").facet(false),
|
||||
new Field().name("seriesName").type("string").facet(true).sort(true).optional(true),
|
||||
new Field().name("tagNames").type("string[]").facet(true),
|
||||
new Field().name("rating").type("int32").facet(true).sort(true).optional(true),
|
||||
new Field().name("wordCount").type("int32").facet(true).sort(true).optional(true),
|
||||
new Field().name("volume").type("int32").facet(true).sort(true).optional(true),
|
||||
new Field().name("createdAt").type("int64").facet(false).sort(true),
|
||||
new Field().name("lastReadAt").type("int64").facet(false).sort(true).optional(true),
|
||||
new Field().name("sourceUrl").type("string").facet(false).optional(true),
|
||||
new Field().name("coverPath").type("string").facet(false).optional(true)
|
||||
);
|
||||
@@ -101,6 +106,26 @@ public class TypesenseService {
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Force recreate the stories collection, deleting it first if it exists
|
||||
*/
|
||||
public void recreateStoriesCollection() throws Exception {
|
||||
try {
|
||||
logger.info("Force deleting stories collection for recreation...");
|
||||
typesenseClient.collections(STORIES_COLLECTION).delete();
|
||||
logger.info("Successfully deleted stories collection");
|
||||
} catch (Exception e) {
|
||||
logger.debug("Stories collection didn't exist for deletion: {}", e.getMessage());
|
||||
}
|
||||
|
||||
// Wait a brief moment to ensure deletion is complete
|
||||
Thread.sleep(100);
|
||||
|
||||
logger.info("Creating stories collection with fresh schema...");
|
||||
createStoriesCollection();
|
||||
logger.info("Successfully created stories collection");
|
||||
}
|
||||
|
||||
/**
|
||||
* Force recreate the authors collection, deleting it first if it exists
|
||||
*/
|
||||
@@ -186,8 +211,6 @@ public class TypesenseService {
|
||||
try {
|
||||
long startTime = System.currentTimeMillis();
|
||||
|
||||
logger.info("SEARCH DEBUG: searchStories called with query='{}', tagFilters={}, authorFilters={}",
|
||||
query, tagFilters, authorFilters);
|
||||
|
||||
// Convert 0-based page (frontend/backend) to 1-based page (Typesense)
|
||||
int typesensePage = page + 1;
|
||||
@@ -206,8 +229,13 @@ public class TypesenseService {
|
||||
.highlightFields("title,description")
|
||||
.highlightStartTag("<mark>")
|
||||
.highlightEndTag("</mark>")
|
||||
.facetBy("tagNames,authorName,rating")
|
||||
.maxFacetValues(100)
|
||||
.sortBy(buildSortParameter(normalizedQuery, sortBy, sortDir));
|
||||
|
||||
logger.debug("Typesense search parameters - facetBy: {}, maxFacetValues: {}",
|
||||
searchParameters.getFacetBy(), searchParameters.getMaxFacetValues());
|
||||
|
||||
// Add filters
|
||||
List<String> filterConditions = new ArrayList<>();
|
||||
|
||||
@@ -219,17 +247,12 @@ public class TypesenseService {
|
||||
}
|
||||
|
||||
if (tagFilters != null && !tagFilters.isEmpty()) {
|
||||
logger.info("SEARCH DEBUG: Processing {} tag filters: {}", tagFilters.size(), tagFilters);
|
||||
String tagFilter = tagFilters.stream()
|
||||
.map(tag -> {
|
||||
String escaped = escapeTypesenseValue(tag);
|
||||
String condition = "tagNames:=" + escaped;
|
||||
logger.info("SEARCH DEBUG: Tag '{}' -> escaped '{}' -> condition '{}'", tag, escaped, condition);
|
||||
return condition;
|
||||
})
|
||||
.collect(Collectors.joining(" || "));
|
||||
logger.info("SEARCH DEBUG: Final tag filter condition: '{}'", tagFilter);
|
||||
filterConditions.add("(" + tagFilter + ")");
|
||||
// Use AND logic for multiple tags - items must have ALL selected tags
|
||||
for (String tag : tagFilters) {
|
||||
String escaped = escapeTypesenseValue(tag);
|
||||
String condition = "tagNames:=" + escaped;
|
||||
filterConditions.add(condition);
|
||||
}
|
||||
}
|
||||
|
||||
if (minRating != null) {
|
||||
@@ -242,19 +265,18 @@ public class TypesenseService {
|
||||
|
||||
if (!filterConditions.isEmpty()) {
|
||||
String finalFilter = String.join(" && ", filterConditions);
|
||||
logger.info("SEARCH DEBUG: Final filter condition: '{}'", finalFilter);
|
||||
searchParameters.filterBy(finalFilter);
|
||||
} else {
|
||||
logger.info("SEARCH DEBUG: No filter conditions applied");
|
||||
}
|
||||
|
||||
SearchResult searchResult = typesenseClient.collections(STORIES_COLLECTION)
|
||||
.documents()
|
||||
.search(searchParameters);
|
||||
|
||||
logger.info("SEARCH DEBUG: Typesense returned {} results", searchResult.getFound());
|
||||
logger.debug("Search result facet counts: {}", searchResult.getFacetCounts());
|
||||
|
||||
List<StorySearchDto> results = convertSearchResult(searchResult);
|
||||
Map<String, List<FacetCountDto>> facets = processFacetCounts(searchResult);
|
||||
long searchTime = System.currentTimeMillis() - startTime;
|
||||
|
||||
return new SearchResultDto<>(
|
||||
@@ -263,7 +285,8 @@ public class TypesenseService {
|
||||
page,
|
||||
perPage,
|
||||
query,
|
||||
searchTime
|
||||
searchTime,
|
||||
facets
|
||||
);
|
||||
|
||||
} catch (Exception e) {
|
||||
@@ -294,15 +317,8 @@ public class TypesenseService {
|
||||
|
||||
public void reindexAllStories(List<Story> stories) {
|
||||
try {
|
||||
// Clear existing collection
|
||||
try {
|
||||
typesenseClient.collections(STORIES_COLLECTION).delete();
|
||||
} catch (Exception e) {
|
||||
logger.debug("Collection didn't exist for deletion: {}", e.getMessage());
|
||||
}
|
||||
|
||||
// Recreate collection
|
||||
createStoriesCollection();
|
||||
// Force recreate collection with proper schema
|
||||
recreateStoriesCollection();
|
||||
|
||||
// Bulk index all stories
|
||||
bulkIndexStories(stories);
|
||||
@@ -363,10 +379,11 @@ public class TypesenseService {
|
||||
List<String> tagNames = story.getTags().stream()
|
||||
.map(tag -> tag.getName())
|
||||
.collect(Collectors.toList());
|
||||
logger.debug("INDEXING DEBUG: Story '{}' has tags: {}", story.getTitle(), tagNames);
|
||||
document.put("tagNames", tagNames);
|
||||
logger.debug("Story '{}' has {} tags: {}", story.getTitle(), tagNames.size(), tagNames);
|
||||
} else {
|
||||
logger.debug("INDEXING DEBUG: Story '{}' has no tags", story.getTitle());
|
||||
document.put("tagNames", new ArrayList<>());
|
||||
logger.debug("Story '{}' has no tags, setting empty array", story.getTitle());
|
||||
}
|
||||
|
||||
document.put("rating", story.getRating() != null ? story.getRating() : 0);
|
||||
@@ -376,6 +393,10 @@ public class TypesenseService {
|
||||
story.getCreatedAt().toEpochSecond(java.time.ZoneOffset.UTC) :
|
||||
java.time.LocalDateTime.now().toEpochSecond(java.time.ZoneOffset.UTC));
|
||||
|
||||
if (story.getLastReadAt() != null) {
|
||||
document.put("lastReadAt", story.getLastReadAt().toEpochSecond(java.time.ZoneOffset.UTC));
|
||||
}
|
||||
|
||||
if (story.getSourceUrl() != null) {
|
||||
document.put("sourceUrl", story.getSourceUrl());
|
||||
}
|
||||
@@ -387,6 +408,70 @@ public class TypesenseService {
|
||||
return document;
|
||||
}
|
||||
|
||||
@SuppressWarnings("unchecked")
|
||||
private Map<String, List<FacetCountDto>> processFacetCounts(SearchResult searchResult) {
|
||||
Map<String, List<FacetCountDto>> facetMap = new HashMap<>();
|
||||
|
||||
if (searchResult.getFacetCounts() != null) {
|
||||
for (FacetCounts facetCounts : searchResult.getFacetCounts()) {
|
||||
String fieldName = facetCounts.getFieldName();
|
||||
List<FacetCountDto> facetValues = new ArrayList<>();
|
||||
|
||||
if (facetCounts.getCounts() != null) {
|
||||
|
||||
for (Object countObj : facetCounts.getCounts()) {
|
||||
if (countObj instanceof org.typesense.model.FacetCountsCounts) {
|
||||
org.typesense.model.FacetCountsCounts facetCount = (org.typesense.model.FacetCountsCounts) countObj;
|
||||
String value = facetCount.getValue();
|
||||
Integer count = facetCount.getCount();
|
||||
|
||||
if (value != null && count != null && count > 0) {
|
||||
facetValues.add(new FacetCountDto(value, count));
|
||||
}
|
||||
} else if (countObj instanceof Map) {
|
||||
// Fallback for Map-based responses
|
||||
Map<String, Object> countMap = (Map<String, Object>) countObj;
|
||||
String value = (String) countMap.get("value");
|
||||
Object countValue = countMap.get("count");
|
||||
|
||||
if (value != null && countValue != null) {
|
||||
Integer count = null;
|
||||
if (countValue instanceof Integer) {
|
||||
count = (Integer) countValue;
|
||||
} else if (countValue instanceof Number) {
|
||||
count = ((Number) countValue).intValue();
|
||||
}
|
||||
|
||||
if (count != null && count > 0) {
|
||||
facetValues.add(new FacetCountDto(value, count));
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (!facetValues.isEmpty()) {
|
||||
// Sort by count descending, then by value ascending
|
||||
facetValues.sort((a, b) -> {
|
||||
int countCompare = Integer.compare(b.getCount(), a.getCount());
|
||||
if (countCompare != 0) return countCompare;
|
||||
return a.getValue().compareToIgnoreCase(b.getValue());
|
||||
});
|
||||
|
||||
facetMap.put(fieldName, facetValues);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// DEBUG: Log final facet processing results
|
||||
logger.info("FACET DEBUG: Final facetMap contents: {}", facetMap);
|
||||
if (facetMap.isEmpty()) {
|
||||
logger.info("FACET DEBUG: No facets were processed - investigating why");
|
||||
}
|
||||
|
||||
return facetMap;
|
||||
}
|
||||
|
||||
@SuppressWarnings("unchecked")
|
||||
private List<StorySearchDto> convertSearchResult(SearchResult searchResult) {
|
||||
return searchResult.getHits().stream()
|
||||
@@ -437,6 +522,12 @@ public class TypesenseService {
|
||||
timestamp, 0, java.time.ZoneOffset.UTC));
|
||||
}
|
||||
|
||||
if (doc.get("lastReadAt") != null) {
|
||||
long timestamp = ((Number) doc.get("lastReadAt")).longValue();
|
||||
dto.setLastReadAt(java.time.LocalDateTime.ofEpochSecond(
|
||||
timestamp, 0, java.time.ZoneOffset.UTC));
|
||||
}
|
||||
|
||||
// Set search-specific fields - handle null for wildcard queries
|
||||
Long textMatch = hit.getTextMatch();
|
||||
dto.setSearchScore(textMatch != null ? textMatch : 0L);
|
||||
@@ -585,6 +676,11 @@ public class TypesenseService {
|
||||
case "created_at":
|
||||
case "date":
|
||||
return "createdAt";
|
||||
case "lastread":
|
||||
case "last_read":
|
||||
case "lastreadat":
|
||||
case "last_read_at":
|
||||
return "lastReadAt";
|
||||
case "rating":
|
||||
return "rating";
|
||||
case "wordcount":
|
||||
@@ -732,8 +828,6 @@ public class TypesenseService {
|
||||
|
||||
public SearchResultDto<AuthorSearchDto> searchAuthors(String query, int page, int perPage, String sortBy, String sortOrder) {
|
||||
try {
|
||||
logger.info("AUTHORS SEARCH DEBUG: Searching collection '{}' with query='{}', sortBy='{}', sortOrder='{}'",
|
||||
AUTHORS_COLLECTION, query, sortBy, sortOrder);
|
||||
SearchParameters searchParameters = new SearchParameters()
|
||||
.q(query != null && !query.trim().isEmpty() ? query : "*")
|
||||
.queryBy("name,notes")
|
||||
@@ -745,8 +839,6 @@ public class TypesenseService {
|
||||
String sortDirection = "desc".equalsIgnoreCase(sortOrder) ? "desc" : "asc";
|
||||
String sortField = mapAuthorSortField(sortBy);
|
||||
String sortString = sortField + ":" + sortDirection;
|
||||
logger.info("AUTHORS SEARCH DEBUG: Original sortBy='{}', mapped to='{}', full sort string='{}'",
|
||||
sortBy, sortField, sortString);
|
||||
searchParameters.sortBy(sortString);
|
||||
}
|
||||
|
||||
@@ -757,17 +849,12 @@ public class TypesenseService {
|
||||
.search(searchParameters);
|
||||
} catch (Exception sortException) {
|
||||
// If sorting fails (likely due to schema issues), retry without sorting
|
||||
logger.error("SORTING ERROR DEBUG: Full exception details", sortException);
|
||||
logger.warn("Sorting failed for authors search, retrying without sort: " + sortException.getMessage());
|
||||
|
||||
// Try to get collection info for debugging
|
||||
try {
|
||||
CollectionResponse collection = typesenseClient.collections(AUTHORS_COLLECTION).retrieve();
|
||||
logger.error("COLLECTION DEBUG: Collection '{}' exists with {} documents and {} fields",
|
||||
collection.getName(), collection.getNumDocuments(), collection.getFields().size());
|
||||
logger.error("COLLECTION DEBUG: Fields: {}", collection.getFields());
|
||||
} catch (Exception debugException) {
|
||||
logger.error("COLLECTION DEBUG: Failed to retrieve collection info", debugException);
|
||||
}
|
||||
|
||||
searchParameters = new SearchParameters()
|
||||
@@ -1007,10 +1094,11 @@ public class TypesenseService {
|
||||
}
|
||||
|
||||
if (tags != null && !tags.isEmpty()) {
|
||||
String tagFilter = tags.stream()
|
||||
.map(tag -> "tags:=" + escapeTypesenseValue(tag))
|
||||
.collect(Collectors.joining(" || "));
|
||||
filterConditions.add("(" + tagFilter + ")");
|
||||
// Use AND logic for multiple tags - collections must have ALL selected tags
|
||||
for (String tag : tags) {
|
||||
String condition = "tags:=" + escapeTypesenseValue(tag);
|
||||
filterConditions.add(condition);
|
||||
}
|
||||
}
|
||||
|
||||
if (!filterConditions.isEmpty()) {
|
||||
@@ -1197,6 +1285,15 @@ public class TypesenseService {
|
||||
collection.setCoverImagePath((String) doc.get("cover_image_path"));
|
||||
collection.setIsArchived((Boolean) doc.get("is_archived"));
|
||||
|
||||
// Set tags from Typesense document
|
||||
if (doc.get("tags") != null) {
|
||||
@SuppressWarnings("unchecked")
|
||||
List<String> tagNames = (List<String>) doc.get("tags");
|
||||
// For search results, we'll store tag names in a special field for frontend
|
||||
// since we don't want to load full Tag entities for performance
|
||||
collection.setTagNames(tagNames);
|
||||
}
|
||||
|
||||
// Set timestamps
|
||||
if (doc.get("created_at") != null) {
|
||||
long createdAtSeconds = ((Number) doc.get("created_at")).longValue();
|
||||
@@ -1210,6 +1307,7 @@ public class TypesenseService {
|
||||
// For list/search views, we create a special lightweight collection that stores
|
||||
// the calculated values directly to avoid lazy loading issues
|
||||
CollectionSearchResult searchCollection = new CollectionSearchResult(collection);
|
||||
searchCollection.setWordsPerMinute(readingTimeService.getWordsPerMinute());
|
||||
|
||||
// Set the calculated statistics from the Typesense document
|
||||
if (doc.get("story_count") != null) {
|
||||
|
||||
@@ -0,0 +1,12 @@
|
||||
package com.storycove.service.exception;
|
||||
|
||||
public class InvalidFileException extends RuntimeException {
|
||||
|
||||
public InvalidFileException(String message) {
|
||||
super(message);
|
||||
}
|
||||
|
||||
public InvalidFileException(String message, Throwable cause) {
|
||||
super(message, cause);
|
||||
}
|
||||
}
|
||||
@@ -16,8 +16,8 @@ spring:
|
||||
|
||||
servlet:
|
||||
multipart:
|
||||
max-file-size: 5MB
|
||||
max-request-size: 10MB
|
||||
max-file-size: 10MB # Reduced for security (was 250MB)
|
||||
max-request-size: 15MB # Slightly higher to account for form data
|
||||
|
||||
server:
|
||||
port: 8080
|
||||
@@ -28,10 +28,10 @@ storycove:
|
||||
cors:
|
||||
allowed-origins: ${STORYCOVE_CORS_ALLOWED_ORIGINS:http://localhost:3000,http://localhost:6925}
|
||||
jwt:
|
||||
secret: ${JWT_SECRET:default-secret-key}
|
||||
secret: ${JWT_SECRET} # REQUIRED: Must be at least 32 characters, no default for security
|
||||
expiration: 86400000 # 24 hours
|
||||
auth:
|
||||
password: ${APP_PASSWORD:admin}
|
||||
password: ${APP_PASSWORD} # REQUIRED: No default password for security
|
||||
typesense:
|
||||
api-key: ${TYPESENSE_API_KEY:xyz}
|
||||
host: ${TYPESENSE_HOST:localhost}
|
||||
@@ -43,5 +43,7 @@ storycove:
|
||||
|
||||
logging:
|
||||
level:
|
||||
com.storycove: DEBUG
|
||||
org.springframework.security: DEBUG
|
||||
com.storycove: ${LOG_LEVEL:INFO} # Use INFO for production, DEBUG for development
|
||||
org.springframework.security: WARN # Reduce security logging
|
||||
org.springframework.web: WARN
|
||||
org.hibernate.SQL: ${SQL_LOG_LEVEL:WARN} # Control SQL logging separately
|
||||
@@ -17,7 +17,7 @@
|
||||
"h4": ["class", "style"],
|
||||
"h5": ["class", "style"],
|
||||
"h6": ["class", "style"],
|
||||
"a": ["class"],
|
||||
"a": ["class", "href", "title"],
|
||||
"table": ["class", "style"],
|
||||
"th": ["class", "style", "colspan", "rowspan"],
|
||||
"td": ["class", "style", "colspan", "rowspan"],
|
||||
@@ -38,8 +38,10 @@
|
||||
"font-weight", "font-style", "text-align", "text-decoration", "margin",
|
||||
"padding", "text-indent", "line-height"
|
||||
],
|
||||
"removedAttributes": {
|
||||
"a": ["href", "target"]
|
||||
"allowedProtocols": {
|
||||
"a": {
|
||||
"href": ["http", "https", "#", "/"]
|
||||
}
|
||||
},
|
||||
"description": "HTML sanitization configuration for StoryCove story content. This configuration is shared between frontend (DOMPurify) and backend (Jsoup) to ensure consistency."
|
||||
}
|
||||
@@ -1,6 +1,7 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.entity.Author;
|
||||
import com.storycove.entity.Story;
|
||||
import com.storycove.repository.AuthorRepository;
|
||||
import com.storycove.service.exception.DuplicateResourceException;
|
||||
import com.storycove.service.exception.ResourceNotFoundException;
|
||||
@@ -24,6 +25,7 @@ import static org.junit.jupiter.api.Assertions.*;
|
||||
import static org.mockito.ArgumentMatchers.any;
|
||||
import static org.mockito.ArgumentMatchers.anyString;
|
||||
import static org.mockito.Mockito.*;
|
||||
import static org.mockito.Mockito.times;
|
||||
|
||||
@ExtendWith(MockitoExtension.class)
|
||||
@DisplayName("Author Service Unit Tests")
|
||||
@@ -32,7 +34,6 @@ class AuthorServiceTest {
|
||||
@Mock
|
||||
private AuthorRepository authorRepository;
|
||||
|
||||
@InjectMocks
|
||||
private AuthorService authorService;
|
||||
|
||||
private Author testAuthor;
|
||||
@@ -44,6 +45,9 @@ class AuthorServiceTest {
|
||||
testAuthor = new Author("Test Author");
|
||||
testAuthor.setId(testId);
|
||||
testAuthor.setNotes("Test notes");
|
||||
|
||||
// Initialize service with null TypesenseService (which is allowed)
|
||||
authorService = new AuthorService(authorRepository, null);
|
||||
}
|
||||
|
||||
@Test
|
||||
@@ -307,4 +311,133 @@ class AuthorServiceTest {
|
||||
assertEquals(5L, count);
|
||||
verify(authorRepository).countRecentAuthors(any(java.time.LocalDateTime.class));
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should set author rating with validation")
|
||||
void shouldSetAuthorRating() {
|
||||
when(authorRepository.findById(testId)).thenReturn(Optional.of(testAuthor));
|
||||
when(authorRepository.save(any(Author.class))).thenReturn(testAuthor);
|
||||
|
||||
Author result = authorService.setRating(testId, 4);
|
||||
|
||||
assertEquals(4, testAuthor.getAuthorRating());
|
||||
verify(authorRepository, times(2)).findById(testId); // Called twice: once initially, once after flush
|
||||
verify(authorRepository).save(testAuthor);
|
||||
verify(authorRepository).flush();
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should throw exception for invalid rating range")
|
||||
void shouldThrowExceptionForInvalidRating() {
|
||||
assertThrows(IllegalArgumentException.class, () -> authorService.setRating(testId, 0));
|
||||
assertThrows(IllegalArgumentException.class, () -> authorService.setRating(testId, 6));
|
||||
|
||||
verify(authorRepository, never()).findById(any());
|
||||
verify(authorRepository, never()).save(any());
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should handle null rating")
|
||||
void shouldHandleNullRating() {
|
||||
when(authorRepository.findById(testId)).thenReturn(Optional.of(testAuthor));
|
||||
when(authorRepository.save(any(Author.class))).thenReturn(testAuthor);
|
||||
|
||||
Author result = authorService.setRating(testId, null);
|
||||
|
||||
assertNull(testAuthor.getAuthorRating());
|
||||
verify(authorRepository, times(2)).findById(testId); // Called twice: once initially, once after flush
|
||||
verify(authorRepository).save(testAuthor);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should find all authors with stories")
|
||||
void shouldFindAllAuthorsWithStories() {
|
||||
List<Author> authors = List.of(testAuthor);
|
||||
when(authorRepository.findAll()).thenReturn(authors);
|
||||
|
||||
List<Author> result = authorService.findAllWithStories();
|
||||
|
||||
assertEquals(1, result.size());
|
||||
verify(authorRepository).findAll();
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should get author rating from database")
|
||||
void shouldGetAuthorRatingFromDb() {
|
||||
when(authorRepository.findAuthorRatingById(testId)).thenReturn(4);
|
||||
|
||||
Integer rating = authorService.getAuthorRatingFromDb(testId);
|
||||
|
||||
assertEquals(4, rating);
|
||||
verify(authorRepository).findAuthorRatingById(testId);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should calculate average story rating")
|
||||
void shouldCalculateAverageStoryRating() {
|
||||
// Setup test author with stories
|
||||
Story story1 = new Story("Story 1");
|
||||
story1.setRating(4);
|
||||
Story story2 = new Story("Story 2");
|
||||
story2.setRating(5);
|
||||
|
||||
testAuthor.getStories().add(story1);
|
||||
testAuthor.getStories().add(story2);
|
||||
|
||||
when(authorRepository.findById(testId)).thenReturn(Optional.of(testAuthor));
|
||||
|
||||
Double avgRating = authorService.calculateAverageStoryRating(testId);
|
||||
|
||||
assertEquals(4.5, avgRating);
|
||||
verify(authorRepository).findById(testId);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should find authors with stories using repository method")
|
||||
void shouldFindAuthorsWithStoriesFromRepository() {
|
||||
List<Author> authors = List.of(testAuthor);
|
||||
when(authorRepository.findAuthorsWithStories()).thenReturn(authors);
|
||||
|
||||
List<Author> result = authorService.findAuthorsWithStories();
|
||||
|
||||
assertEquals(1, result.size());
|
||||
verify(authorRepository).findAuthorsWithStories();
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should find top rated authors")
|
||||
void shouldFindTopRatedAuthors() {
|
||||
List<Author> authors = List.of(testAuthor);
|
||||
when(authorRepository.findTopRatedAuthors()).thenReturn(authors);
|
||||
|
||||
List<Author> result = authorService.findTopRatedAuthors();
|
||||
|
||||
assertEquals(1, result.size());
|
||||
verify(authorRepository).findTopRatedAuthors();
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should find most prolific authors")
|
||||
void shouldFindMostProlificAuthors() {
|
||||
List<Author> authors = List.of(testAuthor);
|
||||
when(authorRepository.findMostProlificAuthors()).thenReturn(authors);
|
||||
|
||||
List<Author> result = authorService.findMostProlificAuthors();
|
||||
|
||||
assertEquals(1, result.size());
|
||||
verify(authorRepository).findMostProlificAuthors();
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should find authors by URL domain")
|
||||
void shouldFindAuthorsByUrlDomain() {
|
||||
List<Author> authors = List.of(testAuthor);
|
||||
when(authorRepository.findByUrlDomain("example.com")).thenReturn(authors);
|
||||
|
||||
List<Author> result = authorService.findByUrlDomain("example.com");
|
||||
|
||||
assertEquals(1, result.size());
|
||||
verify(authorRepository).findByUrlDomain("example.com");
|
||||
}
|
||||
|
||||
}
|
||||
@@ -0,0 +1,221 @@
|
||||
package com.storycove.service;
|
||||
|
||||
import com.storycove.entity.Story;
|
||||
import com.storycove.repository.ReadingPositionRepository;
|
||||
import com.storycove.repository.StoryRepository;
|
||||
import com.storycove.repository.TagRepository;
|
||||
import com.storycove.service.exception.ResourceNotFoundException;
|
||||
import org.junit.jupiter.api.BeforeEach;
|
||||
import org.junit.jupiter.api.DisplayName;
|
||||
import org.junit.jupiter.api.Test;
|
||||
import org.junit.jupiter.api.extension.ExtendWith;
|
||||
import org.mockito.Mock;
|
||||
import org.mockito.junit.jupiter.MockitoExtension;
|
||||
|
||||
import java.time.LocalDateTime;
|
||||
import java.util.Optional;
|
||||
import java.util.UUID;
|
||||
|
||||
import static org.junit.jupiter.api.Assertions.*;
|
||||
import static org.mockito.ArgumentMatchers.any;
|
||||
import static org.mockito.Mockito.*;
|
||||
|
||||
@ExtendWith(MockitoExtension.class)
|
||||
@DisplayName("Story Service Unit Tests - Reading Progress")
|
||||
class StoryServiceTest {
|
||||
|
||||
@Mock
|
||||
private StoryRepository storyRepository;
|
||||
|
||||
@Mock
|
||||
private TagRepository tagRepository;
|
||||
|
||||
@Mock
|
||||
private ReadingPositionRepository readingPositionRepository;
|
||||
|
||||
private StoryService storyService;
|
||||
private Story testStory;
|
||||
private UUID testId;
|
||||
|
||||
@BeforeEach
|
||||
void setUp() {
|
||||
testId = UUID.randomUUID();
|
||||
testStory = new Story("Test Story");
|
||||
testStory.setId(testId);
|
||||
testStory.setContentHtml("<p>Test content for reading progress tracking</p>");
|
||||
|
||||
// Create StoryService with only required repositories, all services can be null for these tests
|
||||
storyService = new StoryService(
|
||||
storyRepository,
|
||||
tagRepository,
|
||||
readingPositionRepository, // added for foreign key constraint handling
|
||||
null, // authorService - not needed for reading progress tests
|
||||
null, // tagService - not needed for reading progress tests
|
||||
null, // seriesService - not needed for reading progress tests
|
||||
null, // sanitizationService - not needed for reading progress tests
|
||||
null // typesenseService - will test both with and without
|
||||
);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should update reading progress successfully")
|
||||
void shouldUpdateReadingProgress() {
|
||||
Integer position = 150;
|
||||
when(storyRepository.findById(testId)).thenReturn(Optional.of(testStory));
|
||||
when(storyRepository.save(any(Story.class))).thenReturn(testStory);
|
||||
|
||||
Story result = storyService.updateReadingProgress(testId, position);
|
||||
|
||||
assertEquals(position, result.getReadingPosition());
|
||||
assertNotNull(result.getLastReadAt());
|
||||
verify(storyRepository).findById(testId);
|
||||
verify(storyRepository).save(testStory);
|
||||
}
|
||||
|
||||
@Test
|
||||
@DisplayName("Should update reading progress with zero position")
|
||||
    void shouldUpdateReadingProgressWithZeroPosition() {
        Integer position = 0;
        when(storyRepository.findById(testId)).thenReturn(Optional.of(testStory));
        when(storyRepository.save(any(Story.class))).thenReturn(testStory);

        Story result = storyService.updateReadingProgress(testId, position);

        assertEquals(0, result.getReadingPosition());
        assertNotNull(result.getLastReadAt());
        verify(storyRepository).save(testStory);
    }

    @Test
    @DisplayName("Should throw exception for negative reading position")
    void shouldThrowExceptionForNegativeReadingPosition() {
        Integer position = -1;

        assertThrows(IllegalArgumentException.class,
                () -> storyService.updateReadingProgress(testId, position));

        verify(storyRepository, never()).findById(any());
        verify(storyRepository, never()).save(any());
    }

    @Test
    @DisplayName("Should handle null reading position")
    void shouldHandleNullReadingPosition() {
        Integer position = null;
        when(storyRepository.findById(testId)).thenReturn(Optional.of(testStory));
        when(storyRepository.save(any(Story.class))).thenReturn(testStory);

        Story result = storyService.updateReadingProgress(testId, position);

        assertNull(result.getReadingPosition());
        assertNotNull(result.getLastReadAt());
        verify(storyRepository).save(testStory);
    }

    @Test
    @DisplayName("Should throw exception when story not found for reading progress update")
    void shouldThrowExceptionWhenStoryNotFoundForReadingProgress() {
        Integer position = 100;
        when(storyRepository.findById(testId)).thenReturn(Optional.empty());

        assertThrows(ResourceNotFoundException.class,
                () -> storyService.updateReadingProgress(testId, position));

        verify(storyRepository).findById(testId);
        verify(storyRepository, never()).save(any());
    }

    @Test
    @DisplayName("Should mark story as read")
    void shouldMarkStoryAsRead() {
        Boolean isRead = true;
        when(storyRepository.findById(testId)).thenReturn(Optional.of(testStory));
        when(storyRepository.save(any(Story.class))).thenReturn(testStory);

        Story result = storyService.updateReadingStatus(testId, isRead);

        assertTrue(result.getIsRead());
        assertNotNull(result.getLastReadAt());
        // When marked as read, position should be set to content length
        assertTrue(result.getReadingPosition() > 0);
        verify(storyRepository).findById(testId);
        verify(storyRepository).save(testStory);
    }

    @Test
    @DisplayName("Should mark story as unread")
    void shouldMarkStoryAsUnread() {
        Boolean isRead = false;
        // First mark story as read to test transition
        testStory.markAsRead();

        when(storyRepository.findById(testId)).thenReturn(Optional.of(testStory));
        when(storyRepository.save(any(Story.class))).thenReturn(testStory);

        Story result = storyService.updateReadingStatus(testId, isRead);

        assertFalse(result.getIsRead());
        assertNotNull(result.getLastReadAt());
        verify(storyRepository).save(testStory);
    }

    @Test
    @DisplayName("Should handle null reading status")
    void shouldHandleNullReadingStatus() {
        Boolean isRead = null;
        when(storyRepository.findById(testId)).thenReturn(Optional.of(testStory));
        when(storyRepository.save(any(Story.class))).thenReturn(testStory);

        Story result = storyService.updateReadingStatus(testId, isRead);

        assertFalse(result.getIsRead());
        assertNotNull(result.getLastReadAt());
        verify(storyRepository).save(testStory);
    }

    @Test
    @DisplayName("Should throw exception when story not found for reading status update")
    void shouldThrowExceptionWhenStoryNotFoundForReadingStatus() {
        Boolean isRead = true;
        when(storyRepository.findById(testId)).thenReturn(Optional.empty());

        assertThrows(ResourceNotFoundException.class,
                () -> storyService.updateReadingStatus(testId, isRead));

        verify(storyRepository).findById(testId);
        verify(storyRepository, never()).save(any());
    }


    @Test
    @DisplayName("Should update lastReadAt timestamp when updating progress")
    void shouldUpdateLastReadAtWhenUpdatingProgress() {
        Integer position = 50;
        LocalDateTime beforeUpdate = LocalDateTime.now().minusMinutes(1);

        when(storyRepository.findById(testId)).thenReturn(Optional.of(testStory));
        when(storyRepository.save(any(Story.class))).thenReturn(testStory);

        Story result = storyService.updateReadingProgress(testId, position);

        assertNotNull(result.getLastReadAt());
        assertTrue(result.getLastReadAt().isAfter(beforeUpdate));
        verify(storyRepository).save(testStory);
    }

    @Test
    @DisplayName("Should update lastReadAt timestamp when updating status")
    void shouldUpdateLastReadAtWhenUpdatingStatus() {
        Boolean isRead = true;
        LocalDateTime beforeUpdate = LocalDateTime.now().minusMinutes(1);

        when(storyRepository.findById(testId)).thenReturn(Optional.of(testStory));
        when(storyRepository.save(any(Story.class))).thenReturn(testStory);

        Story result = storyService.updateReadingStatus(testId, isRead);

        assertNotNull(result.getLastReadAt());
        assertTrue(result.getLastReadAt().isAfter(beforeUpdate));
        verify(storyRepository).save(testStory);
    }
}
7
backend/test-fixed-export.epub
Normal file
@@ -0,0 +1,7 @@
<html>
<head><title>502 Bad Gateway</title></head>
<body>
<center><h1>502 Bad Gateway</h1></center>
<hr><center>nginx/1.29.0</center>
</body>
</html>
@@ -1,13 +1,40 @@
# Use node 18 alpine for smaller image size
FROM node:18-alpine

WORKDIR /app

COPY package*.json ./
RUN npm ci --omit=dev
# Install dumb-init for proper signal handling
RUN apk add --no-cache dumb-init

# Copy package files
COPY package*.json ./

# Install all dependencies (including devDependencies needed for build)
# Set npm config for better CI performance
RUN npm ci --prefer-offline --no-audit

# Copy source code
COPY . .

# Set Node.js memory limit for build (helpful in constrained environments)
ENV NODE_OPTIONS="--max-old-space-size=1024"

# Build the application
RUN npm run build

# Remove devDependencies after build to reduce image size
RUN npm prune --omit=dev

# Create non-root user for security
RUN addgroup -g 1001 -S nodejs
RUN adduser -S nextjs -u 1001

# Change ownership of the app directory
RUN chown -R nextjs:nodejs /app
USER nextjs

EXPOSE 3000

# Use dumb-init to handle signals properly
ENTRYPOINT ["dumb-init", "--"]
CMD ["npm", "start"]
42
frontend/Dockerfile.alternative
Normal file
@@ -0,0 +1,42 @@
# Multi-stage build for better caching and smaller final image
FROM node:18-alpine AS dependencies

WORKDIR /app
COPY package*.json ./
RUN npm ci

FROM node:18-alpine AS builder

WORKDIR /app
COPY --from=dependencies /app/node_modules ./node_modules
COPY . .

# Increase memory limit for build
ENV NODE_OPTIONS="--max-old-space-size=2048"

RUN npm run build

FROM node:18-alpine AS runner

WORKDIR /app

# Install dumb-init
RUN apk add --no-cache dumb-init

# Create non-root user
RUN addgroup -g 1001 -S nodejs
RUN adduser -S nextjs -u 1001

# Copy necessary files
COPY --from=builder /app/public ./public
COPY --from=builder /app/.next/standalone ./
COPY --from=builder /app/.next/static ./.next/static

# Set correct permissions
RUN chown -R nextjs:nodejs /app
USER nextjs

EXPOSE 3000

ENTRYPOINT ["dumb-init", "--"]
CMD ["node", "server.js"]
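Note on the runner stage above: Next.js only emits a `.next/standalone` directory (and the `server.js` it launches) when standalone output is enabled. That flag does not appear in the `next.config.js` hunk shown below, so it may be configured elsewhere in the file or would still need to be added for this alternative Dockerfile to work. A minimal sketch of the assumed setting:

```js
/** @type {import('next').NextConfig} */
const nextConfig = {
  // Assumption: required for the COPY --from=builder /app/.next/standalone step;
  // without it, `npm run build` does not produce a standalone server.js.
  output: 'standalone',
};

module.exports = nextConfig;
```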
@@ -1,12 +1,19 @@
/** @type {import('next').NextConfig} */
const nextConfig = {
  async rewrites() {
    return [
      {
        source: '/api/:path*',
        destination: 'http://backend:8080/api/:path*',
      },
    ];
  // Removed Next.js rewrites since nginx handles all API routing
  webpack: (config, { isServer }) => {
    // Exclude cheerio and its dependencies from client-side bundling
    if (!isServer) {
      config.resolve.fallback = {
        ...config.resolve.fallback,
        fs: false,
        net: false,
        tls: false,
        'undici': false,
      };
      config.externals.push('cheerio', 'server-only');
    }
    return config;
  },
  images: {
    domains: ['localhost'],
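The webpack fallback/externals above exist because cheerio drags Node-only modules into the bundle, and the `server-only` package (added to the dependencies later in this diff) guards against accidental client-side imports. A hedged sketch of how a scraping helper might combine the two; the module path and function name are illustrative, not taken from the repository:

```ts
// frontend/src/lib/scrape.server.ts (hypothetical path)
import 'server-only';            // fails the build if this module is bundled for the client
import * as cheerio from 'cheerio';

// Extract a story title from scraped HTML; runs only on the server.
export function extractTitle(html: string): string | undefined {
  const $ = cheerio.load(html);
  return $('h1').first().text().trim() || undefined;
}
```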
225
frontend/package-lock.json
generated
@@ -8,14 +8,17 @@
|
||||
"name": "storycove-frontend",
|
||||
"version": "0.1.0",
|
||||
"dependencies": {
|
||||
"@heroicons/react": "^2.2.0",
|
||||
"autoprefixer": "^10.4.16",
|
||||
"axios": "^1.6.0",
|
||||
"cheerio": "^1.0.0-rc.12",
|
||||
"dompurify": "^3.0.5",
|
||||
"next": "14.0.0",
|
||||
"postcss": "^8.4.31",
|
||||
"react": "^18",
|
||||
"react-dom": "^18",
|
||||
"react-dropzone": "^14.2.3",
|
||||
"server-only": "^0.0.1",
|
||||
"tailwindcss": "^3.3.0"
|
||||
},
|
||||
"devDependencies": {
|
||||
@@ -137,6 +140,15 @@
|
||||
"node": "^12.22.0 || ^14.17.0 || >=16.0.0"
|
||||
}
|
||||
},
|
||||
"node_modules/@heroicons/react": {
|
||||
"version": "2.2.0",
|
||||
"resolved": "https://registry.npmjs.org/@heroicons/react/-/react-2.2.0.tgz",
|
||||
"integrity": "sha512-LMcepvRaS9LYHJGsF0zzmgKCUim/X3N/DQKc4jepAXJ7l8QxJ1PmxJzqplF2Z3FE4PqBAIGyJAQ/w4B5dsqbtQ==",
|
||||
"license": "MIT",
|
||||
"peerDependencies": {
|
||||
"react": ">= 16 || ^19.0.0-rc"
|
||||
}
|
||||
},
|
||||
"node_modules/@humanwhocodes/config-array": {
|
||||
"version": "0.13.0",
|
||||
"resolved": "https://registry.npmjs.org/@humanwhocodes/config-array/-/config-array-0.13.0.tgz",
|
||||
@@ -1398,6 +1410,12 @@
|
||||
"url": "https://github.com/sponsors/sindresorhus"
|
||||
}
|
||||
},
|
||||
"node_modules/boolbase": {
|
||||
"version": "1.0.0",
|
||||
"resolved": "https://registry.npmjs.org/boolbase/-/boolbase-1.0.0.tgz",
|
||||
"integrity": "sha512-JZOSA7Mo9sNGB8+UjSgzdLtokWAky1zbztM3WRLCbZ70/3cTANmQmOdR7y2g+J0e2WXywy1yS468tY+IruqEww==",
|
||||
"license": "ISC"
|
||||
},
|
||||
"node_modules/brace-expansion": {
|
||||
"version": "1.1.12",
|
||||
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.12.tgz",
|
||||
@@ -1569,6 +1587,44 @@
|
||||
"url": "https://github.com/chalk/chalk?sponsor=1"
|
||||
}
|
||||
},
|
||||
"node_modules/cheerio": {
|
||||
"version": "1.0.0-rc.12",
|
||||
"resolved": "https://registry.npmjs.org/cheerio/-/cheerio-1.0.0-rc.12.tgz",
|
||||
"integrity": "sha512-VqR8m68vM46BNnuZ5NtnGBKIE/DfN0cRIzg9n40EIq9NOv90ayxLBXA8fXC5gquFRGJSTRqBq25Jt2ECLR431Q==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"cheerio-select": "^2.1.0",
|
||||
"dom-serializer": "^2.0.0",
|
||||
"domhandler": "^5.0.3",
|
||||
"domutils": "^3.0.1",
|
||||
"htmlparser2": "^8.0.1",
|
||||
"parse5": "^7.0.0",
|
||||
"parse5-htmlparser2-tree-adapter": "^7.0.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 6"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/cheeriojs/cheerio?sponsor=1"
|
||||
}
|
||||
},
|
||||
"node_modules/cheerio-select": {
|
||||
"version": "2.1.0",
|
||||
"resolved": "https://registry.npmjs.org/cheerio-select/-/cheerio-select-2.1.0.tgz",
|
||||
"integrity": "sha512-9v9kG0LvzrlcungtnJtpGNxY+fzECQKhK4EGJX2vByejiMX84MFNQw4UxPJl3bFbTMw+Dfs37XaIkCwTZfLh4g==",
|
||||
"license": "BSD-2-Clause",
|
||||
"dependencies": {
|
||||
"boolbase": "^1.0.0",
|
||||
"css-select": "^5.1.0",
|
||||
"css-what": "^6.1.0",
|
||||
"domelementtype": "^2.3.0",
|
||||
"domhandler": "^5.0.3",
|
||||
"domutils": "^3.0.1"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/fb55"
|
||||
}
|
||||
},
|
||||
"node_modules/chokidar": {
|
||||
"version": "3.6.0",
|
||||
"resolved": "https://registry.npmjs.org/chokidar/-/chokidar-3.6.0.tgz",
|
||||
@@ -1671,6 +1727,34 @@
|
||||
"node": ">= 8"
|
||||
}
|
||||
},
|
||||
"node_modules/css-select": {
|
||||
"version": "5.2.2",
|
||||
"resolved": "https://registry.npmjs.org/css-select/-/css-select-5.2.2.tgz",
|
||||
"integrity": "sha512-TizTzUddG/xYLA3NXodFM0fSbNizXjOKhqiQQwvhlspadZokn1KDy0NZFS0wuEubIYAV5/c1/lAr0TaaFXEXzw==",
|
||||
"license": "BSD-2-Clause",
|
||||
"dependencies": {
|
||||
"boolbase": "^1.0.0",
|
||||
"css-what": "^6.1.0",
|
||||
"domhandler": "^5.0.2",
|
||||
"domutils": "^3.0.1",
|
||||
"nth-check": "^2.0.1"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/fb55"
|
||||
}
|
||||
},
|
||||
"node_modules/css-what": {
|
||||
"version": "6.2.2",
|
||||
"resolved": "https://registry.npmjs.org/css-what/-/css-what-6.2.2.tgz",
|
||||
"integrity": "sha512-u/O3vwbptzhMs3L1fQE82ZSLHQQfto5gyZzwteVIEyeaY5Fc7R4dapF/BvRoSYFeqfBk4m0V1Vafq5Pjv25wvA==",
|
||||
"license": "BSD-2-Clause",
|
||||
"engines": {
|
||||
"node": ">= 6"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/fb55"
|
||||
}
|
||||
},
|
||||
"node_modules/cssesc": {
|
||||
"version": "3.0.0",
|
||||
"resolved": "https://registry.npmjs.org/cssesc/-/cssesc-3.0.0.tgz",
|
||||
@@ -1859,6 +1943,47 @@
|
||||
"node": ">=6.0.0"
|
||||
}
|
||||
},
|
||||
"node_modules/dom-serializer": {
|
||||
"version": "2.0.0",
|
||||
"resolved": "https://registry.npmjs.org/dom-serializer/-/dom-serializer-2.0.0.tgz",
|
||||
"integrity": "sha512-wIkAryiqt/nV5EQKqQpo3SToSOV9J0DnbJqwK7Wv/Trc92zIAYZ4FlMu+JPFW1DfGFt81ZTCGgDEabffXeLyJg==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"domelementtype": "^2.3.0",
|
||||
"domhandler": "^5.0.2",
|
||||
"entities": "^4.2.0"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/cheeriojs/dom-serializer?sponsor=1"
|
||||
}
|
||||
},
|
||||
"node_modules/domelementtype": {
|
||||
"version": "2.3.0",
|
||||
"resolved": "https://registry.npmjs.org/domelementtype/-/domelementtype-2.3.0.tgz",
|
||||
"integrity": "sha512-OLETBj6w0OsagBwdXnPdN0cnMfF9opN69co+7ZrbfPGrdpPVNBUj02spi6B1N7wChLQiPn4CSH/zJvXw56gmHw==",
|
||||
"funding": [
|
||||
{
|
||||
"type": "github",
|
||||
"url": "https://github.com/sponsors/fb55"
|
||||
}
|
||||
],
|
||||
"license": "BSD-2-Clause"
|
||||
},
|
||||
"node_modules/domhandler": {
|
||||
"version": "5.0.3",
|
||||
"resolved": "https://registry.npmjs.org/domhandler/-/domhandler-5.0.3.tgz",
|
||||
"integrity": "sha512-cgwlv/1iFQiFnU96XXgROh8xTeetsnJiDsTc7TYCLFd9+/WNkIqPTxiM/8pSd8VIrhXGTf1Ny1q1hquVqDJB5w==",
|
||||
"license": "BSD-2-Clause",
|
||||
"dependencies": {
|
||||
"domelementtype": "^2.3.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 4"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/fb55/domhandler?sponsor=1"
|
||||
}
|
||||
},
|
||||
"node_modules/dompurify": {
|
||||
"version": "3.2.6",
|
||||
"resolved": "https://registry.npmjs.org/dompurify/-/dompurify-3.2.6.tgz",
|
||||
@@ -1868,6 +1993,20 @@
|
||||
"@types/trusted-types": "^2.0.7"
|
||||
}
|
||||
},
|
||||
"node_modules/domutils": {
|
||||
"version": "3.2.2",
|
||||
"resolved": "https://registry.npmjs.org/domutils/-/domutils-3.2.2.tgz",
|
||||
"integrity": "sha512-6kZKyUajlDuqlHKVX1w7gyslj9MPIXzIFiz/rGu35uC1wMi+kMhQwGhl4lt9unC9Vb9INnY9Z3/ZA3+FhASLaw==",
|
||||
"license": "BSD-2-Clause",
|
||||
"dependencies": {
|
||||
"dom-serializer": "^2.0.0",
|
||||
"domelementtype": "^2.3.0",
|
||||
"domhandler": "^5.0.3"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/fb55/domutils?sponsor=1"
|
||||
}
|
||||
},
|
||||
"node_modules/dunder-proto": {
|
||||
"version": "1.0.1",
|
||||
"resolved": "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz",
|
||||
@@ -1900,6 +2039,18 @@
|
||||
"integrity": "sha512-L18DaJsXSUk2+42pv8mLs5jJT2hqFkFE4j21wOmgbUqsZ2hL72NsUU785g9RXgo3s0ZNgVl42TiHp3ZtOv/Vyg==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/entities": {
|
||||
"version": "4.5.0",
|
||||
"resolved": "https://registry.npmjs.org/entities/-/entities-4.5.0.tgz",
|
||||
"integrity": "sha512-V0hjH4dGPh9Ao5p0MoRY6BVqtwCjhz6vI5LT8AJ55H+4g9/4vbHx1I54fS0XuclLhDHArPQCiMjDxjaL8fPxhw==",
|
||||
"license": "BSD-2-Clause",
|
||||
"engines": {
|
||||
"node": ">=0.12"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/fb55/entities?sponsor=1"
|
||||
}
|
||||
},
|
||||
"node_modules/es-abstract": {
|
||||
"version": "1.24.0",
|
||||
"resolved": "https://registry.npmjs.org/es-abstract/-/es-abstract-1.24.0.tgz",
|
||||
@@ -3096,6 +3247,25 @@
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/htmlparser2": {
|
||||
"version": "8.0.2",
|
||||
"resolved": "https://registry.npmjs.org/htmlparser2/-/htmlparser2-8.0.2.tgz",
|
||||
"integrity": "sha512-GYdjWKDkbRLkZ5geuHs5NY1puJ+PXwP7+fHPRz06Eirsb9ugf6d8kkXav6ADhcODhFFPMIXyxkxSuMf3D6NCFA==",
|
||||
"funding": [
|
||||
"https://github.com/fb55/htmlparser2?sponsor=1",
|
||||
{
|
||||
"type": "github",
|
||||
"url": "https://github.com/sponsors/fb55"
|
||||
}
|
||||
],
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"domelementtype": "^2.3.0",
|
||||
"domhandler": "^5.0.3",
|
||||
"domutils": "^3.0.1",
|
||||
"entities": "^4.4.0"
|
||||
}
|
||||
},
|
||||
"node_modules/ignore": {
|
||||
"version": "5.3.2",
|
||||
"resolved": "https://registry.npmjs.org/ignore/-/ignore-5.3.2.tgz",
|
||||
@@ -4063,6 +4233,18 @@
|
||||
"node": ">=0.10.0"
|
||||
}
|
||||
},
|
||||
"node_modules/nth-check": {
|
||||
"version": "2.1.1",
|
||||
"resolved": "https://registry.npmjs.org/nth-check/-/nth-check-2.1.1.tgz",
|
||||
"integrity": "sha512-lqjrjmaOoAnWfMmBPL+XNnynZh2+swxiX3WUE0s4yEHI6m+AwrK2UZOimIRl3X/4QctVqS8AiZjFqyOGrMXb/w==",
|
||||
"license": "BSD-2-Clause",
|
||||
"dependencies": {
|
||||
"boolbase": "^1.0.0"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/fb55/nth-check?sponsor=1"
|
||||
}
|
||||
},
|
||||
"node_modules/object-assign": {
|
||||
"version": "4.1.1",
|
||||
"resolved": "https://registry.npmjs.org/object-assign/-/object-assign-4.1.1.tgz",
|
||||
@@ -4291,6 +4473,43 @@
|
||||
"node": ">=6"
|
||||
}
|
||||
},
|
||||
"node_modules/parse5": {
|
||||
"version": "7.3.0",
|
||||
"resolved": "https://registry.npmjs.org/parse5/-/parse5-7.3.0.tgz",
|
||||
"integrity": "sha512-IInvU7fabl34qmi9gY8XOVxhYyMyuH2xUNpb2q8/Y+7552KlejkRvqvD19nMoUW/uQGGbqNpA6Tufu5FL5BZgw==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"entities": "^6.0.0"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/inikulin/parse5?sponsor=1"
|
||||
}
|
||||
},
|
||||
"node_modules/parse5-htmlparser2-tree-adapter": {
|
||||
"version": "7.1.0",
|
||||
"resolved": "https://registry.npmjs.org/parse5-htmlparser2-tree-adapter/-/parse5-htmlparser2-tree-adapter-7.1.0.tgz",
|
||||
"integrity": "sha512-ruw5xyKs6lrpo9x9rCZqZZnIUntICjQAd0Wsmp396Ul9lN/h+ifgVV1x1gZHi8euej6wTfpqX8j+BFQxF0NS/g==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"domhandler": "^5.0.3",
|
||||
"parse5": "^7.0.0"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/inikulin/parse5?sponsor=1"
|
||||
}
|
||||
},
|
||||
"node_modules/parse5/node_modules/entities": {
|
||||
"version": "6.0.1",
|
||||
"resolved": "https://registry.npmjs.org/entities/-/entities-6.0.1.tgz",
|
||||
"integrity": "sha512-aN97NXWF6AWBTahfVOIrB/NShkzi5H7F9r1s9mD3cDj4Ko5f2qhhVoYMibXF7GlLveb/D2ioWay8lxI97Ven3g==",
|
||||
"license": "BSD-2-Clause",
|
||||
"engines": {
|
||||
"node": ">=0.12"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/fb55/entities?sponsor=1"
|
||||
}
|
||||
},
|
||||
"node_modules/path-exists": {
|
||||
"version": "4.0.0",
|
||||
"resolved": "https://registry.npmjs.org/path-exists/-/path-exists-4.0.0.tgz",
|
||||
@@ -4843,6 +5062,12 @@
|
||||
"node": ">=10"
|
||||
}
|
||||
},
|
||||
"node_modules/server-only": {
|
||||
"version": "0.0.1",
|
||||
"resolved": "https://registry.npmjs.org/server-only/-/server-only-0.0.1.tgz",
|
||||
"integrity": "sha512-qepMx2JxAa5jjfzxG79yPPq+8BuFToHd1hm7kI+Z4zAq1ftQiP7HcxMhDDItrbtwVeLg/cY2JnKnrcFkmiswNA==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/set-function-length": {
|
||||
"version": "1.2.2",
|
||||
"resolved": "https://registry.npmjs.org/set-function-length/-/set-function-length-1.2.2.tgz",
|
||||
|
||||
@@ -10,23 +10,26 @@
    "type-check": "tsc --noEmit"
  },
  "dependencies": {
    "@heroicons/react": "^2.2.0",
    "autoprefixer": "^10.4.16",
    "axios": "^1.6.0",
    "cheerio": "^1.0.0-rc.12",
    "dompurify": "^3.0.5",
    "next": "14.0.0",
    "postcss": "^8.4.31",
    "react": "^18",
    "react-dom": "^18",
    "axios": "^1.6.0",
    "dompurify": "^3.0.5",
    "react-dropzone": "^14.2.3",
    "tailwindcss": "^3.3.0",
    "autoprefixer": "^10.4.16",
    "postcss": "^8.4.31"
    "server-only": "^0.0.1",
    "tailwindcss": "^3.3.0"
  },
  "devDependencies": {
    "typescript": "^5",
    "@types/dompurify": "^3.0.5",
    "@types/node": "^20",
    "@types/react": "^18",
    "@types/react-dom": "^18",
    "@types/dompurify": "^3.0.5",
    "eslint": "^8",
    "eslint-config-next": "14.0.0"
    "eslint-config-next": "14.0.0",
    "typescript": "^5"
  }
}
@@ -1,268 +1,39 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useRef } from 'react';
|
||||
import { useRouter } from 'next/navigation';
|
||||
import AppLayout from '../../components/layout/AppLayout';
|
||||
import { Input, Textarea } from '../../components/ui/Input';
|
||||
import Button from '../../components/ui/Button';
|
||||
import TagInput from '../../components/stories/TagInput';
|
||||
import RichTextEditor from '../../components/stories/RichTextEditor';
|
||||
import ImageUpload from '../../components/ui/ImageUpload';
|
||||
import { storyApi } from '../../lib/api';
|
||||
|
||||
export default function AddStoryPage() {
|
||||
const [formData, setFormData] = useState({
|
||||
title: '',
|
||||
summary: '',
|
||||
authorName: '',
|
||||
contentHtml: '',
|
||||
sourceUrl: '',
|
||||
tags: [] as string[],
|
||||
seriesName: '',
|
||||
volume: '',
|
||||
});
|
||||
|
||||
const [coverImage, setCoverImage] = useState<File | null>(null);
|
||||
const [loading, setLoading] = useState(false);
|
||||
const [errors, setErrors] = useState<Record<string, string>>({});
|
||||
import { useEffect } from 'react';
|
||||
import { useRouter, useSearchParams } from 'next/navigation';
|
||||
|
||||
export default function AddStoryRedirectPage() {
|
||||
const router = useRouter();
|
||||
const searchParams = useSearchParams();
|
||||
|
||||
const handleInputChange = (field: string) => (
|
||||
e: React.ChangeEvent<HTMLInputElement | HTMLTextAreaElement>
|
||||
) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
[field]: e.target.value
|
||||
}));
|
||||
useEffect(() => {
|
||||
// Redirect to the new /import route while preserving query parameters
|
||||
const mode = searchParams.get('mode');
|
||||
const authorId = searchParams.get('authorId');
|
||||
const from = searchParams.get('from');
|
||||
|
||||
// Clear error when user starts typing
|
||||
if (errors[field]) {
|
||||
setErrors(prev => ({ ...prev, [field]: '' }));
|
||||
}
|
||||
};
|
||||
let redirectUrl = '/import';
|
||||
const queryParams = new URLSearchParams();
|
||||
|
||||
const handleContentChange = (html: string) => {
|
||||
setFormData(prev => ({ ...prev, contentHtml: html }));
|
||||
if (errors.contentHtml) {
|
||||
setErrors(prev => ({ ...prev, contentHtml: '' }));
|
||||
}
|
||||
};
|
||||
if (mode) queryParams.set('mode', mode);
|
||||
if (authorId) queryParams.set('authorId', authorId);
|
||||
if (from) queryParams.set('from', from);
|
||||
|
||||
const handleTagsChange = (tags: string[]) => {
|
||||
setFormData(prev => ({ ...prev, tags }));
|
||||
};
|
||||
|
||||
const validateForm = () => {
|
||||
const newErrors: Record<string, string> = {};
|
||||
|
||||
if (!formData.title.trim()) {
|
||||
newErrors.title = 'Title is required';
|
||||
const queryString = queryParams.toString();
|
||||
if (queryString) {
|
||||
redirectUrl += '?' + queryString;
|
||||
}
|
||||
|
||||
if (!formData.authorName.trim()) {
|
||||
newErrors.authorName = 'Author name is required';
|
||||
}
|
||||
|
||||
if (!formData.contentHtml.trim()) {
|
||||
newErrors.contentHtml = 'Story content is required';
|
||||
}
|
||||
|
||||
if (formData.seriesName && !formData.volume) {
|
||||
newErrors.volume = 'Volume number is required when series is specified';
|
||||
}
|
||||
|
||||
if (formData.volume && !formData.seriesName.trim()) {
|
||||
newErrors.seriesName = 'Series name is required when volume is specified';
|
||||
}
|
||||
|
||||
setErrors(newErrors);
|
||||
return Object.keys(newErrors).length === 0;
|
||||
};
|
||||
|
||||
const handleSubmit = async (e: React.FormEvent) => {
|
||||
e.preventDefault();
|
||||
|
||||
if (!validateForm()) {
|
||||
return;
|
||||
}
|
||||
|
||||
setLoading(true);
|
||||
|
||||
try {
|
||||
// First, create the story with JSON data
|
||||
const storyData = {
|
||||
title: formData.title,
|
||||
summary: formData.summary || undefined,
|
||||
contentHtml: formData.contentHtml,
|
||||
sourceUrl: formData.sourceUrl || undefined,
|
||||
volume: formData.seriesName ? parseInt(formData.volume) : undefined,
|
||||
seriesName: formData.seriesName || undefined,
|
||||
authorName: formData.authorName || undefined,
|
||||
tagNames: formData.tags.length > 0 ? formData.tags : undefined,
|
||||
};
|
||||
|
||||
const story = await storyApi.createStory(storyData);
|
||||
|
||||
// If there's a cover image, upload it separately
|
||||
if (coverImage) {
|
||||
await storyApi.uploadCover(story.id, coverImage);
|
||||
}
|
||||
|
||||
router.push(`/stories/${story.id}`);
|
||||
} catch (error: any) {
|
||||
console.error('Failed to create story:', error);
|
||||
const errorMessage = error.response?.data?.message || 'Failed to create story';
|
||||
setErrors({ submit: errorMessage });
|
||||
} finally {
|
||||
setLoading(false);
|
||||
}
|
||||
};
|
||||
router.replace(redirectUrl);
|
||||
}, [router, searchParams]);
|
||||
|
||||
return (
|
||||
<AppLayout>
|
||||
<div className="max-w-4xl mx-auto">
|
||||
<div className="mb-8">
|
||||
<h1 className="text-3xl font-bold theme-header">Add New Story</h1>
|
||||
<p className="theme-text mt-2">
|
||||
Add a story to your personal collection
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<form onSubmit={handleSubmit} className="space-y-6">
|
||||
{/* Title */}
|
||||
<Input
|
||||
label="Title *"
|
||||
value={formData.title}
|
||||
onChange={handleInputChange('title')}
|
||||
placeholder="Enter the story title"
|
||||
error={errors.title}
|
||||
required
|
||||
/>
|
||||
|
||||
{/* Author */}
|
||||
<Input
|
||||
label="Author *"
|
||||
value={formData.authorName}
|
||||
onChange={handleInputChange('authorName')}
|
||||
placeholder="Enter the author's name"
|
||||
error={errors.authorName}
|
||||
required
|
||||
/>
|
||||
|
||||
{/* Summary */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Summary
|
||||
</label>
|
||||
<Textarea
|
||||
value={formData.summary}
|
||||
onChange={handleInputChange('summary')}
|
||||
placeholder="Brief summary or description of the story..."
|
||||
rows={3}
|
||||
/>
|
||||
<p className="text-sm theme-text mt-1">
|
||||
Optional summary that will be displayed on the story detail page
|
||||
</p>
|
||||
</div>
|
||||
|
||||
{/* Cover Image Upload */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Cover Image
|
||||
</label>
|
||||
<ImageUpload
|
||||
onImageSelect={setCoverImage}
|
||||
accept="image/jpeg,image/png,image/webp"
|
||||
maxSizeMB={5}
|
||||
aspectRatio="3:4"
|
||||
placeholder="Drop a cover image here or click to select"
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Content */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Story Content *
|
||||
</label>
|
||||
<RichTextEditor
|
||||
value={formData.contentHtml}
|
||||
onChange={handleContentChange}
|
||||
placeholder="Write or paste your story content here..."
|
||||
error={errors.contentHtml}
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Tags */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Tags
|
||||
</label>
|
||||
<TagInput
|
||||
tags={formData.tags}
|
||||
onChange={handleTagsChange}
|
||||
placeholder="Add tags to categorize your story..."
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Series and Volume */}
|
||||
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
|
||||
<Input
|
||||
label="Series (optional)"
|
||||
value={formData.seriesName}
|
||||
onChange={handleInputChange('seriesName')}
|
||||
placeholder="Enter series name if part of a series"
|
||||
error={errors.seriesName}
|
||||
/>
|
||||
|
||||
<Input
|
||||
label="Volume/Part (optional)"
|
||||
type="number"
|
||||
min="1"
|
||||
value={formData.volume}
|
||||
onChange={handleInputChange('volume')}
|
||||
placeholder="Enter volume/part number"
|
||||
error={errors.volume}
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Source URL */}
|
||||
<Input
|
||||
label="Source URL (optional)"
|
||||
type="url"
|
||||
value={formData.sourceUrl}
|
||||
onChange={handleInputChange('sourceUrl')}
|
||||
placeholder="https://example.com/original-story-url"
|
||||
/>
|
||||
|
||||
{/* Submit Error */}
|
||||
{errors.submit && (
|
||||
<div className="p-4 bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-lg">
|
||||
<p className="text-red-800 dark:text-red-200">{errors.submit}</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Actions */}
|
||||
<div className="flex justify-end gap-4 pt-6">
|
||||
<Button
|
||||
type="button"
|
||||
variant="ghost"
|
||||
onClick={() => router.back()}
|
||||
disabled={loading}
|
||||
>
|
||||
Cancel
|
||||
</Button>
|
||||
|
||||
<Button
|
||||
type="submit"
|
||||
loading={loading}
|
||||
disabled={!formData.title || !formData.authorName || !formData.contentHtml}
|
||||
>
|
||||
Add Story
|
||||
</Button>
|
||||
</div>
|
||||
</form>
|
||||
<div className="min-h-screen flex items-center justify-center">
|
||||
<div className="text-center">
|
||||
<div className="animate-spin rounded-full h-8 w-8 border-b-2 border-blue-600 mx-auto mb-4"></div>
|
||||
<p className="text-gray-600">Redirecting...</p>
|
||||
</div>
|
||||
</AppLayout>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
@@ -269,7 +269,7 @@ export default function EditAuthorPage() {
          </label>
          <ImageUpload
            onImageSelect={setAvatarImage}
            accept="image/jpeg,image/png,image/webp"
            accept="image/jpeg,image/png"
            maxSizeMB={5}
            aspectRatio="1:1"
            placeholder="Drop an avatar image here or click to select"
@@ -207,15 +207,20 @@ export default function AuthorDetailPage() {
        <div className="lg:col-span-2 space-y-6">
          <div className="flex items-center justify-between">
            <h2 className="text-2xl font-semibold theme-header">Stories</h2>
            <p className="theme-text">
              {stories.length} {stories.length === 1 ? 'story' : 'stories'}
            </p>
            <div className="flex items-center gap-4">
              <p className="theme-text">
                {stories.length} {stories.length === 1 ? 'story' : 'stories'}
              </p>
              <Button href={`/import?authorId=${authorId}`}>
                Add Story
              </Button>
            </div>
          </div>

          {stories.length === 0 ? (
            <div className="text-center py-12 theme-card theme-shadow rounded-lg">
              <p className="theme-text text-lg mb-4">No stories by this author yet.</p>
              <Button href="/add-story">Add a Story</Button>
              <Button href="/import">Add a Story</Button>
            </div>
          ) : (
            <div className="space-y-4">
@@ -26,19 +26,27 @@ export default function CollectionsPage() {
  const [totalCollections, setTotalCollections] = useState(0);
  const [refreshTrigger, setRefreshTrigger] = useState(0);

  // Load tags for filtering
  useEffect(() => {
    const loadTags = async () => {
      try {
        const tagsResult = await tagApi.getTags({ page: 0, size: 1000 });
        setTags(tagsResult?.content || []);
      } catch (error) {
        console.error('Failed to load tags:', error);
      }
    };

    loadTags();
  }, []);
  // Extract tags from current collection results with counts
  const extractTagsFromResults = (collections: Collection[]): Tag[] => {
    const tagCounts: { [key: string]: number } = {};

    collections.forEach(collection => {
      collection.tagNames?.forEach(tagName => {
        if (tagCounts[tagName]) {
          tagCounts[tagName]++;
        } else {
          tagCounts[tagName] = 1;
        }
      });
    });

    return Object.entries(tagCounts).map(([tagName, count]) => ({
      id: tagName, // Use tag name as ID since we don't have actual IDs from search results
      name: tagName,
      collectionCount: count
    }));
  };

  // Load collections with search and filters
  useEffect(() => {
@@ -55,9 +63,14 @@ export default function CollectionsPage() {
        archived: showArchived,
      });

      setCollections(result?.results || []);
      const currentCollections = result?.results || [];
      setCollections(currentCollections);
      setTotalPages(Math.ceil((result?.totalHits || 0) / pageSize));
      setTotalCollections(result?.totalHits || 0);

      // Always update tags based on current search results (including initial wildcard search)
      const resultTags = extractTagsFromResults(currentCollections);
      setTags(resultTags);
    } catch (error) {
      console.error('Failed to load collections:', error);
      setCollections([]);
@@ -223,6 +236,7 @@ export default function CollectionsPage() {
            tags={tags}
            selectedTags={selectedTags}
            onTagToggle={handleTagToggle}
            showCollectionCount={true}
          />
        </div>
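For clarity, a small illustration of what `extractTagsFromResults` produces from the current search results; the sample data is invented for the example and trimmed to the one field the helper reads:

```ts
// Two collections sharing the "fantasy" tag.
const sample = [
  { tagNames: ['fantasy', 'adventure'] },
  { tagNames: ['fantasy'] },
] as unknown as Collection[];

// extractTagsFromResults(sample) returns (order not guaranteed):
// [
//   { id: 'fantasy',   name: 'fantasy',   collectionCount: 2 },
//   { id: 'adventure', name: 'adventure', collectionCount: 1 },
// ]
```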
380
frontend/src/app/import/bulk/page.tsx
Normal file
@@ -0,0 +1,380 @@
|
||||
'use client';
|
||||
|
||||
import { useState } from 'react';
|
||||
import { useRouter } from 'next/navigation';
|
||||
import Link from 'next/link';
|
||||
import BulkImportProgress from '@/components/BulkImportProgress';
|
||||
import ImportLayout from '@/components/layout/ImportLayout';
|
||||
import Button from '@/components/ui/Button';
|
||||
import { Textarea } from '@/components/ui/Input';
|
||||
|
||||
interface ImportResult {
|
||||
url: string;
|
||||
status: 'imported' | 'skipped' | 'error';
|
||||
reason?: string;
|
||||
title?: string;
|
||||
author?: string;
|
||||
error?: string;
|
||||
storyId?: string;
|
||||
}
|
||||
|
||||
interface BulkImportResponse {
|
||||
results: ImportResult[];
|
||||
summary: {
|
||||
total: number;
|
||||
imported: number;
|
||||
skipped: number;
|
||||
errors: number;
|
||||
};
|
||||
combinedStory?: {
|
||||
title: string;
|
||||
author: string;
|
||||
content: string;
|
||||
summary?: string;
|
||||
sourceUrl: string;
|
||||
tags?: string[];
|
||||
};
|
||||
}
|
||||
|
||||
export default function BulkImportPage() {
|
||||
const router = useRouter();
|
||||
const [urls, setUrls] = useState('');
|
||||
const [combineIntoOne, setCombineIntoOne] = useState(false);
|
||||
const [isLoading, setIsLoading] = useState(false);
|
||||
const [results, setResults] = useState<BulkImportResponse | null>(null);
|
||||
const [error, setError] = useState<string | null>(null);
|
||||
const [sessionId, setSessionId] = useState<string | null>(null);
|
||||
const [showProgress, setShowProgress] = useState(false);
|
||||
|
||||
const handleSubmit = async (e: React.FormEvent) => {
|
||||
e.preventDefault();
|
||||
|
||||
if (!urls.trim()) {
|
||||
setError('Please enter at least one URL');
|
||||
return;
|
||||
}
|
||||
|
||||
setIsLoading(true);
|
||||
setError(null);
|
||||
setResults(null);
|
||||
|
||||
try {
|
||||
// Parse URLs from textarea (one per line)
|
||||
const urlList = urls
|
||||
.split('\n')
|
||||
.map(url => url.trim())
|
||||
.filter(url => url.length > 0);
|
||||
|
||||
if (urlList.length === 0) {
|
||||
setError('Please enter at least one valid URL');
|
||||
setIsLoading(false);
|
||||
return;
|
||||
}
|
||||
|
||||
if (urlList.length > 200) {
|
||||
setError('Maximum 200 URLs allowed per bulk import');
|
||||
setIsLoading(false);
|
||||
return;
|
||||
}
|
||||
|
||||
// Generate session ID for progress tracking
|
||||
const newSessionId = `bulk-import-${Date.now()}-${Math.random().toString(36).substr(2, 9)}`;
|
||||
setSessionId(newSessionId);
|
||||
setShowProgress(true);
|
||||
|
||||
// Get auth token for server-side API calls
|
||||
const token = localStorage.getItem('auth-token');
|
||||
|
||||
const response = await fetch('/scrape/bulk', {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
'Authorization': token ? `Bearer ${token}` : '',
|
||||
},
|
||||
body: JSON.stringify({ urls: urlList, combineIntoOne, sessionId: newSessionId }),
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
const errorData = await response.json();
|
||||
throw new Error(errorData.error || 'Failed to start bulk import');
|
||||
}
|
||||
|
||||
const startData = await response.json();
|
||||
console.log('Bulk import started:', startData);
|
||||
|
||||
// The progress component will handle the rest via SSE
|
||||
|
||||
} catch (err) {
|
||||
console.error('Bulk import error:', err);
|
||||
setError(err instanceof Error ? err.message : 'Failed to import stories');
|
||||
} finally {
|
||||
setIsLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
const handleReset = () => {
|
||||
setUrls('');
|
||||
setCombineIntoOne(false);
|
||||
setResults(null);
|
||||
setError(null);
|
||||
setSessionId(null);
|
||||
setShowProgress(false);
|
||||
};
|
||||
|
||||
const handleProgressComplete = (data?: any) => {
|
||||
// Progress component will handle this when the operation completes
|
||||
setShowProgress(false);
|
||||
setIsLoading(false);
|
||||
|
||||
// Handle completion data
|
||||
if (data) {
|
||||
if (data.combinedStory && combineIntoOne) {
|
||||
// For combine mode, redirect to import page with the combined content
|
||||
localStorage.setItem('pendingStory', JSON.stringify(data.combinedStory));
|
||||
router.push('/import?from=bulk-combine');
|
||||
return;
|
||||
} else if (data.results && data.summary) {
|
||||
// For individual mode, show the results
|
||||
setResults({
|
||||
results: data.results,
|
||||
summary: data.summary
|
||||
});
|
||||
return;
|
||||
}
|
||||
}
|
||||
|
||||
// Fallback: just hide progress and let user know it completed
|
||||
console.log('Import completed successfully');
|
||||
};
|
||||
|
||||
const handleProgressError = (errorMessage: string) => {
|
||||
setError(errorMessage);
|
||||
setIsLoading(false);
|
||||
setShowProgress(false);
|
||||
};
|
||||
|
||||
const getStatusColor = (status: string) => {
|
||||
switch (status) {
|
||||
case 'imported': return 'text-green-700 bg-green-50 border-green-200';
|
||||
case 'skipped': return 'text-yellow-700 bg-yellow-50 border-yellow-200';
|
||||
case 'error': return 'text-red-700 bg-red-50 border-red-200';
|
||||
default: return 'text-gray-700 bg-gray-50 border-gray-200';
|
||||
}
|
||||
};
|
||||
|
||||
const getStatusIcon = (status: string) => {
|
||||
switch (status) {
|
||||
case 'imported': return '✓';
|
||||
case 'skipped': return '⚠';
|
||||
case 'error': return '✗';
|
||||
default: return '';
|
||||
}
|
||||
};
|
||||
|
||||
return (
|
||||
<ImportLayout
|
||||
title="Bulk Import Stories"
|
||||
description="Import multiple stories at once by providing a list of URLs"
|
||||
>
|
||||
|
||||
{!results ? (
|
||||
// Import Form
|
||||
<form onSubmit={handleSubmit} className="space-y-6">
|
||||
<div>
|
||||
<label htmlFor="urls" className="block text-sm font-medium theme-header mb-2">
|
||||
Story URLs
|
||||
</label>
|
||||
<p className="text-sm theme-text mb-3">
|
||||
Enter one URL per line. Maximum 200 URLs per import.
|
||||
</p>
|
||||
<Textarea
|
||||
id="urls"
|
||||
value={urls}
|
||||
onChange={(e) => setUrls(e.target.value)}
|
||||
placeholder="https://example.com/story1
|
||||
https://example.com/story2
|
||||
https://example.com/story3"
|
||||
rows={12}
|
||||
disabled={isLoading}
|
||||
/>
|
||||
<p className="mt-2 text-sm theme-text">
|
||||
URLs: {urls.split('\n').filter(url => url.trim().length > 0).length}
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<div className="flex items-center">
|
||||
<input
|
||||
id="combine-into-one"
|
||||
type="checkbox"
|
||||
checked={combineIntoOne}
|
||||
onChange={(e) => setCombineIntoOne(e.target.checked)}
|
||||
className="h-4 w-4 theme-accent focus:ring-theme-accent theme-border rounded"
|
||||
disabled={isLoading}
|
||||
/>
|
||||
<label htmlFor="combine-into-one" className="ml-2 block text-sm theme-text">
|
||||
Combine all URL content into a single story
|
||||
</label>
|
||||
</div>
|
||||
|
||||
{combineIntoOne && (
|
||||
<div className="bg-blue-50 dark:bg-blue-900/20 border border-blue-200 dark:border-blue-800 rounded-lg p-4">
|
||||
<div className="text-sm text-blue-800 dark:text-blue-200">
|
||||
<p className="font-medium mb-2">Combined Story Mode:</p>
|
||||
<ul className="list-disc list-inside space-y-1 text-blue-700 dark:text-blue-300">
|
||||
<li>All URLs will be scraped and their content combined into one story</li>
|
||||
<li>Story title and author will be taken from the first URL</li>
|
||||
<li>Import will fail if any URL has no content (title/author can be empty)</li>
|
||||
<li>You'll be redirected to the story creation page to review and edit</li>
|
||||
{urls.split('\n').filter(url => url.trim().length > 0).length > 50 && (
|
||||
<li className="text-yellow-700 dark:text-yellow-300 font-medium">⚠️ Large imports (50+ URLs) may take several minutes and could be truncated if too large</li>
|
||||
)}
|
||||
</ul>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{error && (
|
||||
<div className="bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-lg p-4">
|
||||
<div className="flex">
|
||||
<div className="ml-3">
|
||||
<h3 className="text-sm font-medium text-red-800 dark:text-red-200">Error</h3>
|
||||
<div className="mt-2 text-sm text-red-700 dark:text-red-300">
|
||||
{error}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
<div className="flex gap-4">
|
||||
<Button
|
||||
type="submit"
|
||||
disabled={isLoading || !urls.trim()}
|
||||
loading={isLoading}
|
||||
>
|
||||
{isLoading ? 'Importing...' : 'Start Import'}
|
||||
</Button>
|
||||
|
||||
<Button
|
||||
type="button"
|
||||
variant="secondary"
|
||||
onClick={handleReset}
|
||||
disabled={isLoading}
|
||||
>
|
||||
Clear
|
||||
</Button>
|
||||
</div>
|
||||
|
||||
{/* Progress Component */}
|
||||
{showProgress && sessionId && (
|
||||
<BulkImportProgress
|
||||
sessionId={sessionId}
|
||||
onComplete={handleProgressComplete}
|
||||
onError={handleProgressError}
|
||||
combineMode={combineIntoOne}
|
||||
/>
|
||||
)}
|
||||
|
||||
{/* Fallback loading indicator if progress isn't shown yet */}
|
||||
{isLoading && !showProgress && (
|
||||
<div className="bg-blue-50 dark:bg-blue-900/20 border border-blue-200 dark:border-blue-800 rounded-lg p-4">
|
||||
<div className="flex items-center">
|
||||
<div className="animate-spin rounded-full h-5 w-5 border-b-2 border-theme-accent mr-3"></div>
|
||||
<div>
|
||||
<p className="text-sm font-medium text-blue-800 dark:text-blue-200">Starting import...</p>
|
||||
<p className="text-sm text-blue-600 dark:text-blue-300">
|
||||
Preparing to process {urls.split('\n').filter(url => url.trim().length > 0).length} URLs.
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</form>
|
||||
) : (
|
||||
// Results
|
||||
<div className="space-y-6">
|
||||
{/* Summary */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<h2 className="text-xl font-semibold theme-header mb-4">Import Summary</h2>
|
||||
<div className="grid grid-cols-2 md:grid-cols-4 gap-4">
|
||||
<div className="text-center">
|
||||
<div className="text-2xl font-bold theme-header">{results.summary.total}</div>
|
||||
<div className="text-sm theme-text">Total URLs</div>
|
||||
</div>
|
||||
<div className="text-center">
|
||||
<div className="text-2xl font-bold text-green-600 dark:text-green-400">{results.summary.imported}</div>
|
||||
<div className="text-sm theme-text">Imported</div>
|
||||
</div>
|
||||
<div className="text-center">
|
||||
<div className="text-2xl font-bold text-yellow-600 dark:text-yellow-400">{results.summary.skipped}</div>
|
||||
<div className="text-sm theme-text">Skipped</div>
|
||||
</div>
|
||||
<div className="text-center">
|
||||
<div className="text-2xl font-bold text-red-600 dark:text-red-400">{results.summary.errors}</div>
|
||||
<div className="text-sm theme-text">Errors</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Detailed Results */}
|
||||
<div className="theme-card theme-shadow rounded-lg">
|
||||
<div className="px-6 py-4 border-b theme-border">
|
||||
<h3 className="text-lg font-medium theme-header">Detailed Results</h3>
|
||||
</div>
|
||||
<div className="divide-y theme-border">
|
||||
{results.results.map((result, index) => (
|
||||
<div key={index} className="p-6">
|
||||
<div className="flex items-start justify-between">
|
||||
<div className="flex-1 min-w-0">
|
||||
<div className="flex items-center gap-2 mb-2">
|
||||
<span className={`inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium border ${getStatusColor(result.status)}`}>
|
||||
{getStatusIcon(result.status)} {result.status.charAt(0).toUpperCase() + result.status.slice(1)}
|
||||
</span>
|
||||
</div>
|
||||
|
||||
<p className="text-sm theme-header font-medium truncate mb-1">
|
||||
{result.url}
|
||||
</p>
|
||||
|
||||
{result.title && result.author && (
|
||||
<p className="text-sm theme-text mb-1">
|
||||
"{result.title}" by {result.author}
|
||||
</p>
|
||||
)}
|
||||
|
||||
{result.reason && (
|
||||
<p className="text-sm theme-text">
|
||||
{result.reason}
|
||||
</p>
|
||||
)}
|
||||
|
||||
{result.error && (
|
||||
<p className="text-sm text-red-600 dark:text-red-400">
|
||||
Error: {result.error}
|
||||
</p>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Actions */}
|
||||
<div className="flex gap-4">
|
||||
<Button onClick={handleReset}>
|
||||
Import More URLs
|
||||
</Button>
|
||||
|
||||
<Button
|
||||
variant="secondary"
|
||||
onClick={() => router.push('/library')}
|
||||
>
|
||||
View Stories
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</ImportLayout>
|
||||
);
|
||||
}
|
||||
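The bulk import page above drives the `/scrape/bulk` endpoint through its form and then listens for progress over SSE via `BulkImportProgress`. For scripting the same kickoff outside the UI, a minimal sketch using the request shape the page sends; the endpoint, body fields, and `auth-token` storage key come from the page itself, everything else is illustrative:

```ts
async function startBulkImport(urls: string[], combineIntoOne = false): Promise<void> {
  // Same session-id scheme the page generates for progress tracking.
  const sessionId = `bulk-import-${Date.now()}-${Math.random().toString(36).substr(2, 9)}`;
  const token = localStorage.getItem('auth-token');

  const response = await fetch('/scrape/bulk', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': token ? `Bearer ${token}` : '',
    },
    // Same payload the form submits; progress then arrives separately via SSE.
    body: JSON.stringify({ urls, combineIntoOne, sessionId }),
  });

  if (!response.ok) {
    const errorData = await response.json();
    throw new Error(errorData.error || 'Failed to start bulk import');
  }
}
```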
409
frontend/src/app/import/epub/page.tsx
Normal file
@@ -0,0 +1,409 @@
|
||||
'use client';
|
||||
|
||||
import { useState } from 'react';
|
||||
import { useRouter } from 'next/navigation';
|
||||
import { DocumentArrowUpIcon } from '@heroicons/react/24/outline';
|
||||
import Button from '@/components/ui/Button';
|
||||
import { Input } from '@/components/ui/Input';
|
||||
import ImportLayout from '@/components/layout/ImportLayout';
|
||||
|
||||
interface EPUBImportResponse {
|
||||
success: boolean;
|
||||
message: string;
|
||||
storyId?: string;
|
||||
storyTitle?: string;
|
||||
totalChapters?: number;
|
||||
wordCount?: number;
|
||||
readingPosition?: any;
|
||||
warnings?: string[];
|
||||
errors?: string[];
|
||||
}
|
||||
|
||||
export default function EPUBImportPage() {
|
||||
const router = useRouter();
|
||||
const [selectedFile, setSelectedFile] = useState<File | null>(null);
|
||||
const [isLoading, setIsLoading] = useState(false);
|
||||
const [isValidating, setIsValidating] = useState(false);
|
||||
const [validationResult, setValidationResult] = useState<any>(null);
|
||||
const [importResult, setImportResult] = useState<EPUBImportResponse | null>(null);
|
||||
const [error, setError] = useState<string | null>(null);
|
||||
|
||||
// Import options
|
||||
const [authorName, setAuthorName] = useState<string>('');
|
||||
const [seriesName, setSeriesName] = useState<string>('');
|
||||
const [seriesVolume, setSeriesVolume] = useState<string>('');
|
||||
const [tags, setTags] = useState<string>('');
|
||||
const [preserveReadingPosition, setPreserveReadingPosition] = useState(true);
|
||||
const [overwriteExisting, setOverwriteExisting] = useState(false);
|
||||
const [createMissingAuthor, setCreateMissingAuthor] = useState(true);
|
||||
const [createMissingSeries, setCreateMissingSeries] = useState(true);
|
||||
|
||||
const handleFileChange = async (e: React.ChangeEvent<HTMLInputElement>) => {
|
||||
const file = e.target.files?.[0];
|
||||
if (file) {
|
||||
setSelectedFile(file);
|
||||
setValidationResult(null);
|
||||
setImportResult(null);
|
||||
setError(null);
|
||||
|
||||
if (file.name.toLowerCase().endsWith('.epub')) {
|
||||
await validateFile(file);
|
||||
} else {
|
||||
setError('Please select a valid EPUB file (.epub extension)');
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
const validateFile = async (file: File) => {
|
||||
setIsValidating(true);
|
||||
try {
|
||||
const token = localStorage.getItem('auth-token');
|
||||
const formData = new FormData();
|
||||
formData.append('file', file);
|
||||
|
||||
const response = await fetch('/api/stories/epub/validate', {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Authorization': token ? `Bearer ${token}` : '',
|
||||
},
|
||||
body: formData,
|
||||
});
|
||||
|
||||
if (response.ok) {
|
||||
const result = await response.json();
|
||||
setValidationResult(result);
|
||||
if (!result.valid) {
|
||||
setError('EPUB file validation failed: ' + result.errors.join(', '));
|
||||
}
|
||||
} else if (response.status === 401 || response.status === 403) {
|
||||
setError('Authentication required. Please log in.');
|
||||
} else {
|
||||
setError('Failed to validate EPUB file');
|
||||
}
|
||||
} catch (err) {
|
||||
setError('Error validating EPUB file: ' + (err as Error).message);
|
||||
} finally {
|
||||
setIsValidating(false);
|
||||
}
|
||||
};
|
||||
|
||||
const handleSubmit = async (e: React.FormEvent) => {
|
||||
e.preventDefault();
|
||||
|
||||
if (!selectedFile) {
|
||||
setError('Please select an EPUB file');
|
||||
return;
|
||||
}
|
||||
|
||||
if (validationResult && !validationResult.valid) {
|
||||
setError('Cannot import invalid EPUB file');
|
||||
return;
|
||||
}
|
||||
|
||||
setIsLoading(true);
|
||||
setError(null);
|
||||
|
||||
try {
|
||||
const token = localStorage.getItem('auth-token');
|
||||
const formData = new FormData();
|
||||
formData.append('file', selectedFile);
|
||||
|
||||
if (authorName) formData.append('authorName', authorName);
|
||||
if (seriesName) formData.append('seriesName', seriesName);
|
||||
if (seriesVolume) formData.append('seriesVolume', seriesVolume);
|
||||
if (tags) {
|
||||
const tagList = tags.split(',').map(t => t.trim()).filter(t => t.length > 0);
|
||||
tagList.forEach(tag => formData.append('tags', tag));
|
||||
}
|
||||
|
||||
formData.append('preserveReadingPosition', preserveReadingPosition.toString());
|
||||
formData.append('overwriteExisting', overwriteExisting.toString());
|
||||
formData.append('createMissingAuthor', createMissingAuthor.toString());
|
||||
formData.append('createMissingSeries', createMissingSeries.toString());
|
||||
|
||||
const response = await fetch('/api/stories/epub/import', {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Authorization': token ? `Bearer ${token}` : '',
|
||||
},
|
||||
body: formData,
|
||||
});
|
||||
|
||||
const result = await response.json();
|
||||
|
||||
if (response.ok && result.success) {
|
||||
setImportResult(result);
|
||||
} else if (response.status === 401 || response.status === 403) {
|
||||
setError('Authentication required. Please log in.');
|
||||
} else {
|
||||
setError(result.message || 'Failed to import EPUB');
|
||||
}
|
||||
} catch (err) {
|
||||
setError('Error importing EPUB: ' + (err as Error).message);
|
||||
} finally {
|
||||
setIsLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
const resetForm = () => {
|
||||
setSelectedFile(null);
|
||||
setValidationResult(null);
|
||||
setImportResult(null);
|
||||
setError(null);
|
||||
setAuthorName('');
|
||||
setSeriesName('');
|
||||
setSeriesVolume('');
|
||||
setTags('');
|
||||
};
|
||||
|
||||
if (importResult?.success) {
|
||||
return (
|
||||
<ImportLayout
|
||||
title="EPUB Import Successful"
|
||||
description="Your EPUB has been successfully imported into StoryCove"
|
||||
>
|
||||
<div className="space-y-6">
|
||||
<div className="bg-green-50 dark:bg-green-900/20 border border-green-200 dark:border-green-800 rounded-lg p-6">
|
||||
<h2 className="text-xl font-semibold text-green-600 dark:text-green-400 mb-2">Import Completed</h2>
|
||||
<p className="theme-text">
|
||||
Your EPUB has been successfully imported into StoryCove.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<div className="space-y-4">
|
||||
<div>
|
||||
<span className="font-semibold theme-header">Story Title:</span>
|
||||
<p className="theme-text">{importResult.storyTitle}</p>
|
||||
</div>
|
||||
|
||||
{importResult.wordCount && (
|
||||
<div>
|
||||
<span className="font-semibold theme-header">Word Count:</span>
|
||||
<p className="theme-text">{importResult.wordCount.toLocaleString()} words</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{importResult.totalChapters && (
|
||||
<div>
|
||||
<span className="font-semibold theme-header">Chapters:</span>
|
||||
<p className="theme-text">{importResult.totalChapters}</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{importResult.warnings && importResult.warnings.length > 0 && (
|
||||
<div className="bg-yellow-50 dark:bg-yellow-900/20 border border-yellow-200 dark:border-yellow-800 rounded-lg p-4">
|
||||
<strong className="text-yellow-800 dark:text-yellow-200">Warnings:</strong>
|
||||
<ul className="list-disc list-inside mt-2 text-yellow-700 dark:text-yellow-300">
|
||||
{importResult.warnings.map((warning, index) => (
|
||||
<li key={index}>{warning}</li>
|
||||
))}
|
||||
</ul>
|
||||
</div>
|
||||
)}
|
||||
|
||||
<div className="flex gap-4 mt-6">
|
||||
<Button
|
||||
onClick={() => router.push(`/stories/${importResult.storyId}`)}
|
||||
>
|
||||
View Story
|
||||
</Button>
|
||||
<Button
|
||||
onClick={resetForm}
|
||||
variant="secondary"
|
||||
>
|
||||
Import Another EPUB
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</ImportLayout>
|
||||
);
|
||||
}
|
||||
|
||||
return (
|
||||
<ImportLayout
|
||||
title="Import EPUB"
|
||||
description="Upload an EPUB file to import it as a story into your library"
|
||||
>
|
||||
{error && (
|
||||
<div className="bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-lg p-4 mb-6">
|
||||
<p className="text-red-800 dark:text-red-200">{error}</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
<form onSubmit={handleSubmit} className="space-y-6">
|
||||
{/* File Upload */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<div className="mb-4">
|
||||
<h3 className="text-lg font-semibold theme-header mb-2">Select EPUB File</h3>
|
||||
<p className="theme-text">
|
||||
Choose an EPUB file from your device to import.
|
||||
</p>
|
||||
</div>
|
||||
<div className="space-y-4">
|
||||
<div>
|
||||
<label htmlFor="epub-file" className="block text-sm font-medium theme-header mb-1">EPUB File</label>
|
||||
<Input
|
||||
id="epub-file"
|
||||
type="file"
|
||||
accept=".epub,application/epub+zip"
|
||||
onChange={handleFileChange}
|
||||
disabled={isLoading || isValidating}
|
||||
/>
|
||||
</div>
|
||||
|
||||
{selectedFile && (
|
||||
<div className="flex items-center gap-2">
|
||||
<DocumentArrowUpIcon className="h-5 w-5 theme-text" />
|
||||
<span className="text-sm theme-text">
|
||||
{selectedFile.name} ({(selectedFile.size / 1024 / 1024).toFixed(2)} MB)
|
||||
</span>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{isValidating && (
|
||||
<div className="text-sm theme-accent">
|
||||
Validating EPUB file...
|
||||
</div>
|
||||
)}
|
||||
|
||||
{validationResult && (
|
||||
<div className="text-sm">
|
||||
{validationResult.valid ? (
|
||||
<span className="inline-flex items-center px-2 py-1 rounded text-xs font-medium bg-green-100 dark:bg-green-900/20 text-green-800 dark:text-green-200">
|
||||
Valid EPUB
|
||||
</span>
|
||||
) : (
|
||||
<span className="inline-flex items-center px-2 py-1 rounded text-xs font-medium bg-red-100 dark:bg-red-900/20 text-red-800 dark:text-red-200">
|
||||
Invalid EPUB
|
||||
</span>
|
||||
)}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Import Options */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<div className="mb-4">
|
||||
<h3 className="text-lg font-semibold theme-header mb-2">Import Options</h3>
|
||||
<p className="theme-text">
|
||||
Configure how the EPUB should be imported.
|
||||
</p>
|
||||
</div>
|
||||
<div className="space-y-4">
|
||||
<div>
|
||||
<label htmlFor="author-name" className="block text-sm font-medium theme-header mb-1">Author Name (Override)</label>
|
||||
<Input
|
||||
id="author-name"
|
||||
value={authorName}
|
||||
onChange={(e) => setAuthorName(e.target.value)}
|
||||
placeholder="Leave empty to use EPUB metadata"
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<label htmlFor="series-name" className="block text-sm font-medium theme-header mb-1">Series Name</label>
|
||||
<Input
|
||||
id="series-name"
|
||||
value={seriesName}
|
||||
onChange={(e) => setSeriesName(e.target.value)}
|
||||
placeholder="Optional: Add to a series"
|
||||
/>
|
||||
</div>
|
||||
|
||||
{seriesName && (
|
||||
<div>
|
||||
<label htmlFor="series-volume" className="block text-sm font-medium theme-header mb-1">Series Volume</label>
|
||||
<Input
|
||||
id="series-volume"
|
||||
type="number"
|
||||
value={seriesVolume}
|
||||
onChange={(e) => setSeriesVolume(e.target.value)}
|
||||
placeholder="Volume number in series"
|
||||
/>
|
||||
</div>
|
||||
)}
|
||||
|
||||
<div>
|
||||
<label htmlFor="tags" className="block text-sm font-medium theme-header mb-1">Tags</label>
|
||||
<Input
|
||||
id="tags"
|
||||
value={tags}
|
||||
onChange={(e) => setTags(e.target.value)}
|
||||
placeholder="Comma-separated tags (e.g., fantasy, adventure, romance)"
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div className="space-y-3">
|
||||
<div className="flex items-center">
|
||||
<input
|
||||
type="checkbox"
|
||||
id="preserve-reading-position"
|
||||
checked={preserveReadingPosition}
|
||||
onChange={(e) => setPreserveReadingPosition(e.target.checked)}
|
||||
className="mr-2"
|
||||
/>
|
||||
<label htmlFor="preserve-reading-position" className="text-sm theme-text">
|
||||
Preserve reading position from EPUB metadata
|
||||
</label>
|
||||
</div>
|
||||
|
||||
<div className="flex items-center">
|
||||
<input
|
||||
type="checkbox"
|
||||
id="create-missing-author"
|
||||
checked={createMissingAuthor}
|
||||
onChange={(e) => setCreateMissingAuthor(e.target.checked)}
|
||||
className="mr-2"
|
||||
/>
|
||||
<label htmlFor="create-missing-author" className="text-sm theme-text">
|
||||
Create author if not found
|
||||
</label>
|
||||
</div>
|
||||
|
||||
<div className="flex items-center">
|
||||
<input
|
||||
type="checkbox"
|
||||
id="create-missing-series"
|
||||
checked={createMissingSeries}
|
||||
onChange={(e) => setCreateMissingSeries(e.target.checked)}
|
||||
className="mr-2"
|
||||
/>
|
||||
<label htmlFor="create-missing-series" className="text-sm theme-text">
|
||||
Create series if not found
|
||||
</label>
|
||||
</div>
|
||||
|
||||
<div className="flex items-center">
|
||||
<input
|
||||
type="checkbox"
|
||||
id="overwrite-existing"
|
||||
checked={overwriteExisting}
|
||||
onChange={(e) => setOverwriteExisting(e.target.checked)}
|
||||
className="mr-2"
|
||||
/>
|
||||
<label htmlFor="overwrite-existing" className="text-sm theme-text">
|
||||
Overwrite existing story with same title and author
|
||||
</label>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Submit Button */}
|
||||
<div className="flex justify-end">
|
||||
<Button
|
||||
type="submit"
|
||||
disabled={!selectedFile || isLoading || isValidating || (validationResult && !validationResult.valid)}
|
||||
loading={isLoading}
|
||||
>
|
||||
{isLoading ? 'Importing...' : 'Import EPUB'}
|
||||
</Button>
|
||||
</div>
|
||||
</form>
|
||||
</ImportLayout>
|
||||
);
|
||||
}
|
||||
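For context on how the form above comes together on submit: a minimal sketch of a helper that packages the selected file and the options shown in this page as multipart form data. The `/api/epub/import` path and the field names are assumptions for illustration only; the actual contract is whatever the backend EPUB import endpoint defines.

```typescript
// Hypothetical submit helper: builds multipart form data from the form state above.
// Endpoint path and field names are assumed, not taken from the backend code.
async function submitEpubImport(params: {
  file: File;
  authorName?: string;
  seriesName?: string;
  seriesVolume?: string;
  tags?: string;
  preserveReadingPosition: boolean;
  createMissingAuthor: boolean;
  createMissingSeries: boolean;
  overwriteExisting: boolean;
  token: string;
}): Promise<Response> {
  const form = new FormData();
  form.append('file', params.file);
  if (params.authorName) form.append('authorName', params.authorName);
  if (params.seriesName) form.append('seriesName', params.seriesName);
  if (params.seriesVolume) form.append('seriesVolume', params.seriesVolume);
  if (params.tags) form.append('tags', params.tags);
  form.append('preserveReadingPosition', String(params.preserveReadingPosition));
  form.append('createMissingAuthor', String(params.createMissingAuthor));
  form.append('createMissingSeries', String(params.createMissingSeries));
  form.append('overwriteExisting', String(params.overwriteExisting));

  // Let the browser set the multipart boundary; only add the auth header.
  return fetch('/api/epub/import', {
    method: 'POST',
    headers: { Authorization: `Bearer ${params.token}` },
    body: form,
  });
}
```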
frontend/src/app/import/page.tsx (Normal file, 545 lines)
@@ -0,0 +1,545 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useRef, useEffect } from 'react';
|
||||
import { useRouter, useSearchParams } from 'next/navigation';
|
||||
import { useAuth } from '../../contexts/AuthContext';
|
||||
import ImportLayout from '../../components/layout/ImportLayout';
|
||||
import { Input, Textarea } from '../../components/ui/Input';
|
||||
import Button from '../../components/ui/Button';
|
||||
import TagInput from '../../components/stories/TagInput';
|
||||
import RichTextEditor from '../../components/stories/RichTextEditor';
|
||||
import ImageUpload from '../../components/ui/ImageUpload';
|
||||
import AuthorSelector from '../../components/stories/AuthorSelector';
|
||||
import { storyApi, authorApi } from '../../lib/api';
|
||||
|
||||
export default function AddStoryPage() {
|
||||
const [importMode, setImportMode] = useState<'manual' | 'url'>('manual');
|
||||
const [importUrl, setImportUrl] = useState('');
|
||||
const [scraping, setScraping] = useState(false);
|
||||
const [formData, setFormData] = useState({
|
||||
title: '',
|
||||
summary: '',
|
||||
authorName: '',
|
||||
authorId: undefined as string | undefined,
|
||||
contentHtml: '',
|
||||
sourceUrl: '',
|
||||
tags: [] as string[],
|
||||
seriesName: '',
|
||||
volume: '',
|
||||
});
|
||||
|
||||
const [coverImage, setCoverImage] = useState<File | null>(null);
|
||||
const [loading, setLoading] = useState(false);
|
||||
const [errors, setErrors] = useState<Record<string, string>>({});
|
||||
const [duplicateWarning, setDuplicateWarning] = useState<{
|
||||
show: boolean;
|
||||
count: number;
|
||||
duplicates: Array<{
|
||||
id: string;
|
||||
title: string;
|
||||
authorName: string;
|
||||
createdAt: string;
|
||||
}>;
|
||||
}>({ show: false, count: 0, duplicates: [] });
|
||||
const [checkingDuplicates, setCheckingDuplicates] = useState(false);
|
||||
|
||||
const router = useRouter();
|
||||
const searchParams = useSearchParams();
|
||||
const { isAuthenticated } = useAuth();
|
||||
|
||||
// Handle URL parameters
|
||||
useEffect(() => {
|
||||
const authorId = searchParams.get('authorId');
|
||||
const mode = searchParams.get('mode');
|
||||
|
||||
// Set import mode if specified in URL
|
||||
if (mode === 'url') {
|
||||
setImportMode('url');
|
||||
}
|
||||
|
||||
// Pre-fill author if authorId is provided in URL
|
||||
if (authorId) {
|
||||
const loadAuthor = async () => {
|
||||
try {
|
||||
const author = await authorApi.getAuthor(authorId);
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
authorName: author.name,
|
||||
authorId: author.id
|
||||
}));
|
||||
} catch (error) {
|
||||
console.error('Failed to load author:', error);
|
||||
}
|
||||
};
|
||||
loadAuthor();
|
||||
}
|
||||
}, [searchParams]);
|
||||
|
||||
// Load pending story data from bulk combine operation
|
||||
useEffect(() => {
|
||||
const fromBulkCombine = searchParams.get('from') === 'bulk-combine';
|
||||
if (fromBulkCombine) {
|
||||
const pendingStoryData = localStorage.getItem('pendingStory');
|
||||
if (pendingStoryData) {
|
||||
try {
|
||||
const storyData = JSON.parse(pendingStoryData);
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
title: storyData.title || '',
|
||||
authorName: storyData.author || '',
|
||||
authorId: undefined, // Reset author ID for bulk combined stories
|
||||
contentHtml: storyData.content || '',
|
||||
sourceUrl: storyData.sourceUrl || '',
|
||||
summary: storyData.summary || '',
|
||||
tags: storyData.tags || []
|
||||
}));
|
||||
// Clear the pending data
|
||||
localStorage.removeItem('pendingStory');
|
||||
} catch (error) {
|
||||
console.error('Failed to load pending story data:', error);
|
||||
}
|
||||
}
|
||||
}
|
||||
}, [searchParams]);
|
||||
|
||||
// Check for duplicates when title and author are both present
|
||||
useEffect(() => {
  const title = formData.title.trim();
  const authorName = formData.authorName.trim();

  // Don't check if user isn't authenticated or if title/author are empty
  if (!isAuthenticated || !title || !authorName) {
    setDuplicateWarning({ show: false, count: 0, duplicates: [] });
    return;
  }

  // Debounce the check to avoid too many API calls; the cleanup clears any
  // pending timer when the inputs change or the component unmounts.
  const timeoutId = setTimeout(async () => {
    try {
      setCheckingDuplicates(true);
      const result = await storyApi.checkDuplicate(title, authorName);

      if (result.hasDuplicates) {
        setDuplicateWarning({
          show: true,
          count: result.count,
          duplicates: result.duplicates
        });
      } else {
        setDuplicateWarning({ show: false, count: 0, duplicates: [] });
      }
    } catch (error) {
      console.error('Failed to check for duplicates:', error);
      // Clear any existing duplicate warnings on error
      setDuplicateWarning({ show: false, count: 0, duplicates: [] });
      // Don't show error to user as this is just a helpful warning
      // Authentication errors will be handled by the API interceptor
    } finally {
      setCheckingDuplicates(false);
    }
  }, 500); // 500ms debounce

  return () => clearTimeout(timeoutId);
}, [formData.title, formData.authorName, isAuthenticated]);
|
||||
|
||||
const handleInputChange = (field: string) => (
|
||||
e: React.ChangeEvent<HTMLInputElement | HTMLTextAreaElement>
|
||||
) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
[field]: e.target.value
|
||||
}));
|
||||
|
||||
// Clear error when user starts typing
|
||||
if (errors[field]) {
|
||||
setErrors(prev => ({ ...prev, [field]: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const handleContentChange = (html: string) => {
|
||||
setFormData(prev => ({ ...prev, contentHtml: html }));
|
||||
if (errors.contentHtml) {
|
||||
setErrors(prev => ({ ...prev, contentHtml: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const handleTagsChange = (tags: string[]) => {
|
||||
setFormData(prev => ({ ...prev, tags }));
|
||||
};
|
||||
|
||||
const handleAuthorChange = (authorName: string, authorId?: string) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
authorName,
|
||||
authorId: authorId // This will be undefined if creating new author, which clears the existing ID
|
||||
}));
|
||||
|
||||
// Clear error when user changes author
|
||||
if (errors.authorName) {
|
||||
setErrors(prev => ({ ...prev, authorName: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const handleImportFromUrl = async () => {
|
||||
if (!importUrl.trim()) {
|
||||
setErrors({ importUrl: 'URL is required' });
|
||||
return;
|
||||
}
|
||||
|
||||
setScraping(true);
|
||||
setErrors({});
|
||||
|
||||
try {
|
||||
const response = await fetch('/scrape/story', {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
body: JSON.stringify({ url: importUrl }),
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
const errorData = await response.json();
|
||||
throw new Error(errorData.error || 'Failed to scrape story');
|
||||
}
|
||||
|
||||
const scrapedStory = await response.json();
|
||||
|
||||
// Pre-fill the form with scraped data
|
||||
setFormData({
|
||||
title: scrapedStory.title || '',
|
||||
summary: scrapedStory.summary || '',
|
||||
authorName: scrapedStory.author || '',
|
||||
authorId: undefined, // Reset author ID when importing from URL (likely new author)
|
||||
contentHtml: scrapedStory.content || '',
|
||||
sourceUrl: scrapedStory.sourceUrl || importUrl,
|
||||
tags: scrapedStory.tags || [],
|
||||
seriesName: '',
|
||||
volume: '',
|
||||
});
|
||||
|
||||
// Switch to manual mode so user can edit the pre-filled data
|
||||
setImportMode('manual');
|
||||
setImportUrl('');
|
||||
|
||||
// Show success message
|
||||
setErrors({ success: 'Story data imported successfully! Review and edit as needed before saving.' });
|
||||
} catch (error: any) {
|
||||
console.error('Failed to import story:', error);
|
||||
setErrors({ importUrl: error.message });
|
||||
} finally {
|
||||
setScraping(false);
|
||||
}
|
||||
};
|
||||
|
||||
const validateForm = () => {
|
||||
const newErrors: Record<string, string> = {};
|
||||
|
||||
if (!formData.title.trim()) {
|
||||
newErrors.title = 'Title is required';
|
||||
}
|
||||
|
||||
if (!formData.authorName.trim()) {
|
||||
newErrors.authorName = 'Author name is required';
|
||||
}
|
||||
|
||||
if (!formData.contentHtml.trim()) {
|
||||
newErrors.contentHtml = 'Story content is required';
|
||||
}
|
||||
|
||||
if (formData.seriesName && !formData.volume) {
|
||||
newErrors.volume = 'Volume number is required when series is specified';
|
||||
}
|
||||
|
||||
if (formData.volume && !formData.seriesName.trim()) {
|
||||
newErrors.seriesName = 'Series name is required when volume is specified';
|
||||
}
|
||||
|
||||
setErrors(newErrors);
|
||||
return Object.keys(newErrors).length === 0;
|
||||
};
|
||||
|
||||
const handleSubmit = async (e: React.FormEvent) => {
|
||||
e.preventDefault();
|
||||
|
||||
if (!validateForm()) {
|
||||
return;
|
||||
}
|
||||
|
||||
setLoading(true);
|
||||
|
||||
try {
|
||||
// First, create the story with JSON data
|
||||
const storyData = {
|
||||
title: formData.title,
|
||||
summary: formData.summary || undefined,
|
||||
contentHtml: formData.contentHtml,
|
||||
sourceUrl: formData.sourceUrl || undefined,
|
||||
volume: formData.seriesName ? parseInt(formData.volume) : undefined,
|
||||
seriesName: formData.seriesName || undefined,
|
||||
// Send authorId if we have it (existing author), otherwise send authorName (new author)
|
||||
...(formData.authorId ? { authorId: formData.authorId } : { authorName: formData.authorName }),
|
||||
tagNames: formData.tags.length > 0 ? formData.tags : undefined,
|
||||
};
|
||||
|
||||
const story = await storyApi.createStory(storyData);
|
||||
|
||||
// If there's a cover image, upload it separately
|
||||
if (coverImage) {
|
||||
await storyApi.uploadCover(story.id, coverImage);
|
||||
}
|
||||
|
||||
router.push(`/stories/${story.id}`);
|
||||
} catch (error: any) {
|
||||
console.error('Failed to create story:', error);
|
||||
const errorMessage = error.response?.data?.message || 'Failed to create story';
|
||||
setErrors({ submit: errorMessage });
|
||||
} finally {
|
||||
setLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
return (
|
||||
<ImportLayout
|
||||
title="Add New Story"
|
||||
description="Add a story to your personal collection"
|
||||
>
|
||||
{/* URL Import Section */}
|
||||
{importMode === 'url' && (
|
||||
<div className="space-y-6">
|
||||
<div className="bg-gray-50 dark:bg-gray-800/50 rounded-lg p-6">
|
||||
<h3 className="text-lg font-medium theme-header mb-4">Import Story from URL</h3>
|
||||
<p className="theme-text text-sm mb-4">
|
||||
Enter a URL from a supported story site to automatically extract the story content, title, author, and other metadata.
|
||||
</p>
|
||||
|
||||
<div className="space-y-4">
|
||||
<Input
|
||||
label="Story URL"
|
||||
type="url"
|
||||
value={importUrl}
|
||||
onChange={(e) => setImportUrl(e.target.value)}
|
||||
placeholder="https://example.com/story-url"
|
||||
error={errors.importUrl}
|
||||
disabled={scraping}
|
||||
/>
|
||||
|
||||
<div className="flex gap-3">
|
||||
<Button
|
||||
type="button"
|
||||
onClick={handleImportFromUrl}
|
||||
loading={scraping}
|
||||
disabled={!importUrl.trim() || scraping}
|
||||
>
|
||||
{scraping ? 'Importing...' : 'Import Story'}
|
||||
</Button>
|
||||
|
||||
<Button
|
||||
type="button"
|
||||
variant="ghost"
|
||||
onClick={() => setImportMode('manual')}
|
||||
disabled={scraping}
|
||||
>
|
||||
Enter Manually Instead
|
||||
</Button>
|
||||
</div>
|
||||
|
||||
<div className="text-xs theme-text">
|
||||
<p className="font-medium mb-1">Supported Sites:</p>
|
||||
<p>Archive of Our Own, DeviantArt, FanFiction.Net, Literotica, Royal Road, Wattpad, and more</p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Success Message */}
|
||||
{errors.success && (
|
||||
<div className="p-4 bg-green-50 dark:bg-green-900/20 border border-green-200 dark:border-green-800 rounded-lg mb-6">
|
||||
<p className="text-green-800 dark:text-green-200">{errors.success}</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Manual Entry Form */}
|
||||
{importMode === 'manual' && (
|
||||
<form onSubmit={handleSubmit} className="space-y-6">
|
||||
{/* Title */}
|
||||
<Input
|
||||
label="Title *"
|
||||
value={formData.title}
|
||||
onChange={handleInputChange('title')}
|
||||
placeholder="Enter the story title"
|
||||
error={errors.title}
|
||||
required
|
||||
/>
|
||||
|
||||
{/* Author Selector */}
|
||||
<AuthorSelector
|
||||
label="Author *"
|
||||
value={formData.authorName}
|
||||
onChange={handleAuthorChange}
|
||||
placeholder="Select or enter author name"
|
||||
error={errors.authorName}
|
||||
required
|
||||
/>
|
||||
|
||||
{/* Duplicate Warning */}
|
||||
{duplicateWarning.show && (
|
||||
<div className="p-4 bg-yellow-50 dark:bg-yellow-900/20 border border-yellow-200 dark:border-yellow-800 rounded-lg">
|
||||
<div className="flex items-start gap-3">
|
||||
<div className="text-yellow-600 dark:text-yellow-400 mt-0.5">
|
||||
⚠️
|
||||
</div>
|
||||
<div>
|
||||
<h4 className="font-medium text-yellow-800 dark:text-yellow-200">
|
||||
Potential Duplicate Detected
|
||||
</h4>
|
||||
<p className="text-sm text-yellow-700 dark:text-yellow-300 mt-1">
|
||||
Found {duplicateWarning.count} existing {duplicateWarning.count === 1 ? 'story' : 'stories'} with the same title and author:
|
||||
</p>
|
||||
<ul className="mt-2 space-y-1">
|
||||
{duplicateWarning.duplicates.map((duplicate, index) => (
|
||||
<li key={duplicate.id} className="text-sm text-yellow-700 dark:text-yellow-300">
|
||||
• <span className="font-medium">{duplicate.title}</span> by {duplicate.authorName}
|
||||
<span className="text-xs ml-2">
|
||||
(added {new Date(duplicate.createdAt).toLocaleDateString()})
|
||||
</span>
|
||||
</li>
|
||||
))}
|
||||
</ul>
|
||||
<p className="text-xs text-yellow-600 dark:text-yellow-400 mt-2">
|
||||
You can still create this story if it's different from the existing ones.
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Checking indicator */}
|
||||
{checkingDuplicates && (
|
||||
<div className="flex items-center gap-2 text-sm theme-text">
|
||||
<div className="animate-spin w-4 h-4 border-2 border-theme-accent border-t-transparent rounded-full"></div>
|
||||
Checking for duplicates...
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Summary */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Summary
|
||||
</label>
|
||||
<Textarea
|
||||
value={formData.summary}
|
||||
onChange={handleInputChange('summary')}
|
||||
placeholder="Brief summary or description of the story..."
|
||||
rows={3}
|
||||
/>
|
||||
<p className="text-sm theme-text mt-1">
|
||||
Optional summary that will be displayed on the story detail page
|
||||
</p>
|
||||
</div>
|
||||
|
||||
{/* Cover Image Upload */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Cover Image
|
||||
</label>
|
||||
<ImageUpload
|
||||
onImageSelect={setCoverImage}
|
||||
accept="image/jpeg,image/png"
|
||||
maxSizeMB={5}
|
||||
aspectRatio="3:4"
|
||||
placeholder="Drop a cover image here or click to select"
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Content */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Story Content *
|
||||
</label>
|
||||
<RichTextEditor
|
||||
value={formData.contentHtml}
|
||||
onChange={handleContentChange}
|
||||
placeholder="Write or paste your story content here..."
|
||||
error={errors.contentHtml}
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Tags */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Tags
|
||||
</label>
|
||||
<TagInput
|
||||
tags={formData.tags}
|
||||
onChange={handleTagsChange}
|
||||
placeholder="Add tags to categorize your story..."
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Series and Volume */}
|
||||
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
|
||||
<Input
|
||||
label="Series (optional)"
|
||||
value={formData.seriesName}
|
||||
onChange={handleInputChange('seriesName')}
|
||||
placeholder="Enter series name if part of a series"
|
||||
error={errors.seriesName}
|
||||
/>
|
||||
|
||||
<Input
|
||||
label="Volume/Part (optional)"
|
||||
type="number"
|
||||
min="1"
|
||||
value={formData.volume}
|
||||
onChange={handleInputChange('volume')}
|
||||
placeholder="Enter volume/part number"
|
||||
error={errors.volume}
|
||||
/>
|
||||
</div>
|
||||
|
||||
{/* Source URL */}
|
||||
<Input
|
||||
label="Source URL (optional)"
|
||||
type="url"
|
||||
value={formData.sourceUrl}
|
||||
onChange={handleInputChange('sourceUrl')}
|
||||
placeholder="https://example.com/original-story-url"
|
||||
/>
|
||||
|
||||
{/* Submit Error */}
|
||||
{errors.submit && (
|
||||
<div className="p-4 bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-lg">
|
||||
<p className="text-red-800 dark:text-red-200">{errors.submit}</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Actions */}
|
||||
<div className="flex justify-end gap-4 pt-6">
|
||||
<Button
|
||||
type="button"
|
||||
variant="ghost"
|
||||
onClick={() => router.back()}
|
||||
disabled={loading}
|
||||
>
|
||||
Cancel
|
||||
</Button>
|
||||
|
||||
<Button
|
||||
type="submit"
|
||||
loading={loading}
|
||||
disabled={!formData.title || !formData.authorName || !formData.contentHtml}
|
||||
>
|
||||
Add Story
|
||||
</Button>
|
||||
</div>
|
||||
</form>
|
||||
)}
|
||||
</ImportLayout>
|
||||
);
|
||||
}
|
||||
@@ -1,8 +1,8 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useEffect } from 'react';
|
||||
import { searchApi, tagApi } from '../../lib/api';
|
||||
import { Story, Tag } from '../../types/api';
|
||||
import { searchApi } from '../../lib/api';
|
||||
import { Story, Tag, FacetCount } from '../../types/api';
|
||||
import AppLayout from '../../components/layout/AppLayout';
|
||||
import { Input } from '../../components/ui/Input';
|
||||
import Button from '../../components/ui/Button';
|
||||
@@ -11,16 +11,17 @@ import TagFilter from '../../components/stories/TagFilter';
|
||||
import LoadingSpinner from '../../components/ui/LoadingSpinner';
|
||||
|
||||
type ViewMode = 'grid' | 'list';
|
||||
type SortOption = 'createdAt' | 'title' | 'authorName' | 'rating';
|
||||
type SortOption = 'createdAt' | 'title' | 'authorName' | 'rating' | 'wordCount' | 'lastRead';
|
||||
|
||||
export default function LibraryPage() {
|
||||
const [stories, setStories] = useState<Story[]>([]);
|
||||
const [tags, setTags] = useState<Tag[]>([]);
|
||||
const [loading, setLoading] = useState(false);
|
||||
const [searchLoading, setSearchLoading] = useState(false);
|
||||
const [searchQuery, setSearchQuery] = useState('');
|
||||
const [selectedTags, setSelectedTags] = useState<string[]>([]);
|
||||
const [viewMode, setViewMode] = useState<ViewMode>('list');
|
||||
const [sortOption, setSortOption] = useState<SortOption>('createdAt');
|
||||
const [sortOption, setSortOption] = useState<SortOption>('lastRead');
|
||||
const [sortDirection, setSortDirection] = useState<'asc' | 'desc'>('desc');
|
||||
const [page, setPage] = useState(0);
|
||||
const [totalPages, setTotalPages] = useState(1);
|
||||
@@ -28,26 +29,32 @@ export default function LibraryPage() {
|
||||
const [refreshTrigger, setRefreshTrigger] = useState(0);
|
||||
|
||||
|
||||
// Load tags for filtering
|
||||
useEffect(() => {
|
||||
const loadTags = async () => {
|
||||
try {
|
||||
const tagsResult = await tagApi.getTags({ page: 0, size: 1000 });
|
||||
setTags(tagsResult?.content || []);
|
||||
} catch (error) {
|
||||
console.error('Failed to load tags:', error);
|
||||
}
|
||||
};
|
||||
|
||||
loadTags();
|
||||
}, []);
|
||||
// Convert facet counts to Tag objects for the UI
|
||||
const convertFacetsToTags = (facets?: Record<string, FacetCount[]>): Tag[] => {
|
||||
if (!facets || !facets.tagNames) {
|
||||
return [];
|
||||
}
|
||||
|
||||
return facets.tagNames.map(facet => ({
|
||||
id: facet.value, // Use tag name as ID since we don't have actual IDs from search results
|
||||
name: facet.value,
|
||||
storyCount: facet.count
|
||||
}));
|
||||
};
|
||||
|
||||
// Debounce search to avoid too many API calls
|
||||
useEffect(() => {
|
||||
const debounceTimer = setTimeout(() => {
|
||||
const performSearch = async () => {
|
||||
try {
|
||||
setLoading(true);
|
||||
// Use searchLoading for background search, loading only for initial load
|
||||
const isInitialLoad = stories.length === 0 && !searchQuery && selectedTags.length === 0;
|
||||
if (isInitialLoad) {
|
||||
setLoading(true);
|
||||
} else {
|
||||
setSearchLoading(true);
|
||||
}
|
||||
|
||||
// Always use search API for consistency - use '*' for match-all when no query
|
||||
const result = await searchApi.search({
|
||||
@@ -57,21 +64,28 @@ export default function LibraryPage() {
|
||||
tags: selectedTags.length > 0 ? selectedTags : undefined,
|
||||
sortBy: sortOption,
|
||||
sortDir: sortDirection,
|
||||
facetBy: ['tagNames'], // Request tag facets for the filter UI
|
||||
});
|
||||
|
||||
setStories(result?.results || []);
|
||||
const currentStories = result?.results || [];
|
||||
setStories(currentStories);
|
||||
setTotalPages(Math.ceil((result?.totalHits || 0) / 20));
|
||||
setTotalElements(result?.totalHits || 0);
|
||||
|
||||
// Update tags from facets - these represent all matching stories, not just current page
|
||||
const resultTags = convertFacetsToTags(result?.facets);
|
||||
setTags(resultTags);
|
||||
} catch (error) {
|
||||
console.error('Failed to load stories:', error);
|
||||
setStories([]);
|
||||
} finally {
|
||||
setLoading(false);
|
||||
setSearchLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
performSearch();
|
||||
}, searchQuery ? 300 : 0); // Debounce search, but not other changes
|
||||
}, searchQuery ? 500 : 0); // 500ms debounce for search, immediate for other changes
|
||||
|
||||
return () => clearTimeout(debounceTimer);
|
||||
}, [searchQuery, selectedTags, page, sortOption, sortDirection, refreshTrigger]);
|
||||
@@ -99,16 +113,21 @@ export default function LibraryPage() {
|
||||
};
|
||||
|
||||
const handleSortChange = (newSortOption: SortOption) => {
|
||||
if (newSortOption === sortOption) {
|
||||
// Toggle direction if same option
|
||||
setSortDirection(prev => prev === 'asc' ? 'desc' : 'asc');
|
||||
setSortOption(newSortOption);
|
||||
// Set appropriate default direction for the sort option
|
||||
if (newSortOption === 'title' || newSortOption === 'authorName') {
|
||||
setSortDirection('asc'); // Alphabetical fields default to ascending
|
||||
} else {
|
||||
setSortOption(newSortOption);
|
||||
setSortDirection('desc'); // Default to desc for new sort option
|
||||
setSortDirection('desc'); // Numeric/date fields default to descending
|
||||
}
|
||||
resetPage();
|
||||
};
|
||||
|
||||
const toggleSortDirection = () => {
|
||||
setSortDirection(prev => prev === 'asc' ? 'desc' : 'asc');
|
||||
resetPage();
|
||||
};
|
||||
|
||||
const clearFilters = () => {
|
||||
setSearchQuery('');
|
||||
setSelectedTags([]);
|
||||
@@ -143,16 +162,21 @@ export default function LibraryPage() {
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<Button href="/add-story">
|
||||
Add New Story
|
||||
</Button>
|
||||
<div className="flex gap-2">
|
||||
<Button href="/import">
|
||||
Add New Story
|
||||
</Button>
|
||||
<Button href="/import/epub" variant="secondary">
|
||||
📖 Import EPUB
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Search and Filters */}
|
||||
<div className="space-y-4">
|
||||
{/* Search Bar */}
|
||||
<div className="flex flex-col sm:flex-row gap-4">
|
||||
<div className="flex-1">
|
||||
<div className="flex-1 relative">
|
||||
<Input
|
||||
type="search"
|
||||
placeholder="Search by title, author, or tags..."
|
||||
@@ -160,6 +184,11 @@ export default function LibraryPage() {
|
||||
onChange={handleSearchChange}
|
||||
className="w-full"
|
||||
/>
|
||||
{searchLoading && (
|
||||
<div className="absolute right-3 top-1/2 transform -translate-y-1/2">
|
||||
<div className="animate-spin h-4 w-4 border-2 border-theme-accent border-t-transparent rounded-full"></div>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* View Mode Toggle */}
|
||||
@@ -203,7 +232,19 @@ export default function LibraryPage() {
|
||||
<option value="title">Title</option>
|
||||
<option value="authorName">Author</option>
|
||||
<option value="rating">Rating</option>
|
||||
<option value="wordCount">Word Count</option>
|
||||
<option value="lastRead">Last Read</option>
|
||||
</select>
|
||||
|
||||
{/* Sort Direction Toggle */}
|
||||
<button
|
||||
onClick={toggleSortDirection}
|
||||
className="p-2 rounded-lg theme-card theme-text hover:bg-opacity-80 transition-colors border theme-border"
|
||||
title={`Sort ${sortDirection === 'asc' ? 'Ascending' : 'Descending'}`}
|
||||
aria-label={`Toggle sort direction - currently ${sortDirection === 'asc' ? 'ascending' : 'descending'}`}
|
||||
>
|
||||
{sortDirection === 'asc' ? '↑' : '↓'}
|
||||
</button>
|
||||
</div>
|
||||
|
||||
{/* Clear Filters */}
|
||||
@@ -236,7 +277,7 @@ export default function LibraryPage() {
|
||||
Clear Filters
|
||||
</Button>
|
||||
) : (
|
||||
<Button href="/add-story">
|
||||
<Button href="/import">
|
||||
Add Your First Story
|
||||
</Button>
|
||||
)}
|
||||
|
||||
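To make the facet-to-tag mapping in this page concrete, a small illustration of the shape `convertFacetsToTags` expects and produces. The payload below is a made-up example, not captured API output.

```typescript
// Example input in the shape the library page's convertFacetsToTags expects.
const exampleFacets: Record<string, { value: string; count: number }[]> = {
  tagNames: [
    { value: 'fantasy', count: 12 },
    { value: 'romance', count: 7 },
  ],
};

// Expected mapping: the tag name doubles as the id, the facet count becomes storyCount.
// => [{ id: 'fantasy', name: 'fantasy', storyCount: 12 },
//     { id: 'romance', name: 'romance', storyCount: 7 }]
```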
frontend/src/app/scrape/author/route.ts (Normal file, 72 lines)
@@ -0,0 +1,72 @@
import { NextRequest, NextResponse } from 'next/server';

export async function POST(request: NextRequest) {
  try {
    const body = await request.json();
    const { url } = body;

    if (!url || typeof url !== 'string') {
      return NextResponse.json(
        { error: 'URL is required and must be a string' },
        { status: 400 }
      );
    }

    // Dynamic import to prevent client-side bundling
    const { StoryScraper } = await import('@/lib/scraper/scraper');

    const scraper = new StoryScraper();
    const stories = await scraper.scrapeAuthorPage(url);

    return NextResponse.json({ stories });
  } catch (error) {
    console.error('Author page scraping error:', error);

    // Check if it's a ScraperError without importing at module level
    if (error && typeof error === 'object' && error.constructor.name === 'ScraperError') {
      return NextResponse.json(
        {
          error: (error as any).message,
          url: (error as any).url
        },
        { status: 400 }
      );
    }

    if (error instanceof Error) {
      // Handle specific error types
      if (error.message.includes('Invalid URL')) {
        return NextResponse.json(
          { error: 'Invalid URL provided' },
          { status: 400 }
        );
      }

      if (error.message.includes('not supported')) {
        return NextResponse.json(
          { error: 'Author page scraping is not supported for this website' },
          { status: 400 }
        );
      }

      if (error.message.includes('HTTP 404')) {
        return NextResponse.json(
          { error: 'Author page not found at the provided URL' },
          { status: 404 }
        );
      }

      if (error.message.includes('timeout')) {
        return NextResponse.json(
          { error: 'Request timed out while fetching content' },
          { status: 408 }
        );
      }
    }

    return NextResponse.json(
      { error: 'Failed to scrape author page. Please try again.' },
      { status: 500 }
    );
  }
}
frontend/src/app/scrape/bulk/progress/route.ts (Normal file, 93 lines)
@@ -0,0 +1,93 @@
|
||||
import { NextRequest } from 'next/server';
|
||||
|
||||
// Configure route timeout for long-running progress streams
|
||||
export const maxDuration = 900; // 15 minutes (900 seconds)
|
||||
|
||||
interface ProgressUpdate {
|
||||
type: 'progress' | 'completed' | 'error';
|
||||
current: number;
|
||||
total: number;
|
||||
message: string;
|
||||
url?: string;
|
||||
title?: string;
|
||||
author?: string;
|
||||
wordCount?: number;
|
||||
totalWordCount?: number;
|
||||
error?: string;
|
||||
combinedStory?: any;
|
||||
results?: any[];
|
||||
summary?: any;
|
||||
}
|
||||
|
||||
// Global progress storage (in production, use Redis or database)
|
||||
const progressStore = new Map<string, ProgressUpdate[]>();
|
||||
|
||||
export async function GET(request: NextRequest) {
|
||||
const searchParams = request.nextUrl.searchParams;
|
||||
const sessionId = searchParams.get('sessionId');
|
||||
|
||||
if (!sessionId) {
|
||||
return new Response('Session ID required', { status: 400 });
|
||||
}
|
||||
|
||||
// Set up Server-Sent Events
|
||||
const stream = new ReadableStream({
|
||||
start(controller) {
|
||||
const encoder = new TextEncoder();
|
||||
|
||||
// Send initial connection message
|
||||
const data = `data: ${JSON.stringify({ type: 'connected', sessionId })}\n\n`;
|
||||
controller.enqueue(encoder.encode(data));
|
||||
|
||||
// Check for progress updates every 500ms
|
||||
const interval = setInterval(() => {
|
||||
const updates = progressStore.get(sessionId);
|
||||
if (updates && updates.length > 0) {
|
||||
// Send all pending updates
|
||||
updates.forEach(update => {
|
||||
const data = `data: ${JSON.stringify(update)}\n\n`;
|
||||
controller.enqueue(encoder.encode(data));
|
||||
});
|
||||
|
||||
// Clear sent updates
|
||||
progressStore.delete(sessionId);
|
||||
|
||||
// If this was a completion or error, close the stream
|
||||
const lastUpdate = updates[updates.length - 1];
|
||||
if (lastUpdate.type === 'completed' || lastUpdate.type === 'error') {
|
||||
clearInterval(interval);
|
||||
controller.close();
|
||||
}
|
||||
}
|
||||
}, 500);
|
||||
|
||||
// Cleanup after timeout
|
||||
setTimeout(() => {
|
||||
clearInterval(interval);
|
||||
progressStore.delete(sessionId);
|
||||
controller.close();
|
||||
}, 900000); // 15 minutes
|
||||
}
|
||||
});
|
||||
|
||||
return new Response(stream, {
|
||||
headers: {
|
||||
'Content-Type': 'text/event-stream',
|
||||
'Cache-Control': 'no-cache',
|
||||
'Connection': 'keep-alive',
|
||||
'Access-Control-Allow-Origin': '*',
|
||||
'Access-Control-Allow-Headers': 'Cache-Control',
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
// Helper function for other routes to send progress updates
|
||||
export function sendProgressUpdate(sessionId: string, update: ProgressUpdate) {
|
||||
if (!progressStore.has(sessionId)) {
|
||||
progressStore.set(sessionId, []);
|
||||
}
|
||||
progressStore.get(sessionId)!.push(update);
|
||||
}
|
||||
|
||||
// Export the helper for other modules to use
|
||||
export { progressStore };
|
||||
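A minimal sketch of how a browser client might consume this progress stream. It assumes the route is served at `/scrape/bulk/progress` (as the file path above implies) and that updates follow the `ProgressUpdate` shape; treat it as an illustration rather than the app's actual progress UI code.

```typescript
// Hypothetical consumer for the Server-Sent Events progress stream.
function watchBulkProgress(sessionId: string, onUpdate: (u: any) => void): () => void {
  const source = new EventSource(
    `/scrape/bulk/progress?sessionId=${encodeURIComponent(sessionId)}`
  );

  source.onmessage = (event) => {
    const update = JSON.parse(event.data);
    onUpdate(update);

    // The route closes the stream after a terminal update; mirror that here.
    if (update.type === 'completed' || update.type === 'error') {
      source.close();
    }
  };

  source.onerror = () => source.close();

  // Return a cancel function so callers can stop listening early.
  return () => source.close();
}
```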
frontend/src/app/scrape/bulk/route.ts (Normal file, 564 lines)
@@ -0,0 +1,564 @@
|
||||
import { NextRequest, NextResponse } from 'next/server';
|
||||
|
||||
// Configure route timeout for long-running scraping operations
|
||||
export const maxDuration = 900; // 15 minutes (900 seconds)
|
||||
|
||||
// Import progress tracking helper
|
||||
async function sendProgressUpdate(sessionId: string, update: any) {
|
||||
try {
|
||||
// Dynamic import to avoid circular dependency
|
||||
const { sendProgressUpdate: sendUpdate } = await import('./progress/route');
|
||||
sendUpdate(sessionId, update);
|
||||
} catch (error) {
|
||||
console.warn('Failed to send progress update:', error);
|
||||
}
|
||||
}
|
||||
|
||||
interface BulkImportRequest {
|
||||
urls: string[];
|
||||
combineIntoOne?: boolean;
|
||||
sessionId?: string; // For progress tracking
|
||||
}
|
||||
|
||||
interface ImportResult {
|
||||
url: string;
|
||||
status: 'imported' | 'skipped' | 'error';
|
||||
reason?: string;
|
||||
title?: string;
|
||||
author?: string;
|
||||
error?: string;
|
||||
storyId?: string;
|
||||
}
|
||||
|
||||
interface BulkImportResponse {
|
||||
results: ImportResult[];
|
||||
summary: {
|
||||
total: number;
|
||||
imported: number;
|
||||
skipped: number;
|
||||
errors: number;
|
||||
};
|
||||
combinedStory?: {
|
||||
title: string;
|
||||
author: string;
|
||||
content: string;
|
||||
summary?: string;
|
||||
sourceUrl: string;
|
||||
tags?: string[];
|
||||
};
|
||||
}
|
||||
|
||||
// Background processing function for combined mode
|
||||
async function processCombinedMode(
|
||||
urls: string[],
|
||||
sessionId: string,
|
||||
authorization: string,
|
||||
scraper: any
|
||||
) {
|
||||
const results: ImportResult[] = [];
|
||||
let importedCount = 0;
|
||||
let errorCount = 0;
|
||||
|
||||
const combinedContent: string[] = [];
|
||||
let baseTitle = '';
|
||||
let baseAuthor = '';
|
||||
let baseSummary = '';
|
||||
let baseSourceUrl = '';
|
||||
const combinedTags = new Set<string>();
|
||||
let totalWordCount = 0;
|
||||
|
||||
// Send initial progress update
|
||||
await sendProgressUpdate(sessionId, {
|
||||
type: 'progress',
|
||||
current: 0,
|
||||
total: urls.length,
|
||||
message: `Starting to scrape ${urls.length} URLs for combining...`,
|
||||
totalWordCount: 0
|
||||
});
|
||||
|
||||
for (let i = 0; i < urls.length; i++) {
|
||||
const url = urls[i];
|
||||
console.log(`Scraping URL ${i + 1}/${urls.length} for combine: ${url}`);
|
||||
|
||||
// Send progress update
|
||||
await sendProgressUpdate(sessionId, {
|
||||
type: 'progress',
|
||||
current: i,
|
||||
total: urls.length,
|
||||
message: `Scraping URL ${i + 1} of ${urls.length}...`,
|
||||
url: url,
|
||||
totalWordCount
|
||||
});
|
||||
|
||||
try {
|
||||
const trimmedUrl = url.trim();
|
||||
if (!trimmedUrl) {
|
||||
results.push({
|
||||
url: url || 'Empty URL',
|
||||
status: 'error',
|
||||
error: 'Empty URL in combined mode'
|
||||
});
|
||||
errorCount++;
|
||||
continue;
|
||||
}
|
||||
|
||||
const scrapedStory = await scraper.scrapeStory(trimmedUrl);
|
||||
|
||||
// Check if we got content - this is required for combined mode
|
||||
if (!scrapedStory.content || scrapedStory.content.trim() === '') {
|
||||
results.push({
|
||||
url: trimmedUrl,
|
||||
status: 'error',
|
||||
error: 'No content found - required for combined mode'
|
||||
});
|
||||
errorCount++;
|
||||
continue;
|
||||
}
|
||||
|
||||
// Use first URL for base metadata (title can be empty for combined mode)
|
||||
if (i === 0) {
|
||||
baseTitle = scrapedStory.title || 'Combined Story';
|
||||
baseAuthor = scrapedStory.author || 'Unknown Author';
|
||||
baseSummary = scrapedStory.summary || '';
|
||||
baseSourceUrl = trimmedUrl;
|
||||
}
|
||||
|
||||
// Add content with URL separator
|
||||
combinedContent.push(`<!-- Content from: ${trimmedUrl} -->`);
|
||||
if (scrapedStory.title && i > 0) {
|
||||
combinedContent.push(`<h2>${scrapedStory.title}</h2>`);
|
||||
}
|
||||
combinedContent.push(scrapedStory.content);
|
||||
combinedContent.push('<hr/>'); // Visual separator between parts
|
||||
|
||||
// Calculate word count for this story
|
||||
const textContent = scrapedStory.content.replace(/<[^>]*>/g, ''); // Strip HTML
|
||||
const wordCount = textContent.split(/\s+/).filter((word: string) => word.length > 0).length;
|
||||
totalWordCount += wordCount;
|
||||
|
||||
// Collect tags from all stories
|
||||
if (scrapedStory.tags) {
|
||||
scrapedStory.tags.forEach((tag: string) => combinedTags.add(tag));
|
||||
}
|
||||
|
||||
results.push({
|
||||
url: trimmedUrl,
|
||||
status: 'imported',
|
||||
title: scrapedStory.title,
|
||||
author: scrapedStory.author
|
||||
});
|
||||
importedCount++;
|
||||
|
||||
// Send progress update with word count
|
||||
await sendProgressUpdate(sessionId, {
|
||||
type: 'progress',
|
||||
current: i + 1,
|
||||
total: urls.length,
|
||||
message: `Scraped "${scrapedStory.title}" (${wordCount.toLocaleString()} words)`,
|
||||
url: trimmedUrl,
|
||||
title: scrapedStory.title,
|
||||
author: scrapedStory.author,
|
||||
wordCount: wordCount,
|
||||
totalWordCount: totalWordCount
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
console.error(`Error processing URL ${url} in combined mode:`, error);
|
||||
results.push({
|
||||
url: url,
|
||||
status: 'error',
|
||||
error: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
errorCount++;
|
||||
}
|
||||
}
|
||||
|
||||
// If we have any errors, fail the entire combined operation
|
||||
if (errorCount > 0) {
|
||||
await sendProgressUpdate(sessionId, {
|
||||
type: 'error',
|
||||
current: urls.length,
|
||||
total: urls.length,
|
||||
message: 'Combined mode failed: some URLs could not be processed',
|
||||
error: `${errorCount} URLs failed to process`
|
||||
});
|
||||
return;
|
||||
}
|
||||
|
||||
// Check content size to prevent response size issues
|
||||
const combinedContentString = combinedContent.join('\n');
|
||||
const contentSizeInMB = new Blob([combinedContentString]).size / (1024 * 1024);
|
||||
|
||||
console.log(`Combined content size: ${contentSizeInMB.toFixed(2)} MB`);
|
||||
console.log(`Combined content character length: ${combinedContentString.length}`);
|
||||
console.log(`Combined content parts count: ${combinedContent.length}`);
|
||||
|
||||
// Return the combined story data via progress update
|
||||
const combinedStory = {
|
||||
title: baseTitle,
|
||||
author: baseAuthor,
|
||||
content: contentSizeInMB > 10 ?
|
||||
combinedContentString.substring(0, Math.floor(combinedContentString.length * (10 / contentSizeInMB))) + '\n\n<!-- Content truncated due to size limit -->' :
|
||||
combinedContentString,
|
||||
summary: contentSizeInMB > 10 ? baseSummary + ' (Content truncated due to size limit)' : baseSummary,
|
||||
sourceUrl: baseSourceUrl,
|
||||
tags: Array.from(combinedTags)
|
||||
};
|
||||
|
||||
// Send completion notification for combine mode
|
||||
await sendProgressUpdate(sessionId, {
|
||||
type: 'completed',
|
||||
current: urls.length,
|
||||
total: urls.length,
|
||||
message: `Combined scraping completed: ${totalWordCount.toLocaleString()} words from ${importedCount} stories`,
|
||||
totalWordCount: totalWordCount,
|
||||
combinedStory: combinedStory
|
||||
});
|
||||
|
||||
console.log(`Combined scraping completed: ${importedCount} URLs combined into one story`);
|
||||
}
|
||||
|
||||
// Background processing function for individual mode
|
||||
async function processIndividualMode(
|
||||
urls: string[],
|
||||
sessionId: string,
|
||||
authorization: string,
|
||||
scraper: any
|
||||
) {
|
||||
const results: ImportResult[] = [];
|
||||
let importedCount = 0;
|
||||
let skippedCount = 0;
|
||||
let errorCount = 0;
|
||||
|
||||
await sendProgressUpdate(sessionId, {
|
||||
type: 'progress',
|
||||
current: 0,
|
||||
total: urls.length,
|
||||
message: `Starting to import ${urls.length} URLs individually...`
|
||||
});
|
||||
|
||||
for (let i = 0; i < urls.length; i++) {
|
||||
const url = urls[i];
|
||||
console.log(`Processing URL ${i + 1}/${urls.length}: ${url}`);
|
||||
|
||||
await sendProgressUpdate(sessionId, {
|
||||
type: 'progress',
|
||||
current: i,
|
||||
total: urls.length,
|
||||
message: `Processing URL ${i + 1} of ${urls.length}...`,
|
||||
url: url
|
||||
});
|
||||
|
||||
try {
|
||||
// Validate URL format
|
||||
if (!url || typeof url !== 'string' || url.trim() === '') {
|
||||
results.push({
|
||||
url: url || 'Empty URL',
|
||||
status: 'error',
|
||||
error: 'Invalid URL format'
|
||||
});
|
||||
errorCount++;
|
||||
continue;
|
||||
}
|
||||
|
||||
const trimmedUrl = url.trim();
|
||||
|
||||
// Scrape the story
|
||||
const scrapedStory = await scraper.scrapeStory(trimmedUrl);
|
||||
|
||||
// Validate required fields
|
||||
if (!scrapedStory.title || !scrapedStory.author || !scrapedStory.content) {
|
||||
const missingFields = [];
|
||||
if (!scrapedStory.title) missingFields.push('title');
|
||||
if (!scrapedStory.author) missingFields.push('author');
|
||||
if (!scrapedStory.content) missingFields.push('content');
|
||||
|
||||
results.push({
|
||||
url: trimmedUrl,
|
||||
status: 'skipped',
|
||||
reason: `Missing required fields: ${missingFields.join(', ')}`,
|
||||
title: scrapedStory.title,
|
||||
author: scrapedStory.author
|
||||
});
|
||||
skippedCount++;
|
||||
continue;
|
||||
}
|
||||
|
||||
// Check for duplicates using query parameters
|
||||
try {
|
||||
const duplicateCheckUrl = `http://backend:8080/api/stories/check-duplicate`;
|
||||
const params = new URLSearchParams({
|
||||
title: scrapedStory.title,
|
||||
authorName: scrapedStory.author
|
||||
});
|
||||
|
||||
const duplicateCheckResponse = await fetch(`${duplicateCheckUrl}?${params.toString()}`, {
|
||||
method: 'GET',
|
||||
headers: {
|
||||
'Authorization': authorization,
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
});
|
||||
|
||||
if (duplicateCheckResponse.ok) {
|
||||
const duplicateResult = await duplicateCheckResponse.json();
|
||||
if (duplicateResult.hasDuplicates) {
|
||||
results.push({
|
||||
url: trimmedUrl,
|
||||
status: 'skipped',
|
||||
reason: `Duplicate story found (${duplicateResult.count} existing)`,
|
||||
title: scrapedStory.title,
|
||||
author: scrapedStory.author
|
||||
});
|
||||
skippedCount++;
|
||||
continue;
|
||||
}
|
||||
}
|
||||
} catch (error) {
|
||||
console.warn('Duplicate check failed:', error);
|
||||
// Continue with import if duplicate check fails
|
||||
}
|
||||
|
||||
// Create the story
|
||||
try {
|
||||
const storyData = {
|
||||
title: scrapedStory.title,
|
||||
summary: scrapedStory.summary || undefined,
|
||||
contentHtml: scrapedStory.content,
|
||||
sourceUrl: scrapedStory.sourceUrl || trimmedUrl,
|
||||
authorName: scrapedStory.author,
|
||||
tagNames: scrapedStory.tags && scrapedStory.tags.length > 0 ? scrapedStory.tags : undefined,
|
||||
};
|
||||
|
||||
const createUrl = `http://backend:8080/api/stories`;
|
||||
const createResponse = await fetch(createUrl, {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Authorization': authorization,
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
body: JSON.stringify(storyData),
|
||||
});
|
||||
|
||||
if (!createResponse.ok) {
|
||||
const errorData = await createResponse.json();
|
||||
throw new Error(errorData.message || 'Failed to create story');
|
||||
}
|
||||
|
||||
const createdStory = await createResponse.json();
|
||||
|
||||
results.push({
|
||||
url: trimmedUrl,
|
||||
status: 'imported',
|
||||
title: scrapedStory.title,
|
||||
author: scrapedStory.author,
|
||||
storyId: createdStory.id
|
||||
});
|
||||
importedCount++;
|
||||
|
||||
console.log(`Successfully imported: ${scrapedStory.title} by ${scrapedStory.author} (ID: ${createdStory.id})`);
|
||||
|
||||
// Send progress update for successful import
|
||||
await sendProgressUpdate(sessionId, {
|
||||
type: 'progress',
|
||||
current: i + 1,
|
||||
total: urls.length,
|
||||
message: `Imported "${scrapedStory.title}" by ${scrapedStory.author}`,
|
||||
url: trimmedUrl,
|
||||
title: scrapedStory.title,
|
||||
author: scrapedStory.author
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
console.error(`Failed to create story for ${trimmedUrl}:`, error);
|
||||
|
||||
let errorMessage = 'Failed to create story';
|
||||
if (error instanceof Error) {
|
||||
errorMessage = error.message;
|
||||
}
|
||||
|
||||
results.push({
|
||||
url: trimmedUrl,
|
||||
status: 'error',
|
||||
error: errorMessage,
|
||||
title: scrapedStory.title,
|
||||
author: scrapedStory.author
|
||||
});
|
||||
errorCount++;
|
||||
}
|
||||
|
||||
} catch (error) {
|
||||
console.error(`Error processing URL ${url}:`, error);
|
||||
|
||||
let errorMessage = 'Unknown error';
|
||||
if (error instanceof Error) {
|
||||
errorMessage = error.message;
|
||||
}
|
||||
|
||||
results.push({
|
||||
url: url,
|
||||
status: 'error',
|
||||
error: errorMessage
|
||||
});
|
||||
errorCount++;
|
||||
}
|
||||
}
|
||||
|
||||
// Send completion notification
|
||||
await sendProgressUpdate(sessionId, {
|
||||
type: 'completed',
|
||||
current: urls.length,
|
||||
total: urls.length,
|
||||
message: `Bulk import completed: ${importedCount} imported, ${skippedCount} skipped, ${errorCount} errors`,
|
||||
results: results,
|
||||
summary: {
|
||||
total: urls.length,
|
||||
imported: importedCount,
|
||||
skipped: skippedCount,
|
||||
errors: errorCount
|
||||
}
|
||||
});
|
||||
|
||||
console.log(`Bulk import completed: ${importedCount} imported, ${skippedCount} skipped, ${errorCount} errors`);
|
||||
|
||||
// Trigger Typesense reindex if any stories were imported
|
||||
if (importedCount > 0) {
|
||||
try {
|
||||
console.log('Triggering Typesense reindex after bulk import...');
|
||||
const reindexUrl = `http://backend:8080/api/stories/reindex-typesense`;
|
||||
const reindexResponse = await fetch(reindexUrl, {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Authorization': authorization,
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
});
|
||||
|
||||
if (reindexResponse.ok) {
|
||||
const reindexResult = await reindexResponse.json();
|
||||
console.log('Typesense reindex completed:', reindexResult);
|
||||
} else {
|
||||
console.warn('Typesense reindex failed:', reindexResponse.status);
|
||||
}
|
||||
} catch (error) {
|
||||
console.warn('Failed to trigger Typesense reindex:', error);
|
||||
// Don't fail the whole request if reindex fails
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Background processing function
|
||||
async function processBulkImport(
|
||||
urls: string[],
|
||||
combineIntoOne: boolean,
|
||||
sessionId: string,
|
||||
authorization: string
|
||||
) {
|
||||
try {
|
||||
// Dynamic imports to prevent client-side bundling
|
||||
const { StoryScraper } = await import('@/lib/scraper/scraper');
|
||||
|
||||
const scraper = new StoryScraper();
|
||||
|
||||
console.log(`Starting bulk scraping for ${urls.length} URLs${combineIntoOne ? ' (combine mode)' : ''}`);
|
||||
console.log(`Session ID: ${sessionId}`);
|
||||
|
||||
// Quick test to verify backend connectivity
|
||||
try {
|
||||
console.log(`Testing backend connectivity at: http://backend:8080/api/stories/check-duplicate`);
|
||||
const testResponse = await fetch(`http://backend:8080/api/stories/check-duplicate?title=test&authorName=test`, {
|
||||
method: 'GET',
|
||||
headers: {
|
||||
'Authorization': authorization,
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
});
|
||||
console.log(`Backend test response status: ${testResponse.status}`);
|
||||
} catch (error) {
|
||||
console.error(`Backend connectivity test failed:`, error);
|
||||
}
|
||||
|
||||
// Handle combined mode
|
||||
if (combineIntoOne) {
|
||||
await processCombinedMode(urls, sessionId, authorization, scraper);
|
||||
} else {
|
||||
// Normal individual processing mode
|
||||
await processIndividualMode(urls, sessionId, authorization, scraper);
|
||||
}
|
||||
|
||||
} catch (error) {
|
||||
console.error('Background bulk import error:', error);
|
||||
await sendProgressUpdate(sessionId, {
|
||||
type: 'error',
|
||||
current: 0,
|
||||
total: urls.length,
|
||||
message: 'Bulk import failed due to an error',
|
||||
error: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
export async function POST(request: NextRequest) {
|
||||
try {
|
||||
// Check for authentication
|
||||
const authorization = request.headers.get('authorization');
|
||||
if (!authorization) {
|
||||
return NextResponse.json(
|
||||
{ error: 'Authentication required for bulk import' },
|
||||
{ status: 401 }
|
||||
);
|
||||
}
|
||||
|
||||
const body = await request.json();
|
||||
const { urls, combineIntoOne = false, sessionId } = body as BulkImportRequest;
|
||||
|
||||
if (!urls || !Array.isArray(urls) || urls.length === 0) {
|
||||
return NextResponse.json(
|
||||
{ error: 'URLs array is required and must not be empty' },
|
||||
{ status: 400 }
|
||||
);
|
||||
}
|
||||
|
||||
if (urls.length > 200) {
|
||||
return NextResponse.json(
|
||||
{ error: 'Maximum 200 URLs allowed per bulk import' },
|
||||
{ status: 400 }
|
||||
);
|
||||
}
|
||||
|
||||
if (!sessionId) {
|
||||
return NextResponse.json(
|
||||
{ error: 'Session ID is required for progress tracking' },
|
||||
{ status: 400 }
|
||||
);
|
||||
}
|
||||
|
||||
// Start the background processing
|
||||
processBulkImport(urls, combineIntoOne, sessionId, authorization).catch(error => {
|
||||
console.error('Failed to start background processing:', error);
|
||||
});
|
||||
|
||||
// Return immediately with session info
|
||||
return NextResponse.json({
|
||||
message: 'Bulk import started',
|
||||
sessionId: sessionId,
|
||||
totalUrls: urls.length,
|
||||
combineMode: combineIntoOne
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
console.error('Bulk import initialization error:', error);
|
||||
|
||||
if (error instanceof Error) {
|
||||
return NextResponse.json(
|
||||
{ error: `Bulk import failed to start: ${error.message}` },
|
||||
{ status: 500 }
|
||||
);
|
||||
}
|
||||
|
||||
return NextResponse.json(
|
||||
{ error: 'Bulk import failed to start due to an unknown error' },
|
||||
{ status: 500 }
|
||||
);
|
||||
}
|
||||
}
|
||||
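Putting the two routes together, a client would open the progress stream first and then start the job. The sketch below is illustrative, not the app's bulk-import page: it assumes a bearer token is at hand and a subscriber like the `watchBulkProgress` sketch above (or the app's own progress handling).

```typescript
// Hypothetical kickoff: start listening for progress, then POST the URL list.
async function startBulkImport(urls: string[], token: string, combineIntoOne = false) {
  const sessionId = crypto.randomUUID();

  // Subscribe before starting so no early updates are missed.
  const stop = watchBulkProgress(sessionId, (update) => {
    console.log(`${update.current}/${update.total}: ${update.message}`);
  });

  const response = await fetch('/scrape/bulk', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${token}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ urls, combineIntoOne, sessionId }),
  });

  if (!response.ok) {
    stop();
    throw new Error((await response.json()).error || 'Bulk import failed to start');
  }

  return response.json(); // { message, sessionId, totalUrls, combineMode }
}
```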
frontend/src/app/scrape/story/route.ts (Normal file, 85 lines)
@@ -0,0 +1,85 @@
|
||||
import { NextRequest, NextResponse } from 'next/server';
|
||||
|
||||
export async function POST(request: NextRequest) {
|
||||
try {
|
||||
const body = await request.json();
|
||||
const { url } = body;
|
||||
|
||||
if (!url || typeof url !== 'string') {
|
||||
return NextResponse.json(
|
||||
{ error: 'URL is required and must be a string' },
|
||||
{ status: 400 }
|
||||
);
|
||||
}
|
||||
|
||||
// Dynamic import to prevent client-side bundling
|
||||
const { StoryScraper } = await import('@/lib/scraper/scraper');
|
||||
const { ScraperError } = await import('@/lib/scraper/types');
|
||||
|
||||
const scraper = new StoryScraper();
|
||||
const story = await scraper.scrapeStory(url);
|
||||
|
||||
// Debug logging
|
||||
console.log('Scraped story data:', {
|
||||
url: url,
|
||||
title: story.title,
|
||||
author: story.author,
|
||||
summary: story.summary,
|
||||
contentLength: story.content?.length || 0,
|
||||
contentPreview: story.content?.substring(0, 200) + '...',
|
||||
tags: story.tags,
|
||||
coverImage: story.coverImage
|
||||
});
|
||||
|
||||
return NextResponse.json(story);
|
||||
} catch (error) {
|
||||
console.error('Story scraping error:', error);
|
||||
|
||||
// Check if it's a ScraperError without importing at module level
|
||||
if (error && typeof error === 'object' && error.constructor.name === 'ScraperError') {
|
||||
return NextResponse.json(
|
||||
{
|
||||
error: (error as any).message,
|
||||
url: (error as any).url
|
||||
},
|
||||
{ status: 400 }
|
||||
);
|
||||
}
|
||||
|
||||
if (error instanceof Error) {
|
||||
// Handle specific error types
|
||||
if (error.message.includes('Invalid URL')) {
|
||||
return NextResponse.json(
|
||||
{ error: 'Invalid URL provided' },
|
||||
{ status: 400 }
|
||||
);
|
||||
}
|
||||
|
||||
if (error.message.includes('Unsupported site')) {
|
||||
return NextResponse.json(
|
||||
{ error: 'This website is not supported for scraping' },
|
||||
{ status: 400 }
|
||||
);
|
||||
}
|
||||
|
||||
if (error.message.includes('HTTP 404')) {
|
||||
return NextResponse.json(
|
||||
{ error: 'Story not found at the provided URL' },
|
||||
{ status: 404 }
|
||||
);
|
||||
}
|
||||
|
||||
if (error.message.includes('timeout')) {
|
||||
return NextResponse.json(
|
||||
{ error: 'Request timed out while fetching content' },
|
||||
{ status: 408 }
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
return NextResponse.json(
|
||||
{ error: 'Failed to scrape story. Please try again.' },
|
||||
{ status: 500 }
|
||||
);
|
||||
}
|
||||
}
|
||||
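Because this route maps scraper failures onto specific HTTP statuses, callers can branch on them. A small, illustrative helper (not part of the diff) that turns those statuses into user-facing messages:

```typescript
// Illustrative mapping of the /scrape/story error statuses used above.
function describeScrapeFailure(status: number, serverError?: string): string {
  switch (status) {
    case 400: return serverError || 'The URL is invalid or the site is not supported.';
    case 404: return 'No story was found at that URL.';
    case 408: return 'The site took too long to respond. Try again later.';
    default:  return serverError || 'Failed to scrape story. Please try again.';
  }
}
```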
@@ -4,7 +4,7 @@ import { useState, useEffect } from 'react';
|
||||
import AppLayout from '../../components/layout/AppLayout';
|
||||
import { useTheme } from '../../lib/theme';
|
||||
import Button from '../../components/ui/Button';
|
||||
import { storyApi, authorApi } from '../../lib/api';
|
||||
import { storyApi, authorApi, databaseApi } from '../../lib/api';
|
||||
|
||||
type FontFamily = 'serif' | 'sans' | 'mono';
|
||||
type FontSize = 'small' | 'medium' | 'large' | 'extra-large';
|
||||
@@ -15,6 +15,7 @@ interface Settings {
|
||||
fontFamily: FontFamily;
|
||||
fontSize: FontSize;
|
||||
readingWidth: ReadingWidth;
|
||||
readingSpeed: number; // words per minute
|
||||
}
|
||||
|
||||
const defaultSettings: Settings = {
|
||||
@@ -22,6 +23,7 @@ const defaultSettings: Settings = {
|
||||
fontFamily: 'serif',
|
||||
fontSize: 'medium',
|
||||
readingWidth: 'medium',
|
||||
readingSpeed: 200,
|
||||
};
|
||||
|
||||
export default function SettingsPage() {
|
||||
@@ -37,6 +39,15 @@ export default function SettingsPage() {
|
||||
});
|
||||
const [authorsSchema, setAuthorsSchema] = useState<any>(null);
|
||||
const [showSchema, setShowSchema] = useState(false);
|
||||
const [databaseStatus, setDatabaseStatus] = useState<{
|
||||
completeBackup: { loading: boolean; message: string; success?: boolean };
|
||||
completeRestore: { loading: boolean; message: string; success?: boolean };
|
||||
completeClear: { loading: boolean; message: string; success?: boolean };
|
||||
}>({
|
||||
completeBackup: { loading: false, message: '' },
|
||||
completeRestore: { loading: false, message: '' },
|
||||
completeClear: { loading: false, message: '' }
|
||||
});
|
||||
|
||||
// Load settings from localStorage on mount
|
||||
useEffect(() => {
|
||||
@@ -155,6 +166,147 @@ export default function SettingsPage() {
|
||||
}
|
||||
};
|
||||
|
||||
|
||||
const handleCompleteBackup = async () => {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeBackup: { loading: true, message: 'Creating complete backup...', success: undefined }
|
||||
}));
|
||||
|
||||
try {
|
||||
const backupBlob = await databaseApi.backupComplete();
|
||||
|
||||
// Create download link
|
||||
const url = window.URL.createObjectURL(backupBlob);
|
||||
const link = document.createElement('a');
|
||||
link.href = url;
|
||||
|
||||
const timestamp = new Date().toISOString().replace(/[:.]/g, '-').slice(0, 19);
|
||||
link.download = `storycove_complete_backup_${timestamp}.zip`;
|
||||
|
||||
document.body.appendChild(link);
|
||||
link.click();
|
||||
document.body.removeChild(link);
|
||||
window.URL.revokeObjectURL(url);
|
||||
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeBackup: { loading: false, message: 'Complete backup downloaded successfully', success: true }
|
||||
}));
|
||||
} catch (error: any) {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeBackup: { loading: false, message: error.message || 'Complete backup failed', success: false }
|
||||
}));
|
||||
}
|
||||
|
||||
// Clear message after 5 seconds
|
||||
setTimeout(() => {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeBackup: { loading: false, message: '', success: undefined }
|
||||
}));
|
||||
}, 5000);
|
||||
};
|
||||
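The backup handler above relies on `databaseApi.backupComplete()` returning a `Blob`. That function lives elsewhere in `lib/api` and is not part of this diff; as a rough, assumed sketch of what such a call might look like (the endpoint path is hypothetical):

```typescript
// Assumed shape of the backup call used above: fetch the archive as a Blob.
async function backupComplete(token: string): Promise<Blob> {
  const response = await fetch('/api/database/backup-complete', {
    method: 'GET',
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!response.ok) {
    throw new Error('Complete backup failed');
  }
  return response.blob(); // The caller turns this into a download link.
}
```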
|
||||
const handleCompleteRestore = async (event: React.ChangeEvent<HTMLInputElement>) => {
|
||||
const file = event.target.files?.[0];
|
||||
if (!file) return;
|
||||
|
||||
// Reset the input so the same file can be selected again
|
||||
event.target.value = '';
|
||||
|
||||
if (!file.name.endsWith('.zip')) {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeRestore: { loading: false, message: 'Please select a .zip file', success: false }
|
||||
}));
|
||||
return;
|
||||
}
|
||||
|
||||
const confirmed = window.confirm(
|
||||
'Are you sure you want to restore the complete backup? This will PERMANENTLY DELETE all current data AND files (cover images, avatars) and replace them with the backup data. This action cannot be undone!'
|
||||
);
|
||||
|
||||
if (!confirmed) return;
|
||||
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeRestore: { loading: true, message: 'Restoring complete backup...', success: undefined }
|
||||
}));
|
||||
|
||||
try {
|
||||
const result = await databaseApi.restoreComplete(file);
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeRestore: {
|
||||
loading: false,
|
||||
message: result.message,
|
||||
success: result.success
|
||||
}
|
||||
}));
|
||||
} catch (error: any) {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeRestore: { loading: false, message: error.message || 'Complete restore failed', success: false }
|
||||
}));
|
||||
}
|
||||
|
||||
// Clear message after 10 seconds for restore (longer because it's important)
|
||||
setTimeout(() => {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeRestore: { loading: false, message: '', success: undefined }
|
||||
}));
|
||||
}, 10000);
|
||||
};
|
||||
|
||||
const handleCompleteClear = async () => {
|
||||
const confirmed = window.confirm(
|
||||
'Are you ABSOLUTELY SURE you want to clear the entire database AND all files? This will PERMANENTLY DELETE ALL stories, authors, series, tags, collections, AND all uploaded images (covers, avatars). This action cannot be undone!'
|
||||
);
|
||||
|
||||
if (!confirmed) return;
|
||||
|
||||
const doubleConfirmed = window.confirm(
|
||||
'This is your final warning! Clicking OK will DELETE EVERYTHING in your StoryCove database AND all uploaded files. Are you completely certain you want to proceed?'
|
||||
);
|
||||
|
||||
if (!doubleConfirmed) return;
|
||||
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeClear: { loading: true, message: 'Clearing database and files...', success: undefined }
|
||||
}));
|
||||
|
||||
try {
|
||||
const result = await databaseApi.clearComplete();
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeClear: {
|
||||
loading: false,
|
||||
message: result.success
|
||||
? `Database and files cleared successfully. Deleted ${result.deletedRecords} records.`
|
||||
: result.message,
|
||||
success: result.success
|
||||
}
|
||||
}));
|
||||
} catch (error: any) {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeClear: { loading: false, message: error.message || 'Clear operation failed', success: false }
|
||||
}));
|
||||
}
|
||||
|
||||
// Clear message after 10 seconds for clear (longer because it's important)
|
||||
setTimeout(() => {
|
||||
setDatabaseStatus(prev => ({
|
||||
...prev,
|
||||
completeClear: { loading: false, message: '', success: undefined }
|
||||
}));
|
||||
}, 10000);
|
||||
};
|
||||
|
||||
return (
|
||||
<AppLayout>
|
||||
<div className="max-w-2xl mx-auto space-y-8">
|
||||
@@ -288,6 +440,33 @@ export default function SettingsPage() {
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Reading Speed */}
|
||||
<div>
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
Reading Speed (words per minute)
|
||||
</label>
|
||||
<div className="flex items-center gap-4">
|
||||
<input
|
||||
type="range"
|
||||
min="100"
|
||||
max="400"
|
||||
step="25"
|
||||
value={settings.readingSpeed}
|
||||
onChange={(e) => updateSetting('readingSpeed', parseInt(e.target.value))}
|
||||
className="flex-1 h-2 bg-gray-200 rounded-lg appearance-none cursor-pointer dark:bg-gray-700"
|
||||
/>
|
||||
<div className="min-w-[80px] text-center">
|
||||
<span className="text-lg font-medium theme-header">{settings.readingSpeed}</span>
|
||||
<div className="text-xs theme-text">WPM</div>
|
||||
</div>
|
||||
</div>
|
||||
<div className="flex justify-between text-xs theme-text mt-1">
|
||||
<span>Slow (100)</span>
|
||||
<span>Average (200)</span>
|
||||
<span>Fast (400)</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
@@ -434,6 +613,111 @@ export default function SettingsPage() {
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Database Management */}
|
||||
<div className="theme-card theme-shadow rounded-lg p-6">
|
||||
<h2 className="text-xl font-semibold theme-header mb-4">Database Management</h2>
|
||||
<p className="theme-text mb-6">
|
||||
Backup, restore, or clear your StoryCove database and files. These comprehensive operations include both your data and uploaded images.
|
||||
</p>
|
||||
|
||||
<div className="space-y-6">
|
||||
{/* Complete Backup Section */}
|
||||
<div className="border theme-border rounded-lg p-4 border-blue-200 dark:border-blue-800">
|
||||
<h3 className="text-lg font-semibold theme-header mb-3">📦 Create Backup</h3>
|
||||
<p className="text-sm theme-text mb-3">
|
||||
Download a complete backup as a ZIP file. This includes your database AND all uploaded files (cover images, avatars). This is a comprehensive backup of your entire StoryCove installation.
|
||||
</p>
|
||||
<Button
|
||||
onClick={handleCompleteBackup}
|
||||
disabled={databaseStatus.completeBackup.loading}
|
||||
loading={databaseStatus.completeBackup.loading}
|
||||
variant="primary"
|
||||
className="w-full sm:w-auto"
|
||||
>
|
||||
{databaseStatus.completeBackup.loading ? 'Creating Backup...' : 'Download Backup'}
|
||||
</Button>
|
||||
{databaseStatus.completeBackup.message && (
|
||||
<div className={`text-sm p-2 rounded mt-3 ${
|
||||
databaseStatus.completeBackup.success
|
||||
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
||||
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
||||
}`}>
|
||||
{databaseStatus.completeBackup.message}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Restore Section */}
|
||||
<div className="border theme-border rounded-lg p-4 border-orange-200 dark:border-orange-800">
|
||||
<h3 className="text-lg font-semibold theme-header mb-3">📥 Restore Backup</h3>
|
||||
<p className="text-sm theme-text mb-3">
|
||||
<strong className="text-orange-600 dark:text-orange-400">⚠️ Warning:</strong> This will completely replace your current database AND all files with the backup. All existing data and uploaded files will be permanently deleted.
|
||||
</p>
|
||||
<div className="flex items-center gap-3">
|
||||
<input
|
||||
type="file"
|
||||
accept=".zip"
|
||||
onChange={handleCompleteRestore}
|
||||
disabled={databaseStatus.completeRestore.loading}
|
||||
className="flex-1 text-sm theme-text file:mr-4 file:py-2 file:px-4 file:rounded-lg file:border-0 file:text-sm file:font-medium file:theme-accent-bg file:text-white hover:file:bg-opacity-90 file:cursor-pointer"
|
||||
/>
|
||||
</div>
|
||||
{databaseStatus.completeRestore.message && (
|
||||
<div className={`text-sm p-2 rounded mt-3 ${
|
||||
databaseStatus.completeRestore.success
|
||||
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
||||
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
||||
}`}>
|
||||
{databaseStatus.completeRestore.message}
|
||||
</div>
|
||||
)}
|
||||
{databaseStatus.completeRestore.loading && (
|
||||
<div className="text-sm theme-text mt-3 flex items-center gap-2">
|
||||
<div className="animate-spin w-4 h-4 border-2 border-blue-500 border-t-transparent rounded-full"></div>
|
||||
Restoring backup...
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Clear Everything Section */}
|
||||
<div className="border theme-border rounded-lg p-4 border-red-200 dark:border-red-800 bg-red-50 dark:bg-red-900/10">
|
||||
<h3 className="text-lg font-semibold theme-header mb-3">🗑️ Clear Everything</h3>
|
||||
<p className="text-sm theme-text mb-3">
|
||||
<strong className="text-red-600 dark:text-red-400">⚠️ Danger Zone:</strong> This will permanently delete ALL data from your database AND all uploaded files (cover images, avatars). Everything will be completely removed. This action cannot be undone!
|
||||
</p>
|
||||
<Button
|
||||
onClick={handleCompleteClear}
|
||||
disabled={databaseStatus.completeClear.loading}
|
||||
loading={databaseStatus.completeClear.loading}
|
||||
variant="secondary"
|
||||
className="w-full sm:w-auto bg-red-700 hover:bg-red-800 text-white border-red-700"
|
||||
>
|
||||
{databaseStatus.completeClear.loading ? 'Clearing Everything...' : 'Clear Everything'}
|
||||
</Button>
|
||||
{databaseStatus.completeClear.message && (
|
||||
<div className={`text-sm p-2 rounded mt-3 ${
|
||||
databaseStatus.completeClear.success
|
||||
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
|
||||
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
|
||||
}`}>
|
||||
{databaseStatus.completeClear.message}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
<div className="text-sm theme-text bg-blue-50 dark:bg-blue-900/20 p-3 rounded-lg">
|
||||
<p className="font-medium mb-1">💡 Best Practices:</p>
|
||||
<ul className="text-xs space-y-1 ml-4">
|
||||
<li>• <strong>Always backup</strong> before performing restore or clear operations</li>
|
||||
<li>• <strong>Store backups safely</strong> in multiple locations for important data</li>
|
||||
<li>• <strong>Test restores</strong> in a development environment when possible</li>
|
||||
<li>• <strong>Backup files (.zip)</strong> contain both database and all uploaded files</li>
|
||||
<li>• <strong>Verify backup files</strong> are complete before relying on them</li>
|
||||
</ul>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Actions */}
|
||||
<div className="flex justify-end gap-4">
|
||||
<Button
|
||||
|
||||
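Note: the three handlers above (handleCompleteBackup, handleCompleteRestore, handleCompleteClear) rely on only a small surface of `databaseApi`. A minimal TypeScript sketch of that assumed contract, inferred from the call sites rather than taken from the actual `lib/api` module:

```typescript
// Hypothetical sketch of the databaseApi members used by the settings page handlers.
// Shapes are inferred from the handlers above; the real lib/api module may differ.
export interface CompleteRestoreResult {
  success: boolean;
  message: string;
}

export interface CompleteClearResult {
  success: boolean;
  message: string;
  deletedRecords: number; // shown in the "Deleted N records" status message
}

export interface DatabaseApi {
  // Full backup (database plus uploaded files) returned as a ZIP blob for download.
  backupComplete(): Promise<Blob>;
  // Replaces all current data and files with the contents of the uploaded ZIP.
  restoreComplete(file: File): Promise<CompleteRestoreResult>;
  // Deletes every record and every uploaded file.
  clearComplete(): Promise<CompleteClearResult>;
}
```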
@@ -9,6 +9,7 @@ import { Story, Collection } from '../../../../types/api';
|
||||
import AppLayout from '../../../../components/layout/AppLayout';
|
||||
import Button from '../../../../components/ui/Button';
|
||||
import LoadingSpinner from '../../../../components/ui/LoadingSpinner';
|
||||
import { calculateReadingTime } from '../../../../lib/settings';
|
||||
|
||||
export default function StoryDetailPage() {
|
||||
const params = useParams();
|
||||
@@ -20,6 +21,7 @@ export default function StoryDetailPage() {
|
||||
const [collections, setCollections] = useState<Collection[]>([]);
|
||||
const [loading, setLoading] = useState(true);
|
||||
const [updating, setUpdating] = useState(false);
|
||||
const [isExporting, setIsExporting] = useState(false);
|
||||
|
||||
useEffect(() => {
|
||||
const loadStoryData = async () => {
|
||||
@@ -64,6 +66,53 @@ export default function StoryDetailPage() {
|
||||
}
|
||||
};
|
||||
|
||||
const handleEPUBExport = async () => {
|
||||
if (!story) return;
|
||||
|
||||
setIsExporting(true);
|
||||
try {
|
||||
const token = localStorage.getItem('auth-token');
|
||||
const response = await fetch(`/api/stories/${story.id}/epub`, {
|
||||
method: 'GET',
|
||||
headers: {
|
||||
'Authorization': token ? `Bearer ${token}` : '',
|
||||
},
|
||||
});
|
||||
|
||||
if (response.ok) {
|
||||
const blob = await response.blob();
|
||||
const url = window.URL.createObjectURL(blob);
|
||||
const link = document.createElement('a');
|
||||
link.href = url;
|
||||
|
||||
// Get filename from Content-Disposition header or create default
|
||||
const contentDisposition = response.headers.get('Content-Disposition');
|
||||
let filename = `${story.title}.epub`;
|
||||
if (contentDisposition) {
|
||||
const match = contentDisposition.match(/filename[^;=\n]*=((['"]).*?\2|[^;\n]*)/);
|
||||
if (match && match[1]) {
|
||||
filename = match[1].replace(/['"]/g, '');
|
||||
}
|
||||
}
|
||||
|
||||
link.download = filename;
|
||||
document.body.appendChild(link);
|
||||
link.click();
|
||||
window.URL.revokeObjectURL(url);
|
||||
document.body.removeChild(link);
|
||||
} else if (response.status === 401 || response.status === 403) {
|
||||
alert('Authentication required. Please log in.');
|
||||
} else {
|
||||
throw new Error('Failed to export EPUB');
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Error exporting EPUB:', error);
|
||||
alert('Failed to export EPUB. Please try again.');
|
||||
} finally {
|
||||
setIsExporting(false);
|
||||
}
|
||||
};
|
||||
|
||||
const formatDate = (dateString: string) => {
|
||||
return new Date(dateString).toLocaleDateString('en-US', {
|
||||
year: 'numeric',
|
||||
@@ -73,9 +122,7 @@ export default function StoryDetailPage() {
|
||||
};
|
||||
|
||||
const estimateReadingTime = (wordCount: number) => {
|
||||
const wordsPerMinute = 200; // Average reading speed
|
||||
const minutes = Math.ceil(wordCount / wordsPerMinute);
|
||||
return minutes;
|
||||
return calculateReadingTime(wordCount);
|
||||
};
|
||||
|
||||
if (loading) {
|
||||
@@ -359,6 +406,14 @@ export default function StoryDetailPage() {
|
||||
>
|
||||
📚 Start Reading
|
||||
</Button>
|
||||
<Button
|
||||
onClick={handleEPUBExport}
|
||||
variant="ghost"
|
||||
size="lg"
|
||||
disabled={isExporting}
|
||||
>
|
||||
{isExporting ? 'Exporting...' : '📖 Export EPUB'}
|
||||
</Button>
|
||||
<Button
|
||||
href={`/stories/${story.id}/edit`}
|
||||
variant="ghost"
|
||||
|
||||
@@ -8,6 +8,7 @@ import Button from '../../../../components/ui/Button';
|
||||
import TagInput from '../../../../components/stories/TagInput';
|
||||
import RichTextEditor from '../../../../components/stories/RichTextEditor';
|
||||
import ImageUpload from '../../../../components/ui/ImageUpload';
|
||||
import AuthorSelector from '../../../../components/stories/AuthorSelector';
|
||||
import LoadingSpinner from '../../../../components/ui/LoadingSpinner';
|
||||
import { storyApi } from '../../../../lib/api';
|
||||
import { Story } from '../../../../types/api';
|
||||
@@ -26,6 +27,7 @@ export default function EditStoryPage() {
|
||||
title: '',
|
||||
summary: '',
|
||||
authorName: '',
|
||||
authorId: undefined as string | undefined,
|
||||
contentHtml: '',
|
||||
sourceUrl: '',
|
||||
tags: [] as string[],
|
||||
@@ -47,6 +49,7 @@ export default function EditStoryPage() {
|
||||
title: storyData.title,
|
||||
summary: storyData.summary || '',
|
||||
authorName: storyData.authorName,
|
||||
authorId: storyData.authorId,
|
||||
contentHtml: storyData.contentHtml,
|
||||
sourceUrl: storyData.sourceUrl || '',
|
||||
tags: storyData.tags?.map(tag => tag.name) || [],
|
||||
@@ -91,6 +94,19 @@ export default function EditStoryPage() {
|
||||
setFormData(prev => ({ ...prev, tags }));
|
||||
};
|
||||
|
||||
const handleAuthorChange = (authorName: string, authorId?: string) => {
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
authorName,
|
||||
authorId: authorId // This will be undefined if creating new author, which clears the existing ID
|
||||
}));
|
||||
|
||||
// Clear error when user changes author
|
||||
if (errors.authorName) {
|
||||
setErrors(prev => ({ ...prev, authorName: '' }));
|
||||
}
|
||||
};
|
||||
|
||||
const validateForm = () => {
|
||||
const newErrors: Record<string, string> = {};
|
||||
|
||||
@@ -136,7 +152,8 @@ export default function EditStoryPage() {
|
||||
sourceUrl: formData.sourceUrl || undefined,
|
||||
volume: formData.seriesName ? parseInt(formData.volume) : undefined,
|
||||
seriesName: formData.seriesName || undefined,
|
||||
authorId: story.authorId, // Keep existing author ID
|
||||
// Send authorId if we have it (existing author), otherwise send authorName (new/changed author)
|
||||
...(formData.authorId ? { authorId: formData.authorId } : { authorName: formData.authorName }),
|
||||
tagNames: formData.tags,
|
||||
};
|
||||
|
||||
@@ -216,18 +233,15 @@ export default function EditStoryPage() {
|
||||
required
|
||||
/>
|
||||
|
||||
{/* Author - Display only, not editable in edit mode for simplicity */}
|
||||
<Input
|
||||
{/* Author Selector */}
|
||||
<AuthorSelector
|
||||
label="Author *"
|
||||
value={formData.authorName}
|
||||
onChange={handleInputChange('authorName')}
|
||||
placeholder="Enter the author's name"
|
||||
onChange={handleAuthorChange}
|
||||
placeholder="Select or enter author name"
|
||||
error={errors.authorName}
|
||||
disabled
|
||||
required
|
||||
/>
|
||||
<p className="text-sm theme-text mt-1">
|
||||
Author changes should be done through Author management
|
||||
</p>
|
||||
|
||||
{/* Summary */}
|
||||
<div>
|
||||
@@ -252,7 +266,7 @@ export default function EditStoryPage() {
|
||||
</label>
|
||||
<ImageUpload
|
||||
onImageSelect={setCoverImage}
|
||||
accept="image/jpeg,image/png,image/webp"
|
||||
accept="image/jpeg,image/png"
|
||||
maxSizeMB={5}
|
||||
aspectRatio="3:4"
|
||||
placeholder="Drop a new cover image here or click to select"
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useEffect } from 'react';
|
||||
import { useState, useEffect, useRef, useCallback } from 'react';
|
||||
import { useParams, useRouter } from 'next/navigation';
|
||||
import Link from 'next/link';
|
||||
import { storyApi, seriesApi } from '../../../lib/api';
|
||||
@@ -19,9 +19,85 @@ export default function StoryReadingPage() {
|
||||
const [error, setError] = useState<string | null>(null);
|
||||
const [readingProgress, setReadingProgress] = useState(0);
|
||||
const [sanitizedContent, setSanitizedContent] = useState<string>('');
|
||||
const [hasScrolledToPosition, setHasScrolledToPosition] = useState(false);
|
||||
const contentRef = useRef<HTMLDivElement>(null);
|
||||
const saveTimeoutRef = useRef<NodeJS.Timeout | null>(null);
|
||||
|
||||
const storyId = params.id as string;
|
||||
|
||||
// Convert scroll position to approximate character position in the content
|
||||
const getCharacterPositionFromScroll = useCallback((): number => {
|
||||
if (!contentRef.current || !story) return 0;
|
||||
|
||||
const content = contentRef.current;
|
||||
const scrolled = window.scrollY;
|
||||
const contentTop = content.offsetTop;
|
||||
const contentHeight = content.scrollHeight;
|
||||
const windowHeight = window.innerHeight;
|
||||
|
||||
// Calculate how far through the content we are (0-1)
|
||||
const scrollRatio = Math.min(1, Math.max(0,
|
||||
(scrolled - contentTop + windowHeight * 0.3) / contentHeight
|
||||
));
|
||||
|
||||
// Convert to character position in the plain text content
|
||||
const textLength = story.contentPlain?.length || story.contentHtml.length;
|
||||
return Math.floor(scrollRatio * textLength);
|
||||
}, [story]);
|
||||
|
||||
// Convert character position back to scroll position for auto-scroll
|
||||
const scrollToCharacterPosition = useCallback((position: number) => {
|
||||
if (!contentRef.current || !story || hasScrolledToPosition) return;
|
||||
|
||||
const textLength = story.contentPlain?.length || story.contentHtml.length;
|
||||
if (textLength === 0 || position === 0) return;
|
||||
|
||||
const ratio = position / textLength;
|
||||
const content = contentRef.current;
|
||||
const contentTop = content.offsetTop;
|
||||
const contentHeight = content.scrollHeight;
|
||||
const windowHeight = window.innerHeight;
|
||||
|
||||
// Calculate target scroll position
|
||||
const targetScroll = contentTop + (ratio * contentHeight) - (windowHeight * 0.3);
|
||||
|
||||
// Smooth scroll to position
|
||||
window.scrollTo({
|
||||
top: Math.max(0, targetScroll),
|
||||
behavior: 'smooth'
|
||||
});
|
||||
|
||||
setHasScrolledToPosition(true);
|
||||
}, [story, hasScrolledToPosition]);
|
||||
|
||||
// Debounced function to save reading position
|
||||
const saveReadingPosition = useCallback(async (position: number) => {
|
||||
if (!story || position === story.readingPosition) {
|
||||
console.log('Skipping save - no story or position unchanged:', { story: !!story, position, current: story?.readingPosition });
|
||||
return;
|
||||
}
|
||||
|
||||
console.log('Saving reading position:', position, 'for story:', story.id);
|
||||
try {
|
||||
const updatedStory = await storyApi.updateReadingProgress(story.id, position);
|
||||
console.log('Reading position saved successfully, updated story:', updatedStory.readingPosition);
|
||||
setStory(prev => prev ? { ...prev, readingPosition: position, lastReadAt: updatedStory.lastReadAt } : null);
|
||||
} catch (error) {
|
||||
console.error('Failed to save reading position:', error);
|
||||
}
|
||||
}, [story]);
|
||||
|
||||
// Debounced version of saveReadingPosition
|
||||
const debouncedSavePosition = useCallback((position: number) => {
|
||||
if (saveTimeoutRef.current) {
|
||||
clearTimeout(saveTimeoutRef.current);
|
||||
}
|
||||
|
||||
saveTimeoutRef.current = setTimeout(() => {
|
||||
saveReadingPosition(position);
|
||||
}, 2000); // Save after 2 seconds of no scrolling
|
||||
}, [saveReadingPosition]);
|
||||
|
||||
useEffect(() => {
|
||||
const loadStory = async () => {
|
||||
try {
|
||||
@@ -57,7 +133,27 @@ export default function StoryReadingPage() {
|
||||
}
|
||||
}, [storyId]);
|
||||
|
||||
// Track reading progress
|
||||
// Auto-scroll to saved reading position when story content is loaded
|
||||
useEffect(() => {
|
||||
if (story && sanitizedContent && !hasScrolledToPosition) {
|
||||
// Use a small delay to ensure content is rendered
|
||||
const timeout = setTimeout(() => {
|
||||
console.log('Initializing reading position tracking, saved position:', story.readingPosition);
|
||||
if (story.readingPosition && story.readingPosition > 0) {
|
||||
console.log('Auto-scrolling to saved position:', story.readingPosition);
|
||||
scrollToCharacterPosition(story.readingPosition);
|
||||
} else {
|
||||
// Even if there's no saved position, mark as ready for tracking
|
||||
console.log('No saved position, starting fresh tracking');
|
||||
setHasScrolledToPosition(true);
|
||||
}
|
||||
}, 500);
|
||||
|
||||
return () => clearTimeout(timeout);
|
||||
}
|
||||
}, [story, sanitizedContent, scrollToCharacterPosition, hasScrolledToPosition]);
|
||||
|
||||
// Track reading progress and save position
|
||||
useEffect(() => {
|
||||
const handleScroll = () => {
|
||||
const article = document.querySelector('[data-reading-content]') as HTMLElement;
|
||||
@@ -72,12 +168,27 @@ export default function StoryReadingPage() {
|
||||
));
|
||||
|
||||
setReadingProgress(progress);
|
||||
|
||||
// Save reading position (debounced)
|
||||
if (hasScrolledToPosition) { // Only save after initial auto-scroll
|
||||
const characterPosition = getCharacterPositionFromScroll();
|
||||
console.log('Scroll detected, character position:', characterPosition);
|
||||
debouncedSavePosition(characterPosition);
|
||||
} else {
|
||||
console.log('Scroll detected but not ready for tracking yet');
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
window.addEventListener('scroll', handleScroll);
|
||||
return () => window.removeEventListener('scroll', handleScroll);
|
||||
}, [story]);
|
||||
return () => {
|
||||
window.removeEventListener('scroll', handleScroll);
|
||||
// Clean up timeout on unmount
|
||||
if (saveTimeoutRef.current) {
|
||||
clearTimeout(saveTimeoutRef.current);
|
||||
}
|
||||
};
|
||||
}, [story, hasScrolledToPosition, getCharacterPositionFromScroll, debouncedSavePosition]);
|
||||
|
||||
const handleRatingUpdate = async (newRating: number) => {
|
||||
if (!story) return;
|
||||
@@ -90,6 +201,7 @@ export default function StoryReadingPage() {
|
||||
}
|
||||
};
|
||||
|
||||
|
||||
const findNextStory = (): Story | null => {
|
||||
if (!story?.seriesId || seriesStories.length <= 1) return null;
|
||||
|
||||
@@ -229,6 +341,7 @@ export default function StoryReadingPage() {
|
||||
|
||||
{/* Story Content */}
|
||||
<div
|
||||
ref={contentRef}
|
||||
className="reading-content"
|
||||
dangerouslySetInnerHTML={{ __html: sanitizedContent }}
|
||||
/>
|
||||
|
||||
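The scroll-to-character mapping above is duplicated almost verbatim in CollectionReadingView further down in this change set. A small extracted helper would make the shared math testable on its own; a sketch, assuming the same 30%-of-viewport reading anchor used by both components (the helper itself is hypothetical and not part of these commits):

```typescript
// Hypothetical shared helper mirroring the mapping used by both reading views.
// "content" is the element holding the rendered story HTML; "textLength" is the
// length of the plain-text position space (contentPlain or contentHtml length).
const READING_ANCHOR = 0.3; // same 30%-of-viewport offset as the components above

export function charPositionFromScroll(content: HTMLElement, textLength: number): number {
  const ratio = Math.min(1, Math.max(0,
    (window.scrollY - content.offsetTop + window.innerHeight * READING_ANCHOR) / content.scrollHeight
  ));
  return Math.floor(ratio * textLength);
}

export function scrollTopForCharPosition(content: HTMLElement, textLength: number, position: number): number {
  if (textLength === 0) return 0;
  const ratio = position / textLength;
  const target = content.offsetTop + ratio * content.scrollHeight - window.innerHeight * READING_ANCHOR;
  return Math.max(0, target);
}
```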
21  frontend/src/app/stories/import/bulk/page.tsx  Normal file
@@ -0,0 +1,21 @@
'use client';

import { useEffect } from 'react';
import { useRouter } from 'next/navigation';

export default function BulkImportRedirectPage() {
  const router = useRouter();

  useEffect(() => {
    router.replace('/import/bulk');
  }, [router]);

  return (
    <div className="min-h-screen flex items-center justify-center">
      <div className="text-center">
        <div className="animate-spin rounded-full h-8 w-8 border-b-2 border-blue-600 mx-auto mb-4"></div>
        <p className="text-gray-600">Redirecting...</p>
      </div>
    </div>
  );
}
21  frontend/src/app/stories/import/epub/page.tsx  Normal file
@@ -0,0 +1,21 @@
'use client';

import { useEffect } from 'react';
import { useRouter } from 'next/navigation';

export default function EpubImportRedirectPage() {
  const router = useRouter();

  useEffect(() => {
    router.replace('/import/epub');
  }, [router]);

  return (
    <div className="min-h-screen flex items-center justify-center">
      <div className="text-center">
        <div className="animate-spin rounded-full h-8 w-8 border-b-2 border-blue-600 mx-auto mb-4"></div>
        <p className="text-gray-600">Redirecting...</p>
      </div>
    </div>
  );
}
21  frontend/src/app/stories/import/page.tsx  Normal file
@@ -0,0 +1,21 @@
'use client';

import { useEffect } from 'react';
import { useRouter } from 'next/navigation';

export default function ImportRedirectPage() {
  const router = useRouter();

  useEffect(() => {
    router.replace('/import');
  }, [router]);

  return (
    <div className="min-h-screen flex items-center justify-center">
      <div className="text-center">
        <div className="animate-spin rounded-full h-8 w-8 border-b-2 border-blue-600 mx-auto mb-4"></div>
        <p className="text-gray-600">Redirecting...</p>
      </div>
    </div>
  );
}
23  frontend/src/app/stories/page.tsx  Normal file
@@ -0,0 +1,23 @@
'use client';

import { useEffect } from 'react';
import { useRouter } from 'next/navigation';
import LoadingSpinner from '../../components/ui/LoadingSpinner';

export default function StoriesRedirectPage() {
  const router = useRouter();

  useEffect(() => {
    // Redirect to library page
    router.replace('/library');
  }, [router]);

  return (
    <div className="min-h-screen theme-bg flex items-center justify-center">
      <div className="text-center">
        <LoadingSpinner size="lg" />
        <p className="theme-text mt-4">Redirecting to Library...</p>
      </div>
    </div>
  );
}
207  frontend/src/components/BulkImportProgress.tsx  Normal file
@@ -0,0 +1,207 @@
|
||||
'use client';
|
||||
|
||||
import { useEffect, useState } from 'react';
|
||||
|
||||
interface ProgressUpdate {
|
||||
type: 'progress' | 'completed' | 'error' | 'connected';
|
||||
current: number;
|
||||
total: number;
|
||||
message: string;
|
||||
url?: string;
|
||||
title?: string;
|
||||
author?: string;
|
||||
wordCount?: number;
|
||||
totalWordCount?: number;
|
||||
error?: string;
|
||||
sessionId?: string;
|
||||
}
|
||||
|
||||
interface BulkImportProgressProps {
|
||||
sessionId: string;
|
||||
onComplete?: (data?: any) => void;
|
||||
onError?: (error: string) => void;
|
||||
combineMode?: boolean;
|
||||
}
|
||||
|
||||
export default function BulkImportProgress({
|
||||
sessionId,
|
||||
onComplete,
|
||||
onError,
|
||||
combineMode = false
|
||||
}: BulkImportProgressProps) {
|
||||
const [progress, setProgress] = useState<ProgressUpdate>({
|
||||
type: 'progress',
|
||||
current: 0,
|
||||
total: 1,
|
||||
message: 'Connecting...'
|
||||
});
|
||||
const [isConnected, setIsConnected] = useState(false);
|
||||
const [recentActivities, setRecentActivities] = useState<string[]>([]);
|
||||
|
||||
useEffect(() => {
|
||||
const eventSource = new EventSource(`/scrape/bulk/progress?sessionId=${sessionId}`);
|
||||
|
||||
eventSource.onmessage = (event) => {
|
||||
try {
|
||||
const data: ProgressUpdate = JSON.parse(event.data);
|
||||
|
||||
if (data.type === 'connected') {
|
||||
setIsConnected(true);
|
||||
return;
|
||||
}
|
||||
|
||||
setProgress(data);
|
||||
|
||||
// Add to recent activities (keep last 5)
|
||||
if (data.message) {
|
||||
setRecentActivities(prev => [
|
||||
data.message,
|
||||
...prev.slice(0, 4)
|
||||
]);
|
||||
}
|
||||
|
||||
if (data.type === 'completed') {
|
||||
setTimeout(() => {
|
||||
onComplete?.(data);
|
||||
eventSource.close();
|
||||
}, 2000); // Show completion message for 2 seconds
|
||||
} else if (data.type === 'error') {
|
||||
onError?.(data.error || 'Unknown error occurred');
|
||||
eventSource.close();
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Failed to parse progress update:', error);
|
||||
}
|
||||
};
|
||||
|
||||
eventSource.onerror = (error) => {
|
||||
console.error('EventSource error:', error);
|
||||
setIsConnected(false);
|
||||
onError?.('Connection to progress stream failed');
|
||||
eventSource.close();
|
||||
};
|
||||
|
||||
return () => {
|
||||
eventSource.close();
|
||||
};
|
||||
}, [sessionId, onComplete, onError]);
|
||||
|
||||
const progressPercentage = progress.total > 0
|
||||
? Math.round((progress.current / progress.total) * 100)
|
||||
: 0;
|
||||
|
||||
const getStatusColor = () => {
|
||||
switch (progress.type) {
|
||||
case 'completed': return 'bg-green-600';
|
||||
case 'error': return 'bg-red-600';
|
||||
default: return 'bg-blue-600';
|
||||
}
|
||||
};
|
||||
|
||||
const getStatusIcon = () => {
|
||||
switch (progress.type) {
|
||||
case 'completed': return '✓';
|
||||
case 'error': return '✗';
|
||||
default: return null;
|
||||
}
|
||||
};
|
||||
|
||||
return (
|
||||
<div className="bg-white border border-gray-200 rounded-lg p-6">
|
||||
<div className="mb-4">
|
||||
<div className="flex items-center justify-between mb-2">
|
||||
<h3 className="text-lg font-medium text-gray-900">
|
||||
{combineMode ? 'Combining Stories' : 'Bulk Import Progress'}
|
||||
</h3>
|
||||
<div className="flex items-center gap-2">
|
||||
{!isConnected && (
|
||||
<div className="h-2 w-2 bg-yellow-400 rounded-full animate-pulse"></div>
|
||||
)}
|
||||
<span className="text-sm text-gray-600">
|
||||
{progress.current} of {progress.total}
|
||||
</span>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Progress Bar */}
|
||||
<div className="w-full bg-gray-200 rounded-full h-3 mb-3">
|
||||
<div
|
||||
className={`h-3 rounded-full transition-all duration-500 ${getStatusColor()}`}
|
||||
style={{ width: `${progressPercentage}%` }}
|
||||
></div>
|
||||
</div>
|
||||
|
||||
{/* Progress Percentage */}
|
||||
<div className="flex items-center justify-between">
|
||||
<span className="text-sm font-medium text-gray-900">
|
||||
{progressPercentage}%
|
||||
</span>
|
||||
{progress.type === 'completed' && (
|
||||
<span className="text-green-600 font-medium">
|
||||
{getStatusIcon()} Complete
|
||||
</span>
|
||||
)}
|
||||
{progress.type === 'error' && (
|
||||
<span className="text-red-600 font-medium">
|
||||
{getStatusIcon()} Error
|
||||
</span>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Current Status Message */}
|
||||
<div className="mb-4">
|
||||
<div className="flex items-center gap-2">
|
||||
{progress.type === 'progress' && (
|
||||
<div className="animate-spin rounded-full h-4 w-4 border-b-2 border-blue-600"></div>
|
||||
)}
|
||||
<p className="text-sm text-gray-700">{progress.message}</p>
|
||||
</div>
|
||||
|
||||
{/* Word Count for Combine Mode */}
|
||||
{combineMode && progress.totalWordCount !== undefined && (
|
||||
<p className="text-sm text-gray-500 mt-1">
|
||||
Total words collected: {progress.totalWordCount.toLocaleString()}
|
||||
</p>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Current URL being processed */}
|
||||
{progress.url && (
|
||||
<div className="mb-4 p-3 bg-gray-50 rounded-md">
|
||||
<p className="text-sm text-gray-600 mb-1">Currently processing:</p>
|
||||
<p className="text-sm font-mono text-gray-800 truncate">{progress.url}</p>
|
||||
{progress.title && progress.author && (
|
||||
<p className="text-sm text-gray-600 mt-1">
|
||||
"{progress.title}" by {progress.author}
|
||||
{progress.wordCount && (
|
||||
<span className="ml-2 text-gray-500">
|
||||
({progress.wordCount.toLocaleString()} words)
|
||||
</span>
|
||||
)}
|
||||
</p>
|
||||
)}
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Recent Activities */}
|
||||
{recentActivities.length > 0 && (
|
||||
<div>
|
||||
<h4 className="text-sm font-medium text-gray-900 mb-2">Recent Activity</h4>
|
||||
<div className="space-y-1 max-h-32 overflow-y-auto">
|
||||
{recentActivities.map((activity, index) => (
|
||||
<p
|
||||
key={index}
|
||||
className={`text-xs text-gray-600 ${
|
||||
index === 0 ? 'font-medium text-gray-800' : ''
|
||||
}`}
|
||||
>
|
||||
{activity}
|
||||
</p>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
);
|
||||
}
|
||||
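For context, a minimal sketch of how a page could mount BulkImportProgress once a bulk import has been started. The sessionId source, import path, and post-completion destination are assumptions, not taken from these commits:

```tsx
'use client';

import { useState } from 'react';
import { useRouter } from 'next/navigation';
import BulkImportProgress from '../components/BulkImportProgress'; // path assumed

// Hypothetical host component: sessionId is assumed to come from the call that starts the bulk import.
export default function BulkImportStatus({ sessionId }: { sessionId: string }) {
  const router = useRouter();
  const [error, setError] = useState<string | null>(null);

  if (error) {
    return <p className="text-red-600">{error}</p>;
  }

  return (
    <BulkImportProgress
      sessionId={sessionId}
      combineMode={false}
      onComplete={() => router.push('/library')} // assumed destination after a successful run
      onError={(message) => setError(message)}
    />
  );
}
```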
@@ -227,7 +227,7 @@ export default function CollectionForm({
|
||||
<input
|
||||
id="coverImage"
|
||||
type="file"
|
||||
accept="image/jpeg,image/png,image/webp"
|
||||
accept="image/jpeg,image/png"
|
||||
onChange={handleCoverImageChange}
|
||||
className="w-full px-3 py-2 border theme-border rounded-lg theme-card theme-text focus:outline-none focus:ring-2 focus:ring-theme-accent"
|
||||
/>
|
||||
|
||||
@@ -1,6 +1,8 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useEffect, useRef, useCallback } from 'react';
|
||||
import { StoryWithCollectionContext } from '../../types/api';
|
||||
import { storyApi } from '../../lib/api';
|
||||
import Button from '../ui/Button';
|
||||
import Link from 'next/link';
|
||||
|
||||
@@ -16,6 +18,120 @@ export default function CollectionReadingView({
|
||||
onBackToCollection
|
||||
}: CollectionReadingViewProps) {
|
||||
const { story, collection } = data;
|
||||
const [hasScrolledToPosition, setHasScrolledToPosition] = useState(false);
|
||||
const contentRef = useRef<HTMLDivElement>(null);
|
||||
const saveTimeoutRef = useRef<NodeJS.Timeout | null>(null);
|
||||
|
||||
// Convert scroll position to approximate character position in the content
|
||||
const getCharacterPositionFromScroll = useCallback((): number => {
|
||||
if (!contentRef.current || !story) return 0;
|
||||
|
||||
const content = contentRef.current;
|
||||
const scrolled = window.scrollY;
|
||||
const contentTop = content.offsetTop;
|
||||
const contentHeight = content.scrollHeight;
|
||||
const windowHeight = window.innerHeight;
|
||||
|
||||
// Calculate how far through the content we are (0-1)
|
||||
const scrollRatio = Math.min(1, Math.max(0,
|
||||
(scrolled - contentTop + windowHeight * 0.3) / contentHeight
|
||||
));
|
||||
|
||||
// Convert to character position in the plain text content
|
||||
const textLength = story.contentPlain?.length || story.contentHtml.length;
|
||||
return Math.floor(scrollRatio * textLength);
|
||||
}, [story]);
|
||||
|
||||
// Convert character position back to scroll position for auto-scroll
|
||||
const scrollToCharacterPosition = useCallback((position: number) => {
|
||||
if (!contentRef.current || !story || hasScrolledToPosition) return;
|
||||
|
||||
const textLength = story.contentPlain?.length || story.contentHtml.length;
|
||||
if (textLength === 0 || position === 0) return;
|
||||
|
||||
const ratio = position / textLength;
|
||||
const content = contentRef.current;
|
||||
const contentTop = content.offsetTop;
|
||||
const contentHeight = content.scrollHeight;
|
||||
const windowHeight = window.innerHeight;
|
||||
|
||||
// Calculate target scroll position
|
||||
const targetScroll = contentTop + (ratio * contentHeight) - (windowHeight * 0.3);
|
||||
|
||||
// Smooth scroll to position
|
||||
window.scrollTo({
|
||||
top: Math.max(0, targetScroll),
|
||||
behavior: 'smooth'
|
||||
});
|
||||
|
||||
setHasScrolledToPosition(true);
|
||||
}, [story, hasScrolledToPosition]);
|
||||
|
||||
// Debounced function to save reading position
|
||||
const saveReadingPosition = useCallback(async (position: number) => {
|
||||
if (!story || position === story.readingPosition) {
|
||||
console.log('Collection view - skipping save - no story or position unchanged:', { story: !!story, position, current: story?.readingPosition });
|
||||
return;
|
||||
}
|
||||
|
||||
console.log('Collection view - saving reading position:', position, 'for story:', story.id);
|
||||
try {
|
||||
await storyApi.updateReadingProgress(story.id, position);
|
||||
console.log('Collection view - reading position saved successfully');
|
||||
} catch (error) {
|
||||
console.error('Collection view - failed to save reading position:', error);
|
||||
}
|
||||
}, [story]);
|
||||
|
||||
// Debounced version of saveReadingPosition
|
||||
const debouncedSavePosition = useCallback((position: number) => {
|
||||
if (saveTimeoutRef.current) {
|
||||
clearTimeout(saveTimeoutRef.current);
|
||||
}
|
||||
|
||||
saveTimeoutRef.current = setTimeout(() => {
|
||||
saveReadingPosition(position);
|
||||
}, 2000);
|
||||
}, [saveReadingPosition]);
|
||||
|
||||
// Auto-scroll to saved reading position when story content is loaded
|
||||
useEffect(() => {
|
||||
if (story && !hasScrolledToPosition) {
|
||||
const timeout = setTimeout(() => {
|
||||
console.log('Collection view - initializing reading position tracking, saved position:', story.readingPosition);
|
||||
if (story.readingPosition && story.readingPosition > 0) {
|
||||
console.log('Collection view - auto-scrolling to saved position:', story.readingPosition);
|
||||
scrollToCharacterPosition(story.readingPosition);
|
||||
} else {
|
||||
console.log('Collection view - no saved position, starting fresh tracking');
|
||||
setHasScrolledToPosition(true);
|
||||
}
|
||||
}, 500);
|
||||
|
||||
return () => clearTimeout(timeout);
|
||||
}
|
||||
}, [story, scrollToCharacterPosition, hasScrolledToPosition]);
|
||||
|
||||
// Track reading progress and save position
|
||||
useEffect(() => {
|
||||
const handleScroll = () => {
|
||||
if (hasScrolledToPosition) {
|
||||
const characterPosition = getCharacterPositionFromScroll();
|
||||
console.log('Collection view - scroll detected, character position:', characterPosition);
|
||||
debouncedSavePosition(characterPosition);
|
||||
} else {
|
||||
console.log('Collection view - scroll detected but not ready for tracking yet');
|
||||
}
|
||||
};
|
||||
|
||||
window.addEventListener('scroll', handleScroll);
|
||||
return () => {
|
||||
window.removeEventListener('scroll', handleScroll);
|
||||
if (saveTimeoutRef.current) {
|
||||
clearTimeout(saveTimeoutRef.current);
|
||||
}
|
||||
};
|
||||
}, [hasScrolledToPosition, getCharacterPositionFromScroll, debouncedSavePosition]);
|
||||
|
||||
const handlePrevious = () => {
|
||||
if (collection.previousStoryId) {
|
||||
@@ -180,6 +296,7 @@ export default function CollectionReadingView({
|
||||
{/* Story Content */}
|
||||
<div className="theme-card p-8">
|
||||
<div
|
||||
ref={contentRef}
|
||||
className="prose prose-lg max-w-none theme-text"
|
||||
dangerouslySetInnerHTML={{ __html: story.contentHtml }}
|
||||
/>
|
||||
|
||||
@@ -7,6 +7,7 @@ import { useRouter } from 'next/navigation';
|
||||
import { useAuth } from '../../contexts/AuthContext';
|
||||
import { useTheme } from '../../lib/theme';
|
||||
import Button from '../ui/Button';
|
||||
import Dropdown from '../ui/Dropdown';
|
||||
|
||||
export default function Header() {
|
||||
const [isMenuOpen, setIsMenuOpen] = useState(false);
|
||||
@@ -14,6 +15,29 @@ export default function Header() {
|
||||
const { theme, toggleTheme } = useTheme();
|
||||
const router = useRouter();
|
||||
|
||||
const addStoryItems = [
|
||||
{
|
||||
href: '/import',
|
||||
label: 'Manual Entry',
|
||||
description: 'Add a story by manually entering details'
|
||||
},
|
||||
{
|
||||
href: '/import?mode=url',
|
||||
label: 'Import from URL',
|
||||
description: 'Import a single story from a website'
|
||||
},
|
||||
{
|
||||
href: '/import/epub',
|
||||
label: 'Import EPUB',
|
||||
description: 'Import a story from an EPUB file'
|
||||
},
|
||||
{
|
||||
href: '/import/bulk',
|
||||
label: 'Bulk Import',
|
||||
description: 'Import multiple stories from a list of URLs'
|
||||
}
|
||||
];
|
||||
|
||||
const handleLogout = () => {
|
||||
logout();
|
||||
router.push('/login');
|
||||
@@ -57,12 +81,10 @@ export default function Header() {
|
||||
>
|
||||
Authors
|
||||
</Link>
|
||||
<Link
|
||||
href="/add-story"
|
||||
className="theme-text hover:theme-accent transition-colors font-medium"
|
||||
>
|
||||
Add Story
|
||||
</Link>
|
||||
<Dropdown
|
||||
trigger="Add Story"
|
||||
items={addStoryItems}
|
||||
/>
|
||||
</nav>
|
||||
|
||||
{/* Right side actions */}
|
||||
@@ -131,13 +153,39 @@ export default function Header() {
|
||||
>
|
||||
Authors
|
||||
</Link>
|
||||
<Link
|
||||
href="/add-story"
|
||||
className="theme-text hover:theme-accent transition-colors font-medium px-2 py-1"
|
||||
onClick={() => setIsMenuOpen(false)}
|
||||
>
|
||||
Add Story
|
||||
</Link>
|
||||
<div className="px-2 py-1">
|
||||
<div className="font-medium theme-text mb-1">Add Story</div>
|
||||
<div className="pl-4 space-y-1">
|
||||
<Link
|
||||
href="/import"
|
||||
className="block theme-text hover:theme-accent transition-colors text-sm py-1"
|
||||
onClick={() => setIsMenuOpen(false)}
|
||||
>
|
||||
Manual Entry
|
||||
</Link>
|
||||
<Link
|
||||
href="/import?mode=url"
|
||||
className="block theme-text hover:theme-accent transition-colors text-sm py-1"
|
||||
onClick={() => setIsMenuOpen(false)}
|
||||
>
|
||||
Import from URL
|
||||
</Link>
|
||||
<Link
|
||||
href="/import/epub"
|
||||
className="block theme-text hover:theme-accent transition-colors text-sm py-1"
|
||||
onClick={() => setIsMenuOpen(false)}
|
||||
>
|
||||
Import EPUB
|
||||
</Link>
|
||||
<Link
|
||||
href="/import/bulk"
|
||||
className="block theme-text hover:theme-accent transition-colors text-sm py-1"
|
||||
onClick={() => setIsMenuOpen(false)}
|
||||
>
|
||||
Bulk Import
|
||||
</Link>
|
||||
</div>
|
||||
</div>
|
||||
<Link
|
||||
href="/settings"
|
||||
className="theme-text hover:theme-accent transition-colors font-medium px-2 py-1"
|
||||
|
||||
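The addStoryItems array above implies the shape the Dropdown component consumes. The Dropdown itself is not part of this diff, so the following TypeScript sketch of its props is inferred from the usage, not authoritative:

```typescript
// Inferred from <Dropdown trigger="Add Story" items={addStoryItems} /> and the
// { href, label, description } objects above; the real component may differ.
export interface DropdownItem {
  href: string;
  label: string;
  description?: string;
}

export interface DropdownProps {
  trigger: string;
  items: DropdownItem[];
}
```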
128  frontend/src/components/layout/ImportLayout.tsx  Normal file
@@ -0,0 +1,128 @@
|
||||
'use client';
|
||||
|
||||
import { ReactNode } from 'react';
|
||||
import Link from 'next/link';
|
||||
import { usePathname, useSearchParams } from 'next/navigation';
|
||||
import AppLayout from './AppLayout';
|
||||
|
||||
interface ImportTab {
|
||||
id: string;
|
||||
label: string;
|
||||
href: string;
|
||||
description: string;
|
||||
}
|
||||
|
||||
interface ImportLayoutProps {
|
||||
children: ReactNode;
|
||||
title: string;
|
||||
description?: string;
|
||||
}
|
||||
|
||||
const importTabs: ImportTab[] = [
|
||||
{
|
||||
id: 'manual',
|
||||
label: 'Manual Entry',
|
||||
href: '/import',
|
||||
description: 'Add a story by manually entering details'
|
||||
},
|
||||
{
|
||||
id: 'url',
|
||||
label: 'Import from URL',
|
||||
href: '/import?mode=url',
|
||||
description: 'Import a single story from a website'
|
||||
},
|
||||
{
|
||||
id: 'epub',
|
||||
label: 'Import EPUB',
|
||||
href: '/import/epub',
|
||||
description: 'Import a story from an EPUB file'
|
||||
},
|
||||
{
|
||||
id: 'bulk',
|
||||
label: 'Bulk Import',
|
||||
href: '/import/bulk',
|
||||
description: 'Import multiple stories from a list of URLs'
|
||||
}
|
||||
];
|
||||
|
||||
export default function ImportLayout({ children, title, description }: ImportLayoutProps) {
|
||||
const pathname = usePathname();
|
||||
const searchParams = useSearchParams();
|
||||
const mode = searchParams.get('mode');
|
||||
|
||||
// Determine which tab is active
|
||||
const getActiveTab = () => {
|
||||
if (pathname === '/import') {
|
||||
return mode === 'url' ? 'url' : 'manual';
|
||||
} else if (pathname === '/import/epub') {
|
||||
return 'epub';
|
||||
} else if (pathname === '/import/bulk') {
|
||||
return 'bulk';
|
||||
}
|
||||
return 'manual';
|
||||
};
|
||||
|
||||
const activeTab = getActiveTab();
|
||||
|
||||
return (
|
||||
<AppLayout>
|
||||
<div className="max-w-4xl mx-auto space-y-6">
|
||||
{/* Header */}
|
||||
<div className="text-center">
|
||||
<h1 className="text-3xl font-bold theme-header">{title}</h1>
|
||||
{description && (
|
||||
<p className="theme-text mt-2 text-lg">
|
||||
{description}
|
||||
</p>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Tab Navigation */}
|
||||
<div className="theme-card theme-shadow rounded-lg overflow-hidden">
|
||||
{/* Tab Headers */}
|
||||
<div className="flex border-b theme-border overflow-x-auto">
|
||||
{importTabs.map((tab) => (
|
||||
<Link
|
||||
key={tab.id}
|
||||
href={tab.href}
|
||||
className={`flex-1 min-w-0 px-4 py-3 text-sm font-medium text-center transition-colors whitespace-nowrap ${
|
||||
activeTab === tab.id
|
||||
? 'theme-accent-bg text-white border-b-2 border-transparent'
|
||||
: 'theme-text hover:theme-accent-light hover:theme-accent-text'
|
||||
}`}
|
||||
>
|
||||
<div className="truncate">
|
||||
{tab.label}
|
||||
</div>
|
||||
</Link>
|
||||
))}
|
||||
</div>
|
||||
|
||||
{/* Tab Descriptions */}
|
||||
<div className="px-6 py-4 bg-gray-50 dark:bg-gray-800/50">
|
||||
<div className="flex items-center justify-center">
|
||||
<p className="text-sm theme-text text-center">
|
||||
{importTabs.find(tab => tab.id === activeTab)?.description}
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Tab Content */}
|
||||
<div className="p-6">
|
||||
{children}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Quick Actions */}
|
||||
<div className="flex justify-center">
|
||||
<Link
|
||||
href="/library"
|
||||
className="theme-text hover:theme-accent transition-colors text-sm"
|
||||
>
|
||||
← Back to Library
|
||||
</Link>
|
||||
</div>
|
||||
</div>
|
||||
</AppLayout>
|
||||
);
|
||||
}
|
||||
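A minimal sketch of how an import page could wrap its content in ImportLayout, using only the props shown above (the page body and import path are placeholders, not code from these commits):

```tsx
'use client';

import ImportLayout from '../../components/layout/ImportLayout'; // path assumed

// Hypothetical page body; the real EPUB import form is not shown here.
export default function EpubImportPage() {
  return (
    <ImportLayout
      title="Import EPUB"
      description="Import a story from an EPUB file"
    >
      <p className="theme-text">Select an .epub file to import.</p>
    </ImportLayout>
  );
}
```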
231  frontend/src/components/stories/AuthorSelector.tsx  Normal file
@@ -0,0 +1,231 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useEffect, useRef } from 'react';
|
||||
import { authorApi } from '../../lib/api';
|
||||
import { Author } from '../../types/api';
|
||||
|
||||
interface AuthorSelectorProps {
|
||||
value: string;
|
||||
onChange: (authorName: string, authorId?: string) => void;
|
||||
placeholder?: string;
|
||||
error?: string;
|
||||
disabled?: boolean;
|
||||
required?: boolean;
|
||||
label?: string;
|
||||
}
|
||||
|
||||
export default function AuthorSelector({
|
||||
value,
|
||||
onChange,
|
||||
placeholder = 'Enter or select an author',
|
||||
error,
|
||||
disabled = false,
|
||||
required = false,
|
||||
label = 'Author'
|
||||
}: AuthorSelectorProps) {
|
||||
const [isOpen, setIsOpen] = useState(false);
|
||||
const [authors, setAuthors] = useState<Author[]>([]);
|
||||
const [filteredAuthors, setFilteredAuthors] = useState<Author[]>([]);
|
||||
const [loading, setLoading] = useState(false);
|
||||
const [inputValue, setInputValue] = useState(value || '');
|
||||
|
||||
const inputRef = useRef<HTMLInputElement>(null);
|
||||
const dropdownRef = useRef<HTMLDivElement>(null);
|
||||
|
||||
// Load authors when component mounts or when dropdown opens
|
||||
useEffect(() => {
|
||||
const loadAuthors = async () => {
|
||||
try {
|
||||
setLoading(true);
|
||||
const result = await authorApi.getAuthors({ page: 0, size: 100 }); // Get first 100 authors
|
||||
setAuthors(result.content);
|
||||
} catch (error) {
|
||||
console.error('Failed to load authors:', error);
|
||||
} finally {
|
||||
setLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
if (isOpen && authors.length === 0) {
|
||||
loadAuthors();
|
||||
}
|
||||
}, [isOpen, authors.length]);
|
||||
|
||||
// Filter authors based on input value
|
||||
useEffect(() => {
|
||||
if (!inputValue.trim()) {
|
||||
setFilteredAuthors(authors);
|
||||
} else {
|
||||
const filtered = authors.filter(author =>
|
||||
author.name.toLowerCase().includes(inputValue.toLowerCase())
|
||||
);
|
||||
setFilteredAuthors(filtered);
|
||||
}
|
||||
}, [inputValue, authors]);
|
||||
|
||||
// Update input value when prop value changes
|
||||
useEffect(() => {
|
||||
if (value !== inputValue) {
|
||||
setInputValue(value || '');
|
||||
}
|
||||
}, [value]);
|
||||
|
||||
// Handle clicking outside to close dropdown
|
||||
useEffect(() => {
|
||||
const handleClickOutside = (event: MouseEvent) => {
|
||||
if (dropdownRef.current && !dropdownRef.current.contains(event.target as Node)) {
|
||||
setIsOpen(false);
|
||||
}
|
||||
};
|
||||
|
||||
if (isOpen) {
|
||||
document.addEventListener('mousedown', handleClickOutside);
|
||||
return () => document.removeEventListener('mousedown', handleClickOutside);
|
||||
}
|
||||
}, [isOpen]);
|
||||
|
||||
const handleInputChange = (e: React.ChangeEvent<HTMLInputElement>) => {
|
||||
const newValue = e.target.value;
|
||||
setInputValue(newValue);
|
||||
setIsOpen(true);
|
||||
|
||||
// Call onChange for free-form text entry (new author)
|
||||
onChange(newValue);
|
||||
};
|
||||
|
||||
const handleAuthorSelect = (author: Author) => {
|
||||
setInputValue(author.name);
|
||||
setIsOpen(false);
|
||||
onChange(author.name, author.id);
|
||||
};
|
||||
|
||||
const handleInputFocus = () => {
|
||||
setIsOpen(true);
|
||||
};
|
||||
|
||||
const handleKeyDown = (e: React.KeyboardEvent) => {
|
||||
if (e.key === 'Escape') {
|
||||
setIsOpen(false);
|
||||
} else if (e.key === 'ArrowDown' && !isOpen) {
|
||||
setIsOpen(true);
|
||||
}
|
||||
};
|
||||
|
||||
return (
|
||||
<div className="relative" ref={dropdownRef}>
|
||||
{label && (
|
||||
<label className="block text-sm font-medium theme-header mb-2">
|
||||
{label}{required && ' *'}
|
||||
</label>
|
||||
)}
|
||||
|
||||
<div className="relative">
|
||||
<input
|
||||
ref={inputRef}
|
||||
type="text"
|
||||
value={inputValue}
|
||||
onChange={handleInputChange}
|
||||
onFocus={handleInputFocus}
|
||||
onKeyDown={handleKeyDown}
|
||||
placeholder={placeholder}
|
||||
disabled={disabled}
|
||||
required={required}
|
||||
className={`w-full px-3 py-2 border rounded-md shadow-sm focus:outline-none focus:ring-2 focus:border-transparent transition-colors ${
|
||||
error
|
||||
? 'border-red-300 focus:ring-red-500 theme-error'
|
||||
: 'theme-border focus:ring-theme-accent focus:theme-accent-border'
|
||||
} ${disabled ? 'theme-disabled' : 'theme-input'}`}
|
||||
aria-expanded={isOpen}
|
||||
aria-haspopup="listbox"
|
||||
role="combobox"
|
||||
/>
|
||||
|
||||
{/* Dropdown arrow */}
|
||||
<div className="absolute inset-y-0 right-0 flex items-center pr-3 pointer-events-none">
|
||||
<svg
|
||||
className={`w-4 h-4 theme-text transition-transform ${isOpen ? 'rotate-180' : ''}`}
|
||||
fill="none"
|
||||
stroke="currentColor"
|
||||
viewBox="0 0 24 24"
|
||||
>
|
||||
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M19 9l-7 7-7-7" />
|
||||
</svg>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Dropdown */}
|
||||
{isOpen && (
|
||||
<div className="absolute z-50 w-full mt-1 theme-card theme-shadow border theme-border rounded-md max-h-60 overflow-auto">
|
||||
{loading ? (
|
||||
<div className="px-3 py-2 text-sm theme-text text-center">
|
||||
Loading authors...
|
||||
</div>
|
||||
) : filteredAuthors.length > 0 ? (
|
||||
<>
|
||||
{/* Existing authors */}
|
||||
<div className="py-1">
|
||||
{filteredAuthors.map((author) => (
|
||||
<button
|
||||
key={author.id}
|
||||
type="button"
|
||||
className="w-full text-left px-3 py-2 text-sm theme-text hover:theme-accent-light hover:theme-accent-text transition-colors flex items-center justify-between"
|
||||
onClick={() => handleAuthorSelect(author)}
|
||||
>
|
||||
<span>{author.name}</span>
|
||||
<span className="text-xs theme-text-muted">
|
||||
{author.storyCount} {author.storyCount === 1 ? 'story' : 'stories'}
|
||||
</span>
|
||||
</button>
|
||||
))}
|
||||
</div>
|
||||
|
||||
{/* New author option if input doesn't match exactly */}
|
||||
{inputValue.trim() && !filteredAuthors.find(a => a.name.toLowerCase() === inputValue.toLowerCase()) && (
|
||||
<>
|
||||
<div className="border-t theme-border"></div>
|
||||
<div className="py-1">
|
||||
<button
|
||||
type="button"
|
||||
className="w-full text-left px-3 py-2 text-sm theme-text hover:theme-accent-light hover:theme-accent-text transition-colors"
|
||||
onClick={() => {
|
||||
setIsOpen(false);
|
||||
onChange(inputValue.trim());
|
||||
}}
|
||||
>
|
||||
<span className="font-medium">Create new author:</span> "{inputValue.trim()}"
|
||||
</button>
|
||||
</div>
|
||||
</>
|
||||
)}
|
||||
</>
|
||||
) : inputValue.trim() ? (
|
||||
/* No matches, show create new option */
|
||||
<div className="py-1">
|
||||
<button
|
||||
type="button"
|
||||
className="w-full text-left px-3 py-2 text-sm theme-text hover:theme-accent-light hover:theme-accent-text transition-colors"
|
||||
onClick={() => {
|
||||
setIsOpen(false);
|
||||
onChange(inputValue.trim());
|
||||
}}
|
||||
>
|
||||
<span className="font-medium">Create new author:</span> "{inputValue.trim()}"
|
||||
</button>
|
||||
</div>
|
||||
) : (
|
||||
/* No authors loaded or empty input */
|
||||
<div className="px-3 py-2 text-sm theme-text-muted text-center">
|
||||
{authors.length === 0 ? 'No authors yet' : 'Type to search or create new author'}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
)}
|
||||
|
||||
{error && (
|
||||
<p className="mt-1 text-sm text-red-600 dark:text-red-400">
|
||||
{error}
|
||||
</p>
|
||||
)}
|
||||
</div>
|
||||
);
|
||||
}
|
||||
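AuthorSelector reports either an existing author (name plus id) or a free-form new name (id undefined). A sketch of the handler pattern a form would pair with it, mirroring the edit-story wiring earlier in this change set (component name and import path are placeholders):

```tsx
'use client';

import { useState } from 'react';
import AuthorSelector from '../../components/stories/AuthorSelector'; // path assumed

// Hypothetical form fragment; the state shape mirrors the edit-story page above.
export default function AuthorField() {
  const [authorName, setAuthorName] = useState('');
  const [authorId, setAuthorId] = useState<string | undefined>(undefined);

  const handleAuthorChange = (name: string, id?: string) => {
    setAuthorName(name);
    setAuthorId(id); // undefined signals "create a new author" when the form is saved
  };

  return (
    <>
      <AuthorSelector
        label="Author *"
        value={authorName}
        onChange={handleAuthorChange}
        placeholder="Select or enter author name"
        required
      />
      <p className="text-sm theme-text mt-1">
        {authorId ? 'Using existing author' : 'A new author will be created on save'}
      </p>
    </>
  );
}
```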
@@ -20,9 +20,124 @@ export default function RichTextEditor({
|
||||
}: RichTextEditorProps) {
|
||||
const [viewMode, setViewMode] = useState<'visual' | 'html'>('visual');
|
||||
const [htmlValue, setHtmlValue] = useState(value);
|
||||
const [isMaximized, setIsMaximized] = useState(false);
|
||||
const [containerHeight, setContainerHeight] = useState(300); // Default height in pixels
|
||||
const previewRef = useRef<HTMLDivElement>(null);
|
||||
const visualTextareaRef = useRef<HTMLTextAreaElement>(null);
|
||||
const visualDivRef = useRef<HTMLDivElement>(null);
|
||||
const containerRef = useRef<HTMLDivElement>(null);
|
||||
const [isUserTyping, setIsUserTyping] = useState(false);
|
||||
|
||||
// Utility functions for cursor position preservation
|
||||
const saveCursorPosition = () => {
|
||||
const selection = window.getSelection();
|
||||
if (!selection || selection.rangeCount === 0) return null;
|
||||
|
||||
const range = selection.getRangeAt(0);
|
||||
const div = visualDivRef.current;
|
||||
if (!div) return null;
|
||||
|
||||
return {
|
||||
startContainer: range.startContainer,
|
||||
startOffset: range.startOffset,
|
||||
endContainer: range.endContainer,
|
||||
endOffset: range.endOffset
|
||||
};
|
||||
};
|
||||
|
||||
const restoreCursorPosition = (position: any) => {
|
||||
if (!position) return;
|
||||
|
||||
try {
|
||||
const selection = window.getSelection();
|
||||
if (!selection) return;
|
||||
|
||||
const range = document.createRange();
|
||||
range.setStart(position.startContainer, position.startOffset);
|
||||
range.setEnd(position.endContainer, position.endOffset);
|
||||
|
||||
selection.removeAllRanges();
|
||||
selection.addRange(range);
|
||||
} catch (e) {
|
||||
console.warn('Could not restore cursor position:', e);
|
||||
}
|
||||
};
|
||||
|
||||
// Maximize/minimize functionality
|
||||
const toggleMaximize = () => {
|
||||
if (!isMaximized) {
|
||||
// Store current height before maximizing
|
||||
if (containerRef.current) {
|
||||
setContainerHeight(containerRef.current.scrollHeight || containerHeight);
|
||||
}
|
||||
}
|
||||
setIsMaximized(!isMaximized);
|
||||
};
|
||||
|
||||
// Handle manual resize when dragging resize handle
|
||||
const handleMouseDown = (e: React.MouseEvent) => {
|
||||
if (isMaximized) return; // Don't allow resize when maximized
|
||||
|
||||
e.preventDefault();
|
||||
const startY = e.clientY;
|
||||
const startHeight = containerHeight;
|
||||
|
||||
const handleMouseMove = (e: MouseEvent) => {
|
||||
const deltaY = e.clientY - startY;
|
||||
const newHeight = Math.max(200, Math.min(800, startHeight + deltaY)); // Min 200px, Max 800px
|
||||
setContainerHeight(newHeight);
|
||||
};
|
||||
|
||||
const handleMouseUp = () => {
|
||||
document.removeEventListener('mousemove', handleMouseMove);
|
||||
document.removeEventListener('mouseup', handleMouseUp);
|
||||
};
|
||||
|
||||
document.addEventListener('mousemove', handleMouseMove);
|
||||
document.addEventListener('mouseup', handleMouseUp);
|
||||
};
|
||||
|
||||
// Escape key handler for maximized mode
|
||||
useEffect(() => {
|
||||
const handleEscapeKey = (e: KeyboardEvent) => {
|
||||
if (e.key === 'Escape' && isMaximized) {
|
||||
setIsMaximized(false);
|
||||
}
|
||||
};
|
||||
|
||||
if (isMaximized) {
|
||||
document.addEventListener('keydown', handleEscapeKey);
|
||||
// Prevent body from scrolling when maximized
|
||||
document.body.style.overflow = 'hidden';
|
||||
} else {
|
||||
document.body.style.overflow = '';
|
||||
}
|
||||
|
||||
return () => {
|
||||
document.removeEventListener('keydown', handleEscapeKey);
|
||||
document.body.style.overflow = '';
|
||||
};
|
||||
}, [isMaximized]);
|
||||
|
||||
// Set initial content when component mounts
|
||||
useEffect(() => {
|
||||
const div = visualDivRef.current;
|
||||
if (div && div.innerHTML !== value) {
|
||||
div.innerHTML = value || '';
|
||||
}
|
||||
}, []);
|
||||
|
||||
// Update div content when value changes externally (not from user typing)
|
||||
useEffect(() => {
|
||||
const div = visualDivRef.current;
|
||||
if (div && !isUserTyping && div.innerHTML !== value) {
|
||||
const cursorPosition = saveCursorPosition();
|
||||
div.innerHTML = value || '';
|
||||
if (cursorPosition) {
|
||||
setTimeout(() => restoreCursorPosition(cursorPosition), 0);
|
||||
}
|
||||
}
|
||||
}, [value, isUserTyping]);
|
||||
|
||||
// Preload sanitization config
|
||||
useEffect(() => {
|
||||
@@ -38,8 +153,16 @@ export default function RichTextEditor({
|
||||
const div = visualDivRef.current;
|
||||
if (div) {
|
||||
const newHtml = div.innerHTML;
|
||||
onChange(newHtml);
|
||||
setHtmlValue(newHtml);
|
||||
setIsUserTyping(true);
|
||||
|
||||
// Only call onChange if content actually changed
|
||||
if (newHtml !== value) {
|
||||
onChange(newHtml);
|
||||
setHtmlValue(newHtml);
|
||||
}
|
||||
|
||||
// Reset typing state after a short delay
|
||||
setTimeout(() => setIsUserTyping(false), 100);
|
||||
}
|
||||
};
|
||||
|
||||
@@ -155,8 +278,10 @@ export default function RichTextEditor({
|
||||
}
|
||||
|
||||
// Update the state
|
||||
setIsUserTyping(true);
|
||||
onChange(visualDiv.innerHTML);
|
||||
setHtmlValue(visualDiv.innerHTML);
|
||||
setTimeout(() => setIsUserTyping(false), 100);
|
||||
} else if (textarea) {
|
||||
// Fallback for textarea mode (shouldn't happen in visual mode but good to have)
|
||||
const start = textarea.selectionStart;
|
||||
@@ -213,8 +338,10 @@ export default function RichTextEditor({
|
||||
visualDiv.innerHTML += textAsHtml;
|
||||
}
|
||||
|
||||
setIsUserTyping(true);
|
||||
onChange(visualDiv.innerHTML);
|
||||
setHtmlValue(visualDiv.innerHTML);
|
||||
setTimeout(() => setIsUserTyping(false), 100);
|
||||
}
|
||||
} else {
|
||||
console.log('No usable clipboard content found');
|
||||
@@ -229,8 +356,10 @@ export default function RichTextEditor({
|
||||
.filter(paragraph => paragraph.trim())
|
||||
.map(paragraph => `<p>${paragraph.replace(/\n/g, '<br>')}</p>`)
|
||||
.join('\n');
|
||||
setIsUserTyping(true);
|
||||
onChange(value + textAsHtml);
|
||||
setHtmlValue(value + textAsHtml);
|
||||
setTimeout(() => setIsUserTyping(false), 100);
|
||||
}
|
||||
}
|
||||
};
|
||||
@@ -293,8 +422,10 @@ export default function RichTextEditor({
|
||||
}
|
||||
|
||||
// Update the state
|
||||
setIsUserTyping(true);
|
||||
onChange(visualDiv.innerHTML);
|
||||
setHtmlValue(visualDiv.innerHTML);
|
||||
setTimeout(() => setIsUserTyping(false), 100);
|
||||
}
|
||||
} else {
|
||||
// HTML mode - existing logic with improvements
|
||||
@@ -367,6 +498,17 @@ export default function RichTextEditor({
|
||||
</div>
|
||||
|
||||
<div className="flex items-center gap-1">
|
||||
<Button
|
||||
type="button"
|
||||
size="sm"
|
||||
variant="ghost"
|
||||
onClick={toggleMaximize}
|
||||
title={isMaximized ? "Minimize editor" : "Maximize editor"}
|
||||
className="font-mono"
|
||||
>
|
||||
{isMaximized ? "⊡" : "⊞"}
|
||||
</Button>
|
||||
<div className="w-px h-4 bg-gray-300 mx-1" />
|
||||
<Button
|
||||
type="button"
|
||||
size="sm"
|
||||
@@ -432,31 +574,160 @@ export default function RichTextEditor({
|
||||
</div>
|
||||
|
||||
{/* Editor */}
|
||||
<div className="border theme-border rounded-b-lg overflow-hidden">
|
||||
{viewMode === 'visual' ? (
|
||||
<div
|
||||
className={`relative border theme-border rounded-b-lg ${
|
||||
isMaximized ? 'fixed inset-4 z-50 bg-white dark:bg-gray-900 shadow-2xl' : ''
|
||||
}`}
|
||||
style={isMaximized ? {} : { height: containerHeight }}
|
||||
>
|
||||
<div
|
||||
ref={containerRef}
|
||||
className="h-full flex flex-col overflow-hidden"
|
||||
>
|
||||
{/* Maximized toolbar (shown when maximized) */}
|
||||
{isMaximized && (
|
||||
<div className="flex items-center justify-between p-2 theme-card border-b theme-border">
|
||||
<div className="flex items-center gap-2">
|
||||
<Button
|
||||
type="button"
|
||||
size="sm"
|
||||
variant="ghost"
|
||||
onClick={() => setViewMode('visual')}
|
||||
className={viewMode === 'visual' ? 'theme-accent-bg text-white' : ''}
|
||||
>
|
||||
Visual
|
||||
</Button>
|
||||
<Button
|
||||
type="button"
|
||||
size="sm"
|
||||
variant="ghost"
|
||||
onClick={() => setViewMode('html')}
|
||||
className={viewMode === 'html' ? 'theme-accent-bg text-white' : ''}
|
||||
>
|
||||
HTML
|
||||
</Button>
|
||||
</div>
|
||||
|
||||
<div className="flex items-center gap-1">
|
||||
<Button
|
||||
type="button"
|
||||
size="sm"
|
||||
variant="ghost"
|
||||
onClick={toggleMaximize}
|
||||
title="Minimize editor"
|
||||
className="font-mono"
|
||||
>
|
||||
⊡
|
||||
</Button>
|
||||
<div className="w-px h-4 bg-gray-300 mx-1" />
|
||||
<Button
|
||||
type="button"
|
||||
size="sm"
|
||||
variant="ghost"
|
||||
onClick={() => formatText('strong')}
|
||||
title="Bold"
|
||||
className="font-bold"
|
||||
>
|
||||
B
|
||||
</Button>
|
||||
<Button
|
||||
type="button"
|
||||
size="sm"
|
||||
variant="ghost"
|
||||
onClick={() => formatText('em')}
|
||||
title="Italic"
|
||||
className="italic"
|
||||
>
|
||||
I
|
||||
</Button>
|
||||
<div className="w-px h-4 bg-gray-300 mx-1" />
|
||||
<Button
|
||||
type="button"
|
||||
size="sm"
|
||||
variant="ghost"
|
||||
onClick={() => formatText('h1')}
|
||||
title="Heading 1"
|
||||
className="text-lg font-bold"
|
||||
>
|
||||
H1
|
||||
</Button>
|
||||
<Button
|
||||
type="button"
|
||||
size="sm"
|
||||
variant="ghost"
|
||||
onClick={() => formatText('h2')}
|
||||
title="Heading 2"
|
||||
className="text-base font-bold"
|
||||
>
|
||||
H2
|
||||
</Button>
|
||||
<Button
|
||||
type="button"
|
||||
size="sm"
|
||||
variant="ghost"
|
||||
onClick={() => formatText('h3')}
|
||||
title="Heading 3"
|
||||
className="text-sm font-bold"
|
||||
>
|
||||
H3
|
||||
</Button>
|
||||
<div className="w-px h-4 bg-gray-300 mx-1" />
|
||||
<Button
|
||||
type="button"
|
||||
size="sm"
|
||||
variant="ghost"
|
||||
onClick={() => formatText('p')}
|
||||
title="Paragraph"
|
||||
>
|
||||
P
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Editor content */}
|
||||
<div className="flex-1 overflow-hidden">
|
||||
{viewMode === 'visual' ? (
|
||||
<div className="relative h-full">
|
||||
<div
|
||||
ref={visualDivRef}
|
||||
contentEditable
|
||||
onInput={handleVisualContentChange}
|
||||
onPaste={handlePaste}
|
||||
className="p-3 h-full overflow-y-auto focus:outline-none focus:ring-0 whitespace-pre-wrap resize-none"
|
||||
suppressContentEditableWarning={true}
|
||||
/>
|
||||
{!value && (
|
||||
<div className="absolute top-3 left-3 text-gray-500 dark:text-gray-400 pointer-events-none select-none">
|
||||
{placeholder}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
) : (
|
||||
<Textarea
|
||||
value={htmlValue}
|
||||
onChange={handleHtmlChange}
|
||||
placeholder="<p>Write your HTML content here...</p>"
|
||||
className="border-0 rounded-none focus:ring-0 font-mono text-sm h-full resize-none"
|
||||
/>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div
ref={visualDivRef}
contentEditable
onInput={handleVisualContentChange}
onPaste={handlePaste}
className="p-3 min-h-[300px] focus:outline-none focus:ring-0 whitespace-pre-wrap"
style={{ minHeight: '300px' }}
dangerouslySetInnerHTML={{ __html: value || `<p>${placeholder}</p>` }}
suppressContentEditableWarning={true}
/>
) : (
<Textarea
value={htmlValue}
onChange={handleHtmlChange}
placeholder="<p>Write your HTML content here...</p>"
rows={12}
className="border-0 rounded-none focus:ring-0 font-mono text-sm"
/>
{/* Resize handle (only show when not maximized) */}
{!isMaximized && (
<div
onMouseDown={handleMouseDown}
className="absolute bottom-0 left-0 right-0 h-2 cursor-ns-resize bg-gray-200 dark:bg-gray-700 hover:bg-gray-300 dark:hover:bg-gray-600 transition-colors flex items-center justify-center"
title="Drag to resize"
>
<div className="w-8 h-0.5 bg-gray-400 dark:bg-gray-500 rounded-full"></div>
</div>
)}
|
||||
</div>
|
||||
|
||||
{/* Preview for HTML mode */}
|
||||
{viewMode === 'html' && value && (
|
||||
{/* Preview for HTML mode (only show when not maximized) */}
|
||||
{viewMode === 'html' && value && !isMaximized && (
|
||||
<div className="space-y-2">
|
||||
<h4 className="text-sm font-medium theme-header">Preview:</h4>
|
||||
<div
|
||||
@@ -480,6 +751,10 @@ export default function RichTextEditor({
|
||||
<strong>HTML mode:</strong> Edit HTML source directly for advanced formatting.
|
||||
Allowed tags: p, br, div, span, strong, em, b, i, u, s, h1-h6, ul, ol, li, blockquote, and more.
|
||||
</p>
|
||||
<p>
|
||||
<strong>Tips:</strong> Use the ⊞ button to maximize the editor for larger stories.
|
||||
Drag the resize handle at the bottom to adjust height. Press Escape to exit maximized mode.
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
|
||||
@@ -6,17 +6,21 @@ interface TagFilterProps {
|
||||
tags: Tag[];
|
||||
selectedTags: string[];
|
||||
onTagToggle: (tagName: string) => void;
|
||||
showCollectionCount?: boolean;
|
||||
}
|
||||
|
||||
export default function TagFilter({ tags, selectedTags, onTagToggle }: TagFilterProps) {
|
||||
export default function TagFilter({ tags, selectedTags, onTagToggle, showCollectionCount = false }: TagFilterProps) {
|
||||
if (!Array.isArray(tags) || tags.length === 0) return null;
|
||||
|
||||
// Filter out tags with no stories, then sort by usage count (descending) and then alphabetically
|
||||
// Filter out tags with no count, then sort by usage count (descending) and then alphabetically
|
||||
const sortedTags = [...tags]
|
||||
.filter(tag => (tag.storyCount || 0) > 0)
|
||||
.filter(tag => {
|
||||
const count = showCollectionCount ? (tag.collectionCount || 0) : (tag.storyCount || 0);
|
||||
return count > 0;
|
||||
})
|
||||
.sort((a, b) => {
|
||||
const aCount = a.storyCount || 0;
|
||||
const bCount = b.storyCount || 0;
|
||||
const aCount = showCollectionCount ? (a.collectionCount || 0) : (a.storyCount || 0);
|
||||
const bCount = showCollectionCount ? (b.collectionCount || 0) : (b.storyCount || 0);
|
||||
if (bCount !== aCount) {
|
||||
return bCount - aCount;
|
||||
}
|
||||
@@ -40,7 +44,7 @@ export default function TagFilter({ tags, selectedTags, onTagToggle }: TagFilter
|
||||
: 'theme-card theme-text theme-border hover:border-gray-400'
|
||||
}`}
|
||||
>
|
||||
{tag.name} ({tag.storyCount || 0})
|
||||
{tag.name} ({showCollectionCount ? (tag.collectionCount || 0) : (tag.storyCount || 0)})
|
||||
</button>
|
||||
);
|
||||
})}
|
||||
|
||||
98 frontend/src/components/ui/Dropdown.tsx Normal file
@@ -0,0 +1,98 @@
|
||||
'use client';
|
||||
|
||||
import { useState, useRef, useEffect } from 'react';
|
||||
import Link from 'next/link';
|
||||
import { ChevronDownIcon } from '@heroicons/react/24/outline';
|
||||
|
||||
interface DropdownItem {
|
||||
href: string;
|
||||
label: string;
|
||||
description?: string;
|
||||
}
|
||||
|
||||
interface DropdownProps {
|
||||
trigger: string;
|
||||
items: DropdownItem[];
|
||||
className?: string;
|
||||
onItemClick?: () => void;
|
||||
}
|
||||
|
||||
export default function Dropdown({ trigger, items, className = '', onItemClick }: DropdownProps) {
|
||||
const [isOpen, setIsOpen] = useState(false);
|
||||
const dropdownRef = useRef<HTMLDivElement>(null);
|
||||
const timeoutRef = useRef<NodeJS.Timeout>();
|
||||
|
||||
useEffect(() => {
|
||||
const handleClickOutside = (event: MouseEvent) => {
|
||||
if (dropdownRef.current && !dropdownRef.current.contains(event.target as Node)) {
|
||||
setIsOpen(false);
|
||||
}
|
||||
};
|
||||
|
||||
if (isOpen) {
|
||||
document.addEventListener('mousedown', handleClickOutside);
|
||||
}
|
||||
|
||||
return () => {
|
||||
document.removeEventListener('mousedown', handleClickOutside);
|
||||
if (timeoutRef.current) {
|
||||
clearTimeout(timeoutRef.current);
|
||||
}
|
||||
};
|
||||
}, [isOpen]);
|
||||
|
||||
const handleMouseEnter = () => {
|
||||
if (timeoutRef.current) {
|
||||
clearTimeout(timeoutRef.current);
|
||||
}
|
||||
setIsOpen(true);
|
||||
};
|
||||
|
||||
const handleMouseLeave = () => {
|
||||
timeoutRef.current = setTimeout(() => {
|
||||
setIsOpen(false);
|
||||
}, 150);
|
||||
};
|
||||
|
||||
const handleItemClick = () => {
|
||||
setIsOpen(false);
|
||||
onItemClick?.();
|
||||
};
|
||||
|
||||
return (
|
||||
<div
|
||||
className={`relative ${className}`}
|
||||
ref={dropdownRef}
|
||||
onMouseEnter={handleMouseEnter}
|
||||
onMouseLeave={handleMouseLeave}
|
||||
>
|
||||
<button
|
||||
onClick={() => setIsOpen(!isOpen)}
|
||||
className="theme-text hover:theme-accent transition-colors font-medium flex items-center gap-1"
|
||||
>
|
||||
{trigger}
|
||||
<ChevronDownIcon
|
||||
className={`h-4 w-4 transition-transform duration-200 ${isOpen ? 'rotate-180' : ''}`}
|
||||
/>
|
||||
</button>
|
||||
|
||||
{isOpen && (
|
||||
<div className="absolute top-full left-0 mt-1 w-64 theme-card theme-shadow border theme-border rounded-lg py-2 z-50">
|
||||
{items.map((item, index) => (
|
||||
<Link
|
||||
key={index}
|
||||
href={item.href}
|
||||
onClick={handleItemClick}
|
||||
className="block px-4 py-2 theme-text hover:theme-accent transition-colors"
|
||||
>
|
||||
<div className="font-medium">{item.label}</div>
|
||||
{item.description && (
|
||||
<div className="text-sm theme-text-secondary mt-1">{item.description}</div>
|
||||
)}
|
||||
</Link>
|
||||
))}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
);
|
||||
}
|
||||
@@ -32,7 +32,8 @@ export default function ImageUpload({
|
||||
if (rejection.errors?.[0]?.code === 'file-too-large') {
|
||||
setError(`File is too large. Maximum size is ${maxSizeMB}MB.`);
|
||||
} else if (rejection.errors?.[0]?.code === 'file-invalid-type') {
|
||||
setError('Invalid file type. Please select an image file.');
|
||||
const allowedTypes = accept.split(',').map(type => type.trim()).join(', ');
|
||||
setError(`Invalid file type. Supported formats: ${allowedTypes.replace(/image\//g, '').toUpperCase()}.`);
|
||||
} else {
|
||||
setError('File rejected. Please try another file.');
|
||||
}
|
||||
@@ -41,18 +42,31 @@ export default function ImageUpload({
|
||||
|
||||
const file = acceptedFiles[0];
|
||||
if (file) {
|
||||
// Additional client-side validation for file type
|
||||
const allowedTypes = accept.split(',').map(type => type.trim());
|
||||
if (!allowedTypes.includes(file.type)) {
|
||||
const supportedFormats = allowedTypes.map(type => type.replace('image/', '').toUpperCase()).join(', ');
|
||||
setError(`Invalid file type. Your file is ${file.type}. Supported formats: ${supportedFormats}.`);
|
||||
return;
|
||||
}
|
||||
|
||||
// Create preview
|
||||
const previewUrl = URL.createObjectURL(file);
|
||||
setPreview(previewUrl);
|
||||
onImageSelect(file);
|
||||
}
|
||||
}, [onImageSelect, maxSizeMB]);
|
||||
}, [onImageSelect, maxSizeMB, accept]);
|
||||
|
||||
// Build proper accept object for dropzone based on specific MIME types
|
||||
const acceptTypes = accept.split(',').map(type => type.trim());
|
||||
const dropzoneAccept = acceptTypes.reduce((acc, type) => {
|
||||
acc[type] = []; // Empty array means accept files with this MIME type
|
||||
return acc;
|
||||
}, {} as Record<string, string[]>);
|
||||
|
||||
const { getRootProps, getInputProps, isDragActive } = useDropzone({
|
||||
onDrop,
|
||||
accept: {
|
||||
'image/*': accept.split(',').map(type => type.trim()),
|
||||
},
|
||||
accept: dropzoneAccept,
|
||||
maxFiles: 1,
|
||||
maxSize: maxSizeMB * 1024 * 1024, // Convert MB to bytes
|
||||
});
|
||||
@@ -123,7 +137,7 @@ export default function ImageUpload({
|
||||
)}
|
||||
</div>
|
||||
<p className="text-sm text-gray-500">
|
||||
Supports JPEG, PNG, WebP up to {maxSizeMB}MB
|
||||
Supports {acceptTypes.map(type => type.replace('image/', '').toUpperCase()).join(', ')} up to {maxSizeMB}MB
|
||||
</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
@@ -1,7 +1,8 @@
|
||||
'use client';
|
||||
|
||||
import { createContext, useContext, useEffect, useState } from 'react';
|
||||
import { authApi } from '../lib/api';
|
||||
import { useRouter } from 'next/navigation';
|
||||
import { authApi, setGlobalAuthFailureHandler } from '../lib/api';
|
||||
import { preloadSanitizationConfig } from '../lib/sanitization';
|
||||
|
||||
interface AuthContextType {
|
||||
@@ -16,8 +17,18 @@ const AuthContext = createContext<AuthContextType | undefined>(undefined);
|
||||
export function AuthProvider({ children }: { children: React.ReactNode }) {
|
||||
const [isAuthenticated, setIsAuthenticated] = useState(false);
|
||||
const [loading, setLoading] = useState(true);
|
||||
const router = useRouter();
|
||||
|
||||
// Handle authentication failures from API calls
|
||||
const handleAuthFailure = () => {
|
||||
console.log('Authentication token expired, logging out user');
|
||||
setIsAuthenticated(false);
|
||||
router.push('/login');
|
||||
};
|
||||
|
||||
useEffect(() => {
|
||||
// Register the auth failure handler for API interceptor
|
||||
setGlobalAuthFailureHandler(handleAuthFailure);
|
||||
// Check if user is already authenticated on app load
|
||||
const checkAuth = async () => {
|
||||
try {
|
||||
@@ -42,7 +53,7 @@ export function AuthProvider({ children }: { children: React.ReactNode }) {
|
||||
|
||||
checkAuth();
|
||||
loadSanitizationConfig();
|
||||
}, []);
|
||||
}, [router]);
|
||||
|
||||
const login = async (password: string) => {
|
||||
try {
|
||||
@@ -57,6 +68,7 @@ export function AuthProvider({ children }: { children: React.ReactNode }) {
|
||||
const logout = () => {
|
||||
authApi.logout();
|
||||
setIsAuthenticated(false);
|
||||
router.push('/login');
|
||||
};
|
||||
|
||||
return (
|
||||
|
||||
@@ -21,15 +21,36 @@ api.interceptors.request.use((config) => {
|
||||
return config;
|
||||
});
|
||||
|
||||
// Global auth failure handler - can be set by AuthContext
|
||||
let globalAuthFailureHandler: (() => void) | null = null;
|
||||
|
||||
export const setGlobalAuthFailureHandler = (handler: () => void) => {
|
||||
globalAuthFailureHandler = handler;
|
||||
};
|
||||
|
||||
// Response interceptor to handle auth errors
|
||||
api.interceptors.response.use(
|
||||
(response) => response,
|
||||
(error) => {
|
||||
if (error.response?.status === 401) {
|
||||
// Clear invalid token and redirect to login
|
||||
// Handle authentication failures
|
||||
if (error.response?.status === 401 || error.response?.status === 403) {
|
||||
console.warn('Authentication failed, token may be expired or invalid');
|
||||
|
||||
// Clear invalid token
|
||||
localStorage.removeItem('auth-token');
|
||||
window.location.href = '/login';
|
||||
|
||||
// Use global handler if available (from AuthContext), otherwise fallback to direct redirect
|
||||
if (globalAuthFailureHandler) {
|
||||
globalAuthFailureHandler();
|
||||
} else {
|
||||
// Fallback for cases where AuthContext isn't available
|
||||
window.location.href = '/login';
|
||||
}
|
||||
|
||||
// Return a more specific error for components to handle gracefully
|
||||
return Promise.reject(new Error('Authentication required'));
|
||||
}
|
||||
|
||||
return Promise.reject(error);
|
||||
}
|
||||
);
|
||||
@@ -113,6 +134,11 @@ export const storyApi = {
|
||||
await api.post(`/stories/${id}/rating`, { rating });
|
||||
},
|
||||
|
||||
updateReadingProgress: async (id: string, position: number): Promise<Story> => {
|
||||
const response = await api.post(`/stories/${id}/reading-progress`, { position });
|
||||
return response.data;
|
||||
},
|
||||
|
||||
uploadCover: async (id: string, coverImage: File): Promise<{ imagePath: string }> => {
|
||||
const formData = new FormData();
|
||||
formData.append('file', coverImage);
|
||||
@@ -150,6 +176,22 @@ export const storyApi = {
|
||||
const response = await api.post('/stories/recreate-typesense-collection');
|
||||
return response.data;
|
||||
},
|
||||
|
||||
checkDuplicate: async (title: string, authorName: string): Promise<{
|
||||
hasDuplicates: boolean;
|
||||
count: number;
|
||||
duplicates: Array<{
|
||||
id: string;
|
||||
title: string;
|
||||
authorName: string;
|
||||
createdAt: string;
|
||||
}>;
|
||||
}> => {
|
||||
const response = await api.get('/stories/check-duplicate', {
|
||||
params: { title, authorName }
|
||||
});
|
||||
return response.data;
|
||||
},
|
||||
};
|
||||
|
||||
// Author endpoints
|
||||
@@ -240,6 +282,11 @@ export const tagApi = {
|
||||
// Backend returns TagDto[], extract just the names
|
||||
return response.data.map((tag: Tag) => tag.name);
|
||||
},
|
||||
|
||||
getCollectionTags: async (): Promise<Tag[]> => {
|
||||
const response = await api.get('/tags/collections');
|
||||
return response.data;
|
||||
},
|
||||
};
|
||||
|
||||
// Series endpoints
|
||||
@@ -272,6 +319,7 @@ export const searchApi = {
|
||||
maxRating?: number;
|
||||
sortBy?: string;
|
||||
sortDir?: string;
|
||||
facetBy?: string[];
|
||||
}): Promise<SearchResult> => {
|
||||
// Create URLSearchParams to properly handle array parameters
|
||||
const searchParams = new URLSearchParams();
|
||||
@@ -292,6 +340,9 @@ export const searchApi = {
|
||||
if (params.tags && params.tags.length > 0) {
|
||||
params.tags.forEach(tag => searchParams.append('tags', tag));
|
||||
}
|
||||
if (params.facetBy && params.facetBy.length > 0) {
|
||||
params.facetBy.forEach(facet => searchParams.append('facetBy', facet));
|
||||
}
|
||||
|
||||
const response = await api.get(`/stories/search?${searchParams.toString()}`);
|
||||
return response.data;
|
||||
@@ -447,6 +498,51 @@ export const collectionApi = {
|
||||
},
|
||||
};
|
||||
|
||||
// Database management endpoints
|
||||
export const databaseApi = {
|
||||
backup: async (): Promise<Blob> => {
|
||||
const response = await api.post('/database/backup', {}, {
|
||||
responseType: 'blob'
|
||||
});
|
||||
return response.data;
|
||||
},
|
||||
|
||||
restore: async (file: File): Promise<{ success: boolean; message: string }> => {
|
||||
const formData = new FormData();
|
||||
formData.append('file', file);
|
||||
const response = await api.post('/database/restore', formData, {
|
||||
headers: { 'Content-Type': 'multipart/form-data' },
|
||||
});
|
||||
return response.data;
|
||||
},
|
||||
|
||||
clear: async (): Promise<{ success: boolean; message: string; deletedRecords?: number }> => {
|
||||
const response = await api.post('/database/clear');
|
||||
return response.data;
|
||||
},
|
||||
|
||||
backupComplete: async (): Promise<Blob> => {
|
||||
const response = await api.post('/database/backup-complete', {}, {
|
||||
responseType: 'blob'
|
||||
});
|
||||
return response.data;
|
||||
},
|
||||
|
||||
restoreComplete: async (file: File): Promise<{ success: boolean; message: string }> => {
|
||||
const formData = new FormData();
|
||||
formData.append('file', file);
|
||||
const response = await api.post('/database/restore-complete', formData, {
|
||||
headers: { 'Content-Type': 'multipart/form-data' },
|
||||
});
|
||||
return response.data;
|
||||
},
|
||||
|
||||
clearComplete: async (): Promise<{ success: boolean; message: string; deletedRecords?: number }> => {
|
||||
const response = await api.post('/database/clear-complete');
|
||||
return response.data;
|
||||
},
|
||||
};
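
A hedged sketch of how a caller might turn the backup Blob from databaseApi.backup() into a browser download; the import path and filename pattern are assumptions, not taken from this diff:

```typescript
import { databaseApi } from '../lib/api'; // path assumed relative to the calling component

// Trigger a client-side download of the database backup returned as a Blob.
export async function downloadDatabaseBackup(): Promise<void> {
  const blob = await databaseApi.backup();
  const url = URL.createObjectURL(blob);
  const link = document.createElement('a');
  link.href = url;
  link.download = `storycove-backup-${Date.now()}.dump`; // assumed filename pattern
  document.body.appendChild(link);
  link.click();
  link.remove();
  URL.revokeObjectURL(url);
}
```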
|
||||
|
||||
// Image utility
|
||||
export const getImageUrl = (path: string): string => {
|
||||
if (!path) return '';
|
||||
|
||||
@@ -5,7 +5,8 @@ interface SanitizationConfig {
|
||||
allowedTags: string[];
|
||||
allowedAttributes: Record<string, string[]>;
|
||||
allowedCssProperties: string[];
|
||||
removedAttributes: Record<string, string[]>;
|
||||
removedAttributes?: Record<string, string[]>;
|
||||
allowedProtocols?: Record<string, Record<string, string[]>>;
|
||||
description: string;
|
||||
}
|
||||
|
||||
@@ -95,8 +96,10 @@ async function fetchSanitizationConfig(): Promise<SanitizationConfig> {
|
||||
'font-style', 'text-align', 'text-decoration', 'margin',
|
||||
'padding', 'text-indent', 'line-height'
|
||||
],
|
||||
removedAttributes: {
|
||||
'a': ['href', 'target']
|
||||
allowedProtocols: {
|
||||
'a': {
|
||||
'href': ['http', 'https', '#', '/']
|
||||
}
|
||||
},
|
||||
description: 'Fallback sanitization configuration'
|
||||
};
|
||||
@@ -114,10 +117,10 @@ function createDOMPurifyConfig(config: SanitizationConfig) {
|
||||
const allowedTags = config.allowedTags;
|
||||
const allowedAttributes: Record<string, string[]> = { ...config.allowedAttributes };
|
||||
|
||||
// Remove attributes that should be stripped (like href from links)
|
||||
// Remove attributes that should be stripped (deprecated, keeping for backward compatibility)
|
||||
if (config.removedAttributes) {
|
||||
Object.keys(config.removedAttributes).forEach(tag => {
|
||||
const attributesToRemove = config.removedAttributes[tag];
|
||||
const attributesToRemove = config.removedAttributes![tag];
|
||||
if (allowedAttributes[tag]) {
|
||||
allowedAttributes[tag] = allowedAttributes[tag].filter(
|
||||
attr => !attributesToRemove.includes(attr)
|
||||
@@ -132,9 +135,20 @@ function createDOMPurifyConfig(config: SanitizationConfig) {
|
||||
const flattenedAttributes = Object.values(allowedAttributes).flat();
|
||||
const uniqueAttributes = Array.from(new Set(flattenedAttributes));
|
||||
|
||||
// Configure allowed protocols for URL validation
|
||||
const allowedSchemes: string[] = [];
|
||||
if (config.allowedProtocols) {
|
||||
Object.values(config.allowedProtocols).forEach(attributeProtocols => {
|
||||
Object.values(attributeProtocols).forEach(protocols => {
|
||||
allowedSchemes.push(...protocols);
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
const domPurifyConfig: DOMPurify.Config = {
|
||||
ALLOWED_TAGS: allowedTags,
|
||||
ALLOWED_ATTR: uniqueAttributes,
|
||||
ALLOWED_URI_REGEXP: /^(?:(?:https?|#|\/):?\/?)[\w.\-#/?=&%]+$/i,
|
||||
ALLOW_UNKNOWN_PROTOCOLS: false,
|
||||
SANITIZE_DOM: true,
|
||||
KEEP_CONTENT: true,
|
||||
|
||||
348 frontend/src/lib/scraper/config/sites.json Normal file
@@ -0,0 +1,348 @@
|
||||
{
|
||||
"sites": {
|
||||
"deviantart.com": {
|
||||
"story": {
|
||||
"title": "h1",
|
||||
"titleFallback": "meta[property='og:title']",
|
||||
"titleFallbackAttribute": "content",
|
||||
"author": {
|
||||
"strategy": "text-pattern",
|
||||
"pattern": "by ([^\\s]+) on DeviantArt",
|
||||
"searchAfter": "<title>",
|
||||
"searchBefore": "</title>"
|
||||
},
|
||||
"content": {
|
||||
"strategy": "deviantart-content",
|
||||
"minLength": 200,
|
||||
"containerHints": ["journal", "literature", "story", "text", "content"],
|
||||
"excludeSelectors": ["script", "style", "nav", "header", "footer", ".dev-page-sidebar"]
|
||||
},
|
||||
"summary": "meta[property='og:description']",
|
||||
"summaryAttribute": "content",
|
||||
"tags": "a[data-tagname]",
|
||||
"tagsAttribute": "data-tagname",
|
||||
"coverImage": "meta[property='og:image']",
|
||||
"coverImageAttribute": "content"
|
||||
},
|
||||
"authorPage": {
|
||||
"storyLinks": "a[data-hook='deviation_link']",
|
||||
"filterStrategy": "dom-check",
|
||||
"requiresChildElement": "div[class*='journal']"
|
||||
}
|
||||
},
|
||||
|
||||
"literotica.com": {
|
||||
"story": {
|
||||
"title": "h1",
|
||||
"titleFallback": "meta[property='og:title']",
|
||||
"titleFallbackAttribute": "content",
|
||||
"author": {
|
||||
"strategy": "link-with-path",
|
||||
"pathContains": "/authors/",
|
||||
"searchWithin": "header, .story-info, #story-meta"
|
||||
},
|
||||
"content": {
|
||||
"strategy": "text-blocks",
|
||||
"minLength": 500,
|
||||
"containerHints": ["story", "content", "text"],
|
||||
"excludeSelectors": ["script", "style", "nav", "header", "footer"]
|
||||
},
|
||||
"summary": "meta[name='description']",
|
||||
"summaryAttribute": "content",
|
||||
"multiPage": {
|
||||
"enabled": true,
|
||||
"strategy": "url-pattern",
|
||||
"pageParam": "page",
|
||||
"maxPages": 20
|
||||
}
|
||||
},
|
||||
"authorPage": {
|
||||
"storyLinks": {
|
||||
"strategy": "href-pattern",
|
||||
"pattern": "/s/[^/]+$",
|
||||
"searchWithin": "main, #content, .stories-list"
|
||||
}
|
||||
}
|
||||
},
|
||||
|
||||
"mcstories.com": {
|
||||
"story": {
|
||||
"title": "title",
|
||||
"titleTransform": "remove-suffix: - MCStories.com",
|
||||
"author": "meta[name='dcterms.creator']",
|
||||
"authorAttribute": "content",
|
||||
"content": "article#mcstories",
|
||||
"summary": "meta[name='dcterms.description']",
|
||||
"summaryAttribute": "content"
|
||||
},
|
||||
"authorPage": {
|
||||
"storyLinks": "a[href$='.html']:not([href*='Authors'])",
|
||||
"linkPrefix": "https://mcstories.com/"
|
||||
}
|
||||
},
|
||||
|
||||
"docs-lab.com": {
|
||||
"story": {
|
||||
"title": "title",
|
||||
"titleTransform": "remove-suffix: - Doc's Lab",
|
||||
"author": "a[href*='/profiles/'] strong",
|
||||
"content": {
|
||||
"strategy": "html-between",
|
||||
"startMarker": "<h2>Story</h2>",
|
||||
"endMarker": "</div>",
|
||||
"includeStart": false
|
||||
},
|
||||
"tags": "span.label"
|
||||
},
|
||||
"authorPage": {
|
||||
"storyLinks": "a[href*='/submissions/']",
|
||||
"linkPrefix": "https://docs-lab.com"
|
||||
}
|
||||
},
|
||||
|
||||
"archiveofourown.org": {
|
||||
"story": {
|
||||
"title": "h2.title",
|
||||
"author": "a[rel='author']",
|
||||
"content": {
|
||||
"strategy": "chapters",
|
||||
"chapterSelector": "div.userstuff[role='article']",
|
||||
"chaptersWrapper": "#chapters",
|
||||
"singleChapter": "#workskin"
|
||||
},
|
||||
"summary": "div.summary blockquote.userstuff",
|
||||
"tags": {
|
||||
"strategy": "multiple-types",
|
||||
"selectors": {
|
||||
"fandom": "dd.fandom a.tag",
|
||||
"warning": "dd.warning a.tag",
|
||||
"category": "dd.category a.tag",
|
||||
"relationship": "dd.relationship a.tag",
|
||||
"character": "dd.character a.tag",
|
||||
"freeform": "dd.freeform a.tag"
|
||||
}
|
||||
},
|
||||
"multiPage": {
|
||||
"enabled": true,
|
||||
"strategy": "chapter-navigation",
|
||||
"chapterListSelector": "#chapter_index option",
|
||||
"urlPattern": "/chapters/{chapterId}"
|
||||
}
|
||||
},
|
||||
"authorPage": {
|
||||
"storyLinks": "h4.heading a[href*='/works/']",
|
||||
"pagination": {
|
||||
"enabled": true,
|
||||
"nextPageSelector": "li.next a[rel='next']"
|
||||
}
|
||||
}
|
||||
},
|
||||
|
||||
"fanfiction.net": {
|
||||
"story": {
|
||||
"title": "#profile_top b.xcontrast_txt",
|
||||
"author": "#profile_top a[href*='/u/']",
|
||||
"content": "#storytext",
|
||||
"summary": "#profile_top div.xcontrast_txt",
|
||||
"coverImage": {
|
||||
"strategy": "lazy-loaded",
|
||||
"selector": "img.cimage",
|
||||
"attribute": "data-original"
|
||||
},
|
||||
"multiPage": {
|
||||
"enabled": true,
|
||||
"strategy": "chapter-dropdown",
|
||||
"chapterSelector": "select#chap_select option",
|
||||
"urlPattern": "{baseUrl}/{chapterNumber}"
|
||||
}
|
||||
},
|
||||
"authorPage": {
|
||||
"storyLinks": "div.z-list a.stitle",
|
||||
"metadata": {
|
||||
"strategy": "sibling-text",
|
||||
"metadataSelector": "div.z-padtop2",
|
||||
"parsePattern": "Rated: ([^-]+) - .+ - Chapters: (\\d+)"
|
||||
}
|
||||
}
|
||||
},
|
||||
|
||||
"royalroad.com": {
|
||||
"story": {
|
||||
"title": "h1[property='name']",
|
||||
"author": "h4[property='author'] a",
|
||||
"content": {
|
||||
"strategy": "chapter-content",
|
||||
"selector": "div.chapter-content",
|
||||
"cleanupSelectors": [".portlet", ".ads-holder", "div[style*='display:none']"]
|
||||
},
|
||||
"summary": "div.description div.hidden-content",
|
||||
"tags": "span.tags a.fiction-tag",
|
||||
"coverImage": "img.thumbnail",
|
||||
"coverImageAttribute": "src",
|
||||
"multiPage": {
|
||||
"enabled": true,
|
||||
"strategy": "table-of-contents",
|
||||
"tocSelector": "table#chapters tbody tr a[href*='/chapter/']",
|
||||
"requiresAuth": false
|
||||
}
|
||||
},
|
||||
"authorPage": {
|
||||
"storyLinks": "div.fiction-list-item h2.fiction-title a",
|
||||
"additionalInfo": {
|
||||
"strategy": "data-attributes",
|
||||
"statsSelector": "div.stats",
|
||||
"extractStats": ["pages", "followers", "views"]
|
||||
}
|
||||
}
|
||||
},
|
||||
|
||||
"wattpad.com": {
|
||||
"story": {
|
||||
"title": "h1",
|
||||
"author": {
|
||||
"strategy": "schema-org",
|
||||
"schemaType": "Person",
|
||||
"property": "name",
|
||||
"fallbackSelector": "a[href*='/user/']"
|
||||
},
|
||||
"content": {
|
||||
"strategy": "react-content",
|
||||
"contentClass": "pre-wrap",
|
||||
"paragraphSelector": "p[data-p-id]",
|
||||
"requiresJavaScript": true
|
||||
},
|
||||
"summary": "h2.description",
|
||||
"tags": "div.tag-items a.tag",
|
||||
"coverImage": {
|
||||
"strategy": "responsive-image",
|
||||
"selector": "img[alt*='cover']",
|
||||
"srcsetAttribute": "srcset",
|
||||
"selectLargest": true
|
||||
},
|
||||
"multiPage": {
|
||||
"enabled": true,
|
||||
"strategy": "api-based",
|
||||
"apiPattern": "/v4/parts/{partId}/text",
|
||||
"tocApiPattern": "/v5/stories/{storyId}/parts",
|
||||
"requiresAuth": true
|
||||
}
|
||||
},
|
||||
"authorPage": {
|
||||
"storyLinks": {
|
||||
"strategy": "infinite-scroll",
|
||||
"initialSelector": "a[href*='/story/']",
|
||||
"apiEndpoint": "/v4/users/{userId}/stories",
|
||||
"requiresJavaScript": true
|
||||
}
|
||||
}
|
||||
},
|
||||
|
||||
"wanderinginn.com": {
|
||||
"story": {
|
||||
"title": "h1.entry-title",
|
||||
"author": "pirateaba",
|
||||
"content": ".entry-content",
|
||||
"summary": "meta[property='og:description']",
|
||||
"summaryAttribute": "content"
|
||||
}
|
||||
}
|
||||
},
|
||||
|
||||
"strategies": {
|
||||
"text-blocks": {
|
||||
"description": "Find content by looking for large text blocks",
|
||||
"implementation": "Find all text nodes, group by parent, select parent with most text"
|
||||
},
|
||||
"link-with-path": {
|
||||
"description": "Find links containing specific path patterns",
|
||||
"implementation": "querySelector with href*= or iterate and check .href property"
|
||||
},
|
||||
"href-pattern": {
|
||||
"description": "Match links by regex pattern",
|
||||
"implementation": "Array.from(links).filter(a => pattern.test(a.href))"
|
||||
},
|
||||
"text-pattern": {
|
||||
"description": "Extract text using regex from raw HTML",
|
||||
"implementation": "Use regex on .html() with proper groups"
|
||||
},
|
||||
"html-between": {
|
||||
"description": "Extract HTML between markers",
|
||||
"implementation": "indexOf() to find positions, substring to extract"
|
||||
},
|
||||
"chapters": {
|
||||
"description": "Extract story content that may be in chapters",
|
||||
"implementation": "Check for multiple chapters or single chapter format"
|
||||
},
|
||||
"multiple-types": {
|
||||
"description": "Extract different categories of tags",
|
||||
"implementation": "Map over selector types and extract each category"
|
||||
},
|
||||
"chapter-navigation": {
|
||||
"description": "Navigate through chapters using chapter index",
|
||||
"implementation": "Extract chapter IDs and construct URLs"
|
||||
},
|
||||
"lazy-loaded": {
|
||||
"description": "Extract images that are lazy-loaded",
|
||||
"implementation": "Check data-* attributes for actual image source"
|
||||
},
|
||||
"chapter-dropdown": {
|
||||
"description": "Handle stories with chapter selection dropdown",
|
||||
"implementation": "Parse dropdown options and construct chapter URLs"
|
||||
},
|
||||
"table-of-contents": {
|
||||
"description": "Extract chapters from a table of contents",
|
||||
"implementation": "Find all chapter links in TOC structure"
|
||||
},
|
||||
"schema-org": {
|
||||
"description": "Extract data from schema.org structured data",
|
||||
"implementation": "Parse JSON-LD or microdata for specific properties"
|
||||
},
|
||||
"react-content": {
|
||||
"description": "Extract content from React-rendered pages",
|
||||
"implementation": "May require JavaScript execution or API access"
|
||||
},
|
||||
"responsive-image": {
|
||||
"description": "Select best quality from responsive images",
|
||||
"implementation": "Parse srcset and select highest resolution"
|
||||
},
|
||||
"api-based": {
|
||||
"description": "Use API endpoints instead of HTML scraping",
|
||||
"implementation": "Detect API patterns and make direct API calls"
|
||||
},
|
||||
"infinite-scroll": {
|
||||
"description": "Handle pages with infinite scroll",
|
||||
"implementation": "Detect scroll API endpoints or pagination"
|
||||
}
|
||||
},
|
||||
|
||||
"globalOptions": {
|
||||
"userAgent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
|
||||
"timeout": 30000,
|
||||
"retryAttempts": 3,
|
||||
"rateLimitMs": 1000,
|
||||
"cacheDuration": 300000,
|
||||
"javascriptTimeout": 10000
|
||||
},
|
||||
|
||||
"siteNotes": {
|
||||
"wattpad.com": {
|
||||
"warning": "Wattpad has aggressive anti-scraping measures. Consider using their API if available.",
|
||||
"requiresAuth": "Some stories may require login to access full content"
|
||||
},
|
||||
"royalroad.com": {
|
||||
"note": "Very scraper-friendly with good HTML structure"
|
||||
},
|
||||
"archiveofourown.org": {
|
||||
"note": "Respects robots.txt, has good semantic HTML",
|
||||
"rateLimit": "Be extra respectful of rate limits"
|
||||
},
|
||||
"fanfiction.net": {
|
||||
"note": "Older site with simpler HTML structure",
|
||||
"warning": "Known to block IPs for aggressive scraping"
|
||||
},
|
||||
"wanderinginn.com": {
|
||||
"note": "WordPress-based site with consistent structure",
|
||||
"author": "All stories by pirateaba - uses text pattern matching for content extraction"
|
||||
}
|
||||
}
|
||||
}
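
The "strategies" section above describes each extraction approach only in prose. As an illustration, a minimal sketch of the "text-blocks" idea (pick the container holding the most text) might look like the following; the function name and parameters are assumptions, not the shipped implementation in strategies/textExtractor.ts:

```typescript
import * as cheerio from 'cheerio';

// Pick the element whose descendants contain the most text, after stripping noise.
export function pickLargestTextBlock(
  html: string,
  minLength = 500,
  excludeSelectors: string[] = ['script', 'style', 'nav', 'header', 'footer']
): string {
  const $ = cheerio.load(html);
  excludeSelectors.forEach(selector => $(selector).remove());

  let bestHtml = '';
  let bestLength = 0;

  $('article, main, section, div').each((_, elem) => {
    const length = $(elem).text().trim().length;
    if (length > bestLength) {
      bestLength = length;
      bestHtml = $(elem).html() || '';
    }
  });

  return bestLength >= minLength ? bestHtml : '';
}
```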
|
||||
382 frontend/src/lib/scraper/scraper.ts Normal file
@@ -0,0 +1,382 @@
|
||||
import 'server-only';
|
||||
|
||||
// Note: cheerio import is done dynamically to avoid client-side bundling issues
|
||||
// Using any type for CheerioAPI to prevent bundling issues
|
||||
import {
|
||||
SitesConfig,
|
||||
SiteConfig,
|
||||
ScrapedStory,
|
||||
ScrapedAuthorStory,
|
||||
SelectorStrategy,
|
||||
MultiPageConfig,
|
||||
ScraperError
|
||||
} from './types';
|
||||
import { RateLimiter } from './utils/rateLimit';
|
||||
import { ScraperCache } from './utils/cache';
|
||||
import { UrlParser } from './utils/urlParser';
|
||||
import {
|
||||
extractByTextPattern,
|
||||
extractTextBlocks,
|
||||
extractDeviantArtContent,
|
||||
extractHtmlBetween,
|
||||
extractLinkText,
|
||||
extractLinkWithPath,
|
||||
extractHrefPattern,
|
||||
extractFirstImage,
|
||||
extractResponsiveImage,
|
||||
extractLazyLoadedImage,
|
||||
extractChapters,
|
||||
extractChapterContent,
|
||||
extractMultipleTypes,
|
||||
extractSchemaOrg,
|
||||
extractReactContent,
|
||||
cleanHtml,
|
||||
extractAttribute
|
||||
} from './strategies';
|
||||
import sitesConfig from './config/sites.json';
|
||||
|
||||
export class StoryScraper {
|
||||
private config: SitesConfig;
|
||||
private cache: ScraperCache;
|
||||
private rateLimiter: RateLimiter;
|
||||
|
||||
constructor() {
|
||||
this.config = sitesConfig as SitesConfig;
|
||||
this.cache = new ScraperCache(this.config.globalOptions.cacheDuration);
|
||||
this.rateLimiter = new RateLimiter(this.config.globalOptions.rateLimitMs);
|
||||
}
|
||||
|
||||
async scrapeStory(url: string): Promise<ScrapedStory> {
|
||||
try {
|
||||
if (!UrlParser.validateUrl(url)) {
|
||||
throw new Error(`Invalid URL: ${url}`);
|
||||
}
|
||||
|
||||
const domain = UrlParser.getDomain(url);
|
||||
const siteConfig = this.config.sites[domain];
|
||||
|
||||
if (!siteConfig) {
|
||||
throw new Error(`Unsupported site: ${domain}`);
|
||||
}
|
||||
|
||||
const html = await this.fetchWithCache(url);
|
||||
const cheerio = await import('cheerio');
|
||||
const $ = cheerio.load(html);
|
||||
|
||||
const story: ScrapedStory = {
|
||||
title: await this.extractFieldWithFallback($, siteConfig.story, 'title', html),
|
||||
author: await this.extractFieldWithFallback($, siteConfig.story, 'author', html),
|
||||
content: await this.extractContent($, siteConfig.story, url, html),
|
||||
sourceUrl: url
|
||||
};
|
||||
|
||||
// Extract optional fields
|
||||
if (siteConfig.story.summary) {
|
||||
story.summary = await this.extractField($, siteConfig.story.summary, html, siteConfig.story.summaryAttribute);
|
||||
}
|
||||
|
||||
if (siteConfig.story.coverImage) {
|
||||
story.coverImage = await this.extractField($, siteConfig.story.coverImage, html, siteConfig.story.coverImageAttribute);
|
||||
}
|
||||
|
||||
if (siteConfig.story.tags) {
|
||||
const tagsResult = await this.extractTags($, siteConfig.story.tags, html, siteConfig.story.tagsAttribute);
|
||||
if (Array.isArray(tagsResult)) {
|
||||
story.tags = tagsResult;
|
||||
} else if (typeof tagsResult === 'string' && tagsResult) {
|
||||
story.tags = [tagsResult];
|
||||
}
|
||||
}
|
||||
|
||||
// Apply post-processing
|
||||
story.title = this.applyTransforms(story.title, siteConfig.story.titleTransform);
|
||||
story.content = await cleanHtml(story.content);
|
||||
|
||||
return story;
|
||||
} catch (error) {
|
||||
if (error instanceof Error) {
|
||||
throw new ScraperError(
|
||||
`Failed to scrape ${url}: ${error.message}`,
|
||||
url,
|
||||
error
|
||||
);
|
||||
}
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async scrapeAuthorPage(url: string): Promise<ScrapedAuthorStory[]> {
|
||||
try {
|
||||
if (!UrlParser.validateUrl(url)) {
|
||||
throw new Error(`Invalid URL: ${url}`);
|
||||
}
|
||||
|
||||
const domain = UrlParser.getDomain(url);
|
||||
const siteConfig = this.config.sites[domain];
|
||||
|
||||
if (!siteConfig || !siteConfig.authorPage) {
|
||||
throw new Error(`Author page scraping not supported for: ${domain}`);
|
||||
}
|
||||
|
||||
const html = await this.fetchWithCache(url);
|
||||
const cheerio = await import('cheerio');
|
||||
const $ = cheerio.load(html);
|
||||
|
||||
const storyLinks = await this.extractField($, siteConfig.authorPage.storyLinks, html);
|
||||
const stories: ScrapedAuthorStory[] = [];
|
||||
|
||||
if (Array.isArray(storyLinks)) {
|
||||
for (const link of storyLinks) {
|
||||
const storyUrl = UrlParser.normalizeUrl(link, url);
|
||||
try {
|
||||
const scrapedStory = await this.scrapeStory(storyUrl);
|
||||
stories.push({
|
||||
url: storyUrl,
|
||||
title: scrapedStory.title,
|
||||
author: scrapedStory.author,
|
||||
summary: scrapedStory.summary
|
||||
});
|
||||
} catch (error) {
|
||||
console.warn(`Failed to scrape story ${storyUrl}:`, error);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return stories;
|
||||
} catch (error) {
|
||||
if (error instanceof Error) {
|
||||
throw new ScraperError(
|
||||
`Failed to scrape author page ${url}: ${error.message}`,
|
||||
url,
|
||||
error
|
||||
);
|
||||
}
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
private async extractFieldWithFallback(
|
||||
$: any,
|
||||
config: any,
|
||||
fieldName: string,
|
||||
html: string
|
||||
): Promise<string> {
|
||||
const primarySelector = config[fieldName];
|
||||
const fallbackSelector = config[`${fieldName}Fallback`];
|
||||
const attribute = config[`${fieldName}Attribute`];
|
||||
const fallbackAttribute = config[`${fieldName}FallbackAttribute`];
|
||||
|
||||
// Try primary selector first
|
||||
if (primarySelector) {
|
||||
const result = await this.extractField($, primarySelector, html, attribute);
|
||||
if (result && result.trim()) {
|
||||
return result.trim();
|
||||
}
|
||||
}
|
||||
|
||||
// Try fallback selector if primary failed
|
||||
if (fallbackSelector) {
|
||||
const result = await this.extractField($, fallbackSelector, html, fallbackAttribute);
|
||||
if (result && result.trim()) {
|
||||
return result.trim();
|
||||
}
|
||||
}
|
||||
|
||||
return '';
|
||||
}
|
||||
|
||||
private async extractField(
|
||||
$: any,
|
||||
selector: string | SelectorStrategy,
|
||||
html: string,
|
||||
attribute?: string
|
||||
): Promise<any> {
|
||||
if (typeof selector === 'string') {
|
||||
// Simple CSS selector - always return single value (first element)
|
||||
const element = $(selector).first();
|
||||
if (attribute) {
|
||||
// Extract specific attribute instead of text
|
||||
return element.attr(attribute) || '';
|
||||
}
|
||||
return element.text().trim();
|
||||
}
|
||||
|
||||
// Strategy-based extraction
|
||||
return await this.executeStrategy($, selector, html);
|
||||
}
|
||||
|
||||
private async extractTags(
|
||||
$: any,
|
||||
selector: string | SelectorStrategy,
|
||||
html: string,
|
||||
attribute?: string
|
||||
): Promise<any> {
|
||||
if (typeof selector === 'string') {
|
||||
// Simple CSS selector - collect ALL matching elements for tags
|
||||
const elements = $(selector);
|
||||
|
||||
if (elements.length === 0) {
|
||||
return [];
|
||||
}
|
||||
|
||||
const results: string[] = [];
|
||||
elements.each((_: any, elem: any) => {
|
||||
const $elem = $(elem);
|
||||
const value = attribute ? $elem.attr(attribute) : $elem.text().trim();
|
||||
if (value) {
|
||||
results.push(value);
|
||||
}
|
||||
});
|
||||
|
||||
return results;
|
||||
}
|
||||
|
||||
// Strategy-based extraction for tags
|
||||
return await this.executeStrategy($, selector, html);
|
||||
}
|
||||
|
||||
private async executeStrategy(
|
||||
$: any,
|
||||
strategy: SelectorStrategy,
|
||||
html: string
|
||||
): Promise<any> {
|
||||
switch (strategy.strategy) {
|
||||
case 'text-pattern':
|
||||
return extractByTextPattern(html, strategy as any);
|
||||
case 'link-with-path':
|
||||
return extractLinkWithPath($, strategy as any);
|
||||
case 'text-blocks':
|
||||
return extractTextBlocks($, strategy as any);
|
||||
case 'deviantart-content':
|
||||
return extractDeviantArtContent($, strategy as any);
|
||||
case 'href-pattern':
|
||||
return extractHrefPattern($, strategy as any);
|
||||
case 'html-between':
|
||||
return extractHtmlBetween(html, strategy as any);
|
||||
case 'link-text':
|
||||
return extractLinkText($, strategy as any);
|
||||
case 'first-image':
|
||||
return extractFirstImage($, strategy as any);
|
||||
case 'responsive-image':
|
||||
return extractResponsiveImage($, strategy as any);
|
||||
case 'lazy-loaded':
|
||||
return extractLazyLoadedImage($, strategy as any);
|
||||
case 'chapters':
|
||||
return extractChapters($, strategy as any);
|
||||
case 'chapter-content':
|
||||
return extractChapterContent($, strategy as any);
|
||||
case 'multiple-types':
|
||||
return extractMultipleTypes($, strategy as any);
|
||||
case 'schema-org':
|
||||
return extractSchemaOrg($, strategy as any);
|
||||
case 'react-content':
|
||||
return extractReactContent($, strategy as any);
|
||||
default:
|
||||
throw new Error(`Unknown strategy: ${strategy.strategy}`);
|
||||
}
|
||||
}
|
||||
|
||||
private async extractContent(
|
||||
$: any,
|
||||
storyConfig: any,
|
||||
url: string,
|
||||
html: string
|
||||
): Promise<string> {
|
||||
let content = await this.extractField($, storyConfig.content, html);
|
||||
|
||||
if (storyConfig.multiPage?.enabled) {
|
||||
const additionalPages = await this.fetchAdditionalPages(
|
||||
$,
|
||||
url,
|
||||
storyConfig.multiPage
|
||||
);
|
||||
|
||||
for (const pageHtml of additionalPages) {
|
||||
const cheerioPage = await import('cheerio');
|
||||
const $page = cheerioPage.load(pageHtml);
|
||||
const pageContent = await this.extractField(
|
||||
$page,
|
||||
storyConfig.content,
|
||||
pageHtml
|
||||
);
|
||||
content += '\n\n' + pageContent;
|
||||
}
|
||||
}
|
||||
|
||||
return content;
|
||||
}
|
||||
|
||||
private async fetchAdditionalPages(
|
||||
$: any,
|
||||
baseUrl: string,
|
||||
config: MultiPageConfig
|
||||
): Promise<string[]> {
|
||||
const pages: string[] = [];
|
||||
let currentUrl = baseUrl;
|
||||
let pageNum = 2;
|
||||
|
||||
while (pageNum <= (config.maxPages || 20)) {
|
||||
let nextUrl: string | null = null;
|
||||
|
||||
if (config.strategy === 'url-pattern') {
|
||||
nextUrl = UrlParser.buildPageUrl(baseUrl, pageNum, config);
|
||||
} else if (config.nextPageSelector) {
|
||||
const nextLink = $(config.nextPageSelector).attr('href');
|
||||
if (nextLink) {
|
||||
nextUrl = UrlParser.normalizeUrl(nextLink, currentUrl);
|
||||
}
|
||||
}
|
||||
|
||||
if (!nextUrl) break;
|
||||
|
||||
try {
|
||||
await this.rateLimiter.throttle();
|
||||
const html = await this.fetchWithCache(nextUrl);
|
||||
pages.push(html);
|
||||
currentUrl = nextUrl;
|
||||
pageNum++;
|
||||
} catch (error) {
|
||||
console.error(`Failed to fetch page ${pageNum}:`, error);
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
return pages;
|
||||
}
|
||||
|
||||
private async fetchWithCache(url: string): Promise<string> {
|
||||
const cached = this.cache.get(url);
|
||||
if (cached) {
|
||||
return cached;
|
||||
}
|
||||
|
||||
await this.rateLimiter.throttle();
|
||||
|
||||
const response = await fetch(url, {
|
||||
headers: {
|
||||
'User-Agent': this.config.globalOptions.userAgent,
|
||||
},
|
||||
signal: AbortSignal.timeout(this.config.globalOptions.timeout)
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
throw new Error(`HTTP ${response.status}: ${response.statusText}`);
|
||||
}
|
||||
|
||||
const html = await response.text();
|
||||
this.cache.set(url, html);
|
||||
|
||||
return html;
|
||||
}
|
||||
|
||||
private applyTransforms(text: string, transform?: string): string {
|
||||
if (!transform) return text;
|
||||
|
||||
if (transform.startsWith('remove-suffix:')) {
|
||||
const suffix = transform.substring('remove-suffix:'.length).trim();
|
||||
return text.replace(new RegExp(`${suffix}$`, 'i'), '').trim();
|
||||
}
|
||||
|
||||
return text;
|
||||
}
|
||||
}
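
The scraper above depends on RateLimiter and ScraperCache from ./utils, which are not included in this diff. A minimal sketch matching the constructor arguments and method calls used above; the bodies are assumptions, not the actual utilities:

```typescript
// Assumed shape of ./utils/rateLimit and ./utils/cache based on how StoryScraper uses them.
export class RateLimiter {
  private lastRequest = 0;

  constructor(private minIntervalMs: number) {}

  // Resolve once at least minIntervalMs has passed since the previous request.
  async throttle(): Promise<void> {
    const wait = Math.max(0, this.lastRequest + this.minIntervalMs - Date.now());
    if (wait > 0) {
      await new Promise(resolve => setTimeout(resolve, wait));
    }
    this.lastRequest = Date.now();
  }
}

export class ScraperCache {
  private entries = new Map<string, { html: string; storedAt: number }>();

  constructor(private ttlMs: number) {}

  get(url: string): string | null {
    const entry = this.entries.get(url);
    if (!entry) return null;
    if (Date.now() - entry.storedAt > this.ttlMs) {
      this.entries.delete(url);
      return null;
    }
    return entry.html;
  }

  set(url: string, html: string): void {
    this.entries.set(url, { html, storedAt: Date.now() });
  }
}
```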
|
||||
164 frontend/src/lib/scraper/strategies/contentCleaner.ts Normal file
@@ -0,0 +1,164 @@
|
||||
// Dynamic cheerio import used to avoid client-side bundling issues
|
||||
// Using any type for CheerioAPI to prevent bundling issues
|
||||
import {
|
||||
ChaptersStrategy,
|
||||
ChapterContentStrategy,
|
||||
MultipleTypesStrategy,
|
||||
SchemaOrgStrategy,
|
||||
ReactContentStrategy
|
||||
} from '../types';
|
||||
|
||||
export function extractChapters(
|
||||
$: any,
|
||||
config: ChaptersStrategy
|
||||
): string {
|
||||
// Check for multiple chapters first
|
||||
if (config.chaptersWrapper) {
|
||||
const chaptersWrapper = $(config.chaptersWrapper);
|
||||
if (chaptersWrapper.length > 0) {
|
||||
const chapters = chaptersWrapper.find(config.chapterSelector);
|
||||
if (chapters.length > 1) {
|
||||
// Multiple chapters - combine them
|
||||
let content = '';
|
||||
chapters.each((_: any, elem: any) => {
|
||||
content += $(elem).html() + '\n\n';
|
||||
});
|
||||
return content.trim();
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Single chapter fallback
|
||||
if (config.singleChapter) {
|
||||
const singleChapter = $(config.singleChapter);
|
||||
if (singleChapter.length > 0) {
|
||||
return singleChapter.html() || '';
|
||||
}
|
||||
}
|
||||
|
||||
// Direct chapter selector fallback
|
||||
const chapter = $(config.chapterSelector).first();
|
||||
return chapter.html() || '';
|
||||
}
|
||||
|
||||
export function extractChapterContent(
|
||||
$: any,
|
||||
config: ChapterContentStrategy
|
||||
): string {
|
||||
const content = $(config.selector);
|
||||
|
||||
// Remove cleanup selectors
|
||||
if (config.cleanupSelectors) {
|
||||
config.cleanupSelectors.forEach(selector => {
|
||||
content.find(selector).remove();
|
||||
});
|
||||
}
|
||||
|
||||
return content.html() || '';
|
||||
}
|
||||
|
||||
export function extractMultipleTypes(
|
||||
$: any,
|
||||
config: MultipleTypesStrategy
|
||||
): string[] {
|
||||
const tags: string[] = [];
|
||||
|
||||
Object.entries(config.selectors).forEach(([type, selector]) => {
|
||||
$(selector).each((_: any, elem: any) => {
|
||||
const tag = $(elem).text().trim();
|
||||
if (tag) {
|
||||
tags.push(`${type}: ${tag}`);
|
||||
}
|
||||
});
|
||||
});
|
||||
|
||||
return tags;
|
||||
}
|
||||
|
||||
export function extractSchemaOrg(
|
||||
$: any,
|
||||
config: SchemaOrgStrategy
|
||||
): string {
|
||||
// Look for JSON-LD first; capture the match and break out of .each(),
// because returning a value from inside the .each() callback does not
// return it from extractSchemaOrg itself.
let result = '';
$('script[type="application/ld+json"]').each((_: any, elem: any) => {
try {
const data = JSON.parse($(elem).html() || '');
const item = Array.isArray(data)
? data.find(entry => entry['@type'] === config.schemaType)
: (data['@type'] === config.schemaType ? data : null);
if (item && item[config.property]) {
result = item[config.property];
return false; // stop iterating
}
} catch (e) {
// Invalid JSON, continue
}
});
if (result) {
return result;
}
|
||||
|
||||
// Fallback to selector
|
||||
if (config.fallbackSelector) {
|
||||
return $(config.fallbackSelector).first().text().trim();
|
||||
}
|
||||
|
||||
return '';
|
||||
}
|
||||
|
||||
export function extractReactContent(
|
||||
$: any,
|
||||
config: ReactContentStrategy
|
||||
): string {
|
||||
// This is a simplified version - full React content extraction
|
||||
// would require JavaScript execution or API access
|
||||
|
||||
const contentElements = $(config.paragraphSelector);
|
||||
let content = '';
|
||||
|
||||
contentElements.each((_: any, elem: any) => {
|
||||
const $elem = $(elem);
|
||||
if ($elem.hasClass(config.contentClass)) {
|
||||
content += $elem.html() + '\n\n';
|
||||
}
|
||||
});
|
||||
|
||||
return content.trim();
|
||||
}
|
||||
|
||||
export async function cleanHtml(html: string): Promise<string> {
|
||||
// Basic HTML cleaning - remove scripts, styles, and dangerous elements
|
||||
const cheerio = await import('cheerio');
|
||||
const $ = cheerio.load(html, {
|
||||
// Preserve self-closing tags like <br>
|
||||
xmlMode: false,
|
||||
decodeEntities: false
|
||||
});
|
||||
|
||||
// Remove dangerous elements
|
||||
$('script, style, iframe, embed, object').remove();
|
||||
|
||||
// Remove empty paragraphs and divs (but preserve <br> tags)
|
||||
$('p:empty, div:empty').not(':has(br)').remove();
|
||||
|
||||
// Clean up excessive whitespace in text nodes only, preserve <br> tags
|
||||
$('*').each((_, elem) => {
|
||||
const $elem = $(elem);
|
||||
if (elem.type === 'text') {
|
||||
const text = $elem.text();
|
||||
if (text && text.trim() !== text) {
|
||||
$elem.replaceWith(text.trim());
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
// Return HTML with proper self-closing tag format
|
||||
return $.html() || '';
|
||||
}
|
||||
|
||||
export function extractAttribute(
|
||||
$: any,
|
||||
selector: string,
|
||||
attribute: string
|
||||
): string {
|
||||
const element = $(selector).first();
|
||||
return element.attr(attribute) || '';
|
||||
}
|
||||
3 frontend/src/lib/scraper/strategies/index.ts Normal file
@@ -0,0 +1,3 @@
|
||||
export * from './textExtractor';
|
||||
export * from './linkExtractor';
|
||||
export * from './contentCleaner';
|
||||
98 frontend/src/lib/scraper/strategies/linkExtractor.ts Normal file
@@ -0,0 +1,98 @@
// Dynamic cheerio import used to avoid client-side bundling issues
// Using any type for CheerioAPI to prevent bundling issues
import {
  LinkWithPathStrategy,
  HrefPatternStrategy,
  FirstImageStrategy,
  ResponsiveImageStrategy,
  LazyLoadedStrategy
} from '../types';

export function extractLinkWithPath(
  $: any,
  config: LinkWithPathStrategy
): string {
  let searchScope = config.searchWithin ? $(config.searchWithin) : $('body');

  const links = searchScope.find('a');

  for (let i = 0; i < links.length; i++) {
    const link = links.eq(i);
    const href = link.attr('href');

    if (href && href.includes(config.pathContains)) {
      return link.text().trim();
    }
  }

  return '';
}
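
A minimal sketch of the link-with-path strategy: it returns the text of the first anchor whose href contains the given substring (markup and config values are illustrative):

```typescript
import * as cheerio from 'cheerio';
import { extractLinkWithPath } from './linkExtractor';

const $ = cheerio.load(`
  <div class="meta">
    <a href="/tags/fantasy">Fantasy</a>
    <a href="/user/jane-doe">Jane Doe</a>
  </div>`);

const author = extractLinkWithPath($, {
  strategy: 'link-with-path',
  pathContains: '/user/',
  searchWithin: '.meta'
});
// author === 'Jane Doe'
```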

export function extractHrefPattern(
  $: any,
  config: HrefPatternStrategy
): string[] {
  let searchScope = config.searchWithin ? $(config.searchWithin) : $('body');

  const pattern = new RegExp(config.pattern);
  const links: string[] = [];

  searchScope.find('a').each((_: any, elem: any) => {
    const href = $(elem).attr('href');
    if (href && pattern.test(href)) {
      links.push(href);
    }
  });

  return links;
}
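
The href-pattern strategy collects every link whose href matches a regular expression, which is how an author page's story list would typically be gathered (example markup and pattern are illustrative):

```typescript
import * as cheerio from 'cheerio';
import { extractHrefPattern } from './linkExtractor';

const $ = cheerio.load(`
  <a href="/story/123-first-tale">First Tale</a>
  <a href="/story/456-second-tale">Second Tale</a>
  <a href="/about">About</a>`);

const storyLinks = extractHrefPattern($, {
  strategy: 'href-pattern',
  pattern: '^/story/\\d+'
});
// storyLinks === ['/story/123-first-tale', '/story/456-second-tale']
```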

export function extractFirstImage(
  $: any,
  config: FirstImageStrategy
): string {
  let searchScope = config.searchWithin ? $(config.searchWithin) : $('body');

  const img = searchScope.find('img').first();
  return img.attr(config.attribute) || '';
}

export function extractResponsiveImage(
  $: any,
  config: ResponsiveImageStrategy
): string {
  const img = $(config.selector).first();

  if (config.selectLargest && config.srcsetAttribute) {
    const srcset = img.attr(config.srcsetAttribute);
    if (srcset) {
      // Parse srcset and return the largest image
      const sources = srcset.split(',').map((src: string) => {
        const parts = src.trim().split(' ');
        const url = parts[0];
        const descriptor = parts[1] || '1x';
        const width = descriptor.includes('w') ?
          parseInt(descriptor.replace('w', '')) :
          descriptor.includes('x') ?
            parseInt(descriptor.replace('x', '')) * 100 : 100;
        return { url, width };
      });

      const largest = sources.reduce((prev: any, current: any) =>
        prev.width > current.width ? prev : current
      );

      return largest.url;
    }
  }

  return img.attr('src') || '';
}
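
To illustrate the srcset handling above: width descriptors (`w`) are compared directly, while density descriptors (`x`) are scaled by 100 before comparison, so the widest candidate wins (markup is illustrative):

```typescript
import * as cheerio from 'cheerio';
import { extractResponsiveImage } from './linkExtractor';

const $ = cheerio.load(`
  <img class="cover"
       src="/covers/small.jpg"
       srcset="/covers/small.jpg 320w, /covers/medium.jpg 640w, /covers/large.jpg 1280w">`);

const coverUrl = extractResponsiveImage($, {
  strategy: 'responsive-image',
  selector: 'img.cover',
  srcsetAttribute: 'srcset',
  selectLargest: true
});
// coverUrl === '/covers/large.jpg'
```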

export function extractLazyLoadedImage(
  $: any,
  config: LazyLoadedStrategy
): string {
  const img = $(config.selector).first();
  return img.attr(config.attribute) || img.attr('src') || '';
}
203
frontend/src/lib/scraper/strategies/textExtractor.ts
Normal file
@@ -0,0 +1,203 @@
import * as cheerio from 'cheerio';
import 'server-only';

// Server-only module: cheerio is imported statically here, so the
// CheerioAPI types from the package can be used directly.
import {
  TextPatternStrategy,
  TextBlockStrategy,
  HtmlBetweenStrategy,
  LinkTextStrategy
} from '../types';

export function extractByTextPattern(
  html: string,
  config: TextPatternStrategy
): string {
  let searchContent = html;

  // Limit search scope if specified
  if (config.searchAfter) {
    const afterIndex = html.indexOf(config.searchAfter);
    if (afterIndex !== -1) {
      searchContent = html.substring(afterIndex);
    }
  }

  if (config.searchBefore) {
    const beforeIndex = searchContent.indexOf(config.searchBefore);
    if (beforeIndex !== -1) {
      searchContent = searchContent.substring(0, beforeIndex);
    }
  }

  const regex = new RegExp(config.pattern, 'i');
  const match = searchContent.match(regex);
  // Guard against a pattern that matches but lacks the requested capture group
  const captured = match ? match[config.group || 1] : undefined;
  return captured ? captured.trim() : '';
}
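
Since this module imports 'server-only', these helpers are meant to run on the server. A small sketch of the text-pattern strategy, pulling a value out of raw HTML with a capture group (pattern and markup are illustrative):

```typescript
import { extractByTextPattern } from './textExtractor';

const html = '<div class="stats">Words: 12,345 &middot; Published: 2024</div>';

const wordCount = extractByTextPattern(html, {
  strategy: 'text-pattern',
  pattern: 'Words:\\s*([\\d,]+)',
  group: 1,
  searchBefore: 'Published:'
});
// wordCount === '12,345'
```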

export function extractTextBlocks(
  $: cheerio.CheerioAPI,
  config: TextBlockStrategy
): string {
  const blocks: Array<{element: any, text: string}> = [];

  // Remove excluded elements first
  if (config.excludeSelectors) {
    config.excludeSelectors.forEach(selector => {
      $(selector).remove();
    });
  }

  $('*').each((_, elem) => {
    const $elem = $(elem);
    const text = $elem.clone().children().remove().end().text().trim();

    if (text.length >= (config.minLength || 500)) {
      blocks.push({ element: elem, text });
    }
  });

  // Nothing qualified as a candidate block
  if (blocks.length === 0) {
    return '';
  }

  // Find the block that likely contains story content
  const storyBlock = blocks.find(block => {
    if (config.containerHints && config.containerHints.length > 0) {
      const hasHints = config.containerHints.some(hint =>
        $(block.element).attr('class')?.includes(hint) ||
        $(block.element).attr('id')?.includes(hint)
      );
      return hasHints;
    }
    return blocks.length === 1;
  });

  if (storyBlock) {
    return $(storyBlock.element).html() || '';
  }

  // Fallback to largest block (reduce would throw on an empty array,
  // hence the early return above)
  const largestBlock = blocks.reduce((prev, current) =>
    prev.text.length > current.text.length ? prev : current
  );

  return largestBlock ? $(largestBlock.element).html() || '' : '';
}
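
A sketch of how the text-blocks heuristic is driven: it keeps elements whose own text exceeds minLength, prefers a block whose class or id matches one of the containerHints, and otherwise falls back to the largest block (the markup and config values here are invented for illustration and meant to run server-side):

```typescript
import * as cheerio from 'cheerio';
import { extractTextBlocks } from './textExtractor';

const $ = cheerio.load(`
  <div id="sidebar">${'ad '.repeat(40)}</div>
  <div class="story-content">${'Once upon a time... '.repeat(40)}</div>`);

const content = extractTextBlocks($, {
  strategy: 'text-blocks',
  minLength: 100,
  containerHints: ['story'],
  excludeSelectors: ['#sidebar']
});
// content is the inner HTML of div.story-content
```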

export function extractDeviantArtContent(
  $: cheerio.CheerioAPI,
  config: TextBlockStrategy
): string {
  // Remove excluded elements first
  if (config.excludeSelectors) {
    config.excludeSelectors.forEach(selector => {
      $(selector).remove();
    });
  }

  // DeviantArt has two main content structures:
  // 1. Old format: <div class="text"> containing the full story
  // 2. New format: <div class="_83r8m _2CKTq"> or similar classes containing multiple <p> elements

  // Try the old format first (single text div)
  const textDiv = $('.text');
  if (textDiv.length > 0 && textDiv.text().trim().length >= (config.minLength || 200)) {
    return textDiv.html() || '';
  }

  // Try the new format (multiple paragraphs in specific containers)
  const newFormatSelectors = [
    'div[class*="_83r8m"] p',  // Main story content container
    'div[class*="_2CKTq"] p',  // Alternate story content container
    'div[class*="journal"] p'  // Generic journal container
  ];

  for (const selector of newFormatSelectors) {
    const paragraphs = $(selector);
    if (paragraphs.length > 0) {
      let totalText = '';
      paragraphs.each((_, p) => {
        totalText += $(p).text().trim();
      });

      // Check if this container has enough content
      if (totalText.length >= (config.minLength || 200)) {
        // Combine all paragraphs into a single HTML string
        let combinedHtml = '';
        paragraphs.each((_, p) => {
          combinedHtml += $(p).prop('outerHTML') || '';
        });
        return combinedHtml;
      }
    }
  }

  // Fallback to the original text-blocks strategy
  return extractTextBlocks($, config);
}

export function extractHtmlBetween(
  html: string,
  config: HtmlBetweenStrategy
): string {
  const startIndex = html.indexOf(config.startMarker);
  if (startIndex === -1) return '';

  const contentStart = config.includeStart ?
    startIndex :
    startIndex + config.startMarker.length;

  const endIndex = html.indexOf(config.endMarker, contentStart);
  if (endIndex === -1) {
    return html.substring(contentStart);
  }

  return html.substring(contentStart, endIndex).trim();
}
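
The html-between strategy is plain string slicing between two markers, which suits sites whose story body sits between fixed comment markers (the markers shown here are illustrative):

```typescript
import { extractHtmlBetween } from './textExtractor';

const page =
  '<header>...</header><!-- story-start --><p>Chapter one.</p><!-- story-end --><footer>...</footer>';

const body = extractHtmlBetween(page, {
  strategy: 'html-between',
  startMarker: '<!-- story-start -->',
  endMarker: '<!-- story-end -->'
});
// body === '<p>Chapter one.</p>'
```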

export function extractLinkText(
  $: cheerio.CheerioAPI,
  config: LinkTextStrategy
): string {
  let searchScope: cheerio.Cheerio<cheerio.AnyNode>;

  if (config.searchWithin) {
    searchScope = $(config.searchWithin);
  } else {
    searchScope = $('body').length ? $('body') : $('*');
  }

  // Look for links near the specified text patterns
  let foundText = '';

  config.nearText.forEach(text => {
    if (foundText) return; // Already found

    searchScope.find('*').each((_, elem) => {
      const $elem = $(elem);
      const elemText = $elem.text().toLowerCase();

      if (elemText.includes(text.toLowerCase())) {
        // Look for nearby links
        const $link = $elem.find('a').first();
        if ($link.length) {
          foundText = $link.text().trim();
          return false; // Break out of each
        }

        // Check if the element itself is a link
        if ($elem.is('a')) {
          foundText = $elem.text().trim();
          return false;
        }

        // Look for links in the next few siblings
        const $siblings = $elem.nextAll().slice(0, 3);
        $siblings.find('a').first().each((_, link) => {
          foundText = $(link).text().trim();
          return false;
        });
      }
    });
  });

  return foundText;
}
249
frontend/src/lib/scraper/types.ts
Normal file
@@ -0,0 +1,249 @@
export interface SiteConfig {
  story: StorySelectors;
  authorPage?: AuthorPageSelectors;
}

export interface StorySelectors {
  title: string | SelectorStrategy;
  author: string | SelectorStrategy;
  content: string | SelectorStrategy;
  summary?: string | SelectorStrategy;
  coverImage?: string | SelectorStrategy;
  tags?: string | SelectorStrategy;
  multiPage?: MultiPageConfig;
  titleFallback?: string;
  titleFallbackAttribute?: string;
  contentFallback?: string;
  titleTransform?: string;
  summaryAttribute?: string;
  coverImageAttribute?: string;
  tagsAttribute?: string;
}

export interface AuthorPageSelectors {
  storyLinks: string | SelectorStrategy;
  pagination?: PaginationConfig;
  linkPrefix?: string;
  filterStrategy?: string;
  requiresChildElement?: string;
  requiresNavigation?: NavigationConfig;
  metadata?: MetadataConfig;
  additionalInfo?: AdditionalInfoConfig;
}

export interface SelectorStrategy {
  strategy: string;
  [key: string]: any;
}

export interface MultiPageConfig {
  enabled: boolean;
  strategy: 'url-pattern' | 'next-link' | 'chapter-navigation' | 'chapter-dropdown' | 'table-of-contents' | 'api-based';
  nextPageSelector?: string;
  pageParam?: string;
  maxPages?: number;
  chapterListSelector?: string;
  chapterSelector?: string;
  urlPattern?: string;
  tocSelector?: string;
  requiresAuth?: boolean;
  apiPattern?: string;
  tocApiPattern?: string;
}
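
Taken together, these types let a site entry in the scraper configuration mix plain CSS selector strings with strategy objects. A hypothetical entry (site name, selectors, and URL pattern are invented for illustration; import path assumed relative to frontend/src/lib/scraper) might look like:

```typescript
import { SiteConfig } from './types';

const exampleSite: SiteConfig = {
  story: {
    // Plain CSS selectors...
    title: 'h1.story-title',
    author: 'a.author-name',
    // ...or a strategy object where a simple selector is not enough.
    content: {
      strategy: 'text-blocks',
      minLength: 500,
      containerHints: ['story', 'chapter'],
      excludeSelectors: ['.comments', '.author-note']
    },
    coverImage: 'img.cover',
    coverImageAttribute: 'src',
    multiPage: {
      enabled: true,
      strategy: 'next-link',
      nextPageSelector: 'a.next-chapter',
      maxPages: 50
    }
  },
  authorPage: {
    storyLinks: {
      strategy: 'href-pattern',
      pattern: '^/story/\\d+'
    }
  }
};
```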

export interface PaginationConfig {
  enabled: boolean;
  nextPageSelector: string;
}

export interface NavigationConfig {
  enabled: boolean;
  clickText: string;
  waitMs: number;
}

export interface MetadataConfig {
  strategy: string;
  metadataSelector: string;
  parsePattern: string;
}

export interface AdditionalInfoConfig {
  strategy: string;
  statsSelector: string;
  extractStats: string[];
}

export interface ScrapedStory {
  title: string;
  author: string;
  content: string;
  summary?: string;
  coverImage?: string;
  tags?: string[];
  sourceUrl: string;
}

export interface ScrapedAuthorStory {
  url: string;
  title: string;
  author: string;
  summary?: string;
}

export interface SitesConfig {
  sites: Record<string, SiteConfig>;
  strategies: Record<string, StrategyDescription>;
  globalOptions: GlobalOptions;
  siteNotes?: Record<string, SiteNotes>;
}

export interface StrategyDescription {
  description: string;
  implementation: string;
}

export interface GlobalOptions {
  userAgent: string;
  timeout: number;
  retryAttempts: number;
  rateLimitMs: number;
  cacheDuration?: number;
  javascriptTimeout?: number;
}

export interface SiteNotes {
  warning?: string;
  note?: string;
  rateLimit?: string;
  requiresAuth?: string;
}

// Strategy-specific interfaces
export interface TextPatternStrategy extends SelectorStrategy {
  strategy: 'text-pattern';
  pattern: string;
  group?: number;
  searchAfter?: string;
  searchBefore?: string;
}

export interface LinkWithPathStrategy extends SelectorStrategy {
  strategy: 'link-with-path';
  pathContains: string;
  searchWithin?: string;
}

export interface TextBlockStrategy extends SelectorStrategy {
  strategy: 'text-blocks';
  minLength?: number;
  containerHints?: string[];
  excludeSelectors?: string[];
}

export interface HrefPatternStrategy extends SelectorStrategy {
  strategy: 'href-pattern';
  pattern: string;
  searchWithin?: string;
}

export interface HtmlBetweenStrategy extends SelectorStrategy {
  strategy: 'html-between';
  startMarker: string;
  endMarker: string;
  includeStart?: boolean;
}

export interface ChaptersStrategy extends SelectorStrategy {
  strategy: 'chapters';
  chapterSelector: string;
  chaptersWrapper?: string;
  singleChapter?: string;
}

export interface MultipleTypesStrategy extends SelectorStrategy {
  strategy: 'multiple-types';
  selectors: Record<string, string>;
}

export interface LinkTextStrategy extends SelectorStrategy {
  strategy: 'link-text';
  nearText: string[];
  searchWithin?: string;
}

export interface FirstImageStrategy extends SelectorStrategy {
  strategy: 'first-image';
  searchWithin: string;
  attribute: string;
}

export interface SchemaOrgStrategy extends SelectorStrategy {
  strategy: 'schema-org';
  schemaType: string;
  property: string;
  fallbackSelector?: string;
}

export interface ReactContentStrategy extends SelectorStrategy {
  strategy: 'react-content';
  contentClass: string;
  paragraphSelector: string;
  requiresJavaScript: boolean;
}

export interface ResponsiveImageStrategy extends SelectorStrategy {
  strategy: 'responsive-image';
  selector: string;
  srcsetAttribute: string;
  selectLargest: boolean;
}

export interface LazyLoadedStrategy extends SelectorStrategy {
  strategy: 'lazy-loaded';
  selector: string;
  attribute: string;
}

export interface ChapterContentStrategy extends SelectorStrategy {
  strategy: 'chapter-content';
  selector: string;
  cleanupSelectors?: string[];
}

export interface DataAttributesStrategy extends SelectorStrategy {
  strategy: 'data-attributes';
  statsSelector: string;
  extractStats: string[];
}

export interface SiblingTextStrategy extends SelectorStrategy {
  strategy: 'sibling-text';
  metadataSelector: string;
  parsePattern: string;
}

export interface ApiBasedStrategy extends SelectorStrategy {
  strategy: 'api-based';
  apiPattern: string;
  tocApiPattern?: string;
  requiresAuth: boolean;
}

export interface InfiniteScrollStrategy extends SelectorStrategy {
  strategy: 'infinite-scroll';
  initialSelector: string;
  apiEndpoint: string;
  requiresJavaScript: boolean;
}

export class ScraperError extends Error {
  constructor(
    message: string,
    public url: string,
    public originalError?: Error
  ) {
    super(message);
    this.name = 'ScraperError';
  }
}
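
ScraperError carries the failing URL and the underlying error alongside the message, so callers can report exactly which page broke. A minimal sketch (the fetch wrapper and messages are illustrative, assuming a runtime with global fetch):

```typescript
import { ScraperError } from './types';

async function fetchPage(url: string): Promise<string> {
  try {
    const res = await fetch(url);
    if (!res.ok) {
      throw new ScraperError(`Unexpected status ${res.status}`, url);
    }
    return await res.text();
  } catch (err) {
    if (err instanceof ScraperError) throw err;
    throw new ScraperError('Failed to fetch page', url, err as Error);
  }
}

// Callers can then surface the failing URL and the root cause:
// catch (e) { if (e instanceof ScraperError) console.error(e.url, e.message, e.originalError); }
```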
Some files were not shown because too many files have changed in this diff.