3 Commits

Author | SHA1 | Message | Date
Stefan Hardegger | aae8f8926b | removing typesense | 2025-09-20 14:39:51 +02:00
Stefan Hardegger | f1773873d4 | Full parallel implementation of typesense and opensearch | 2025-09-20 09:40:09 +02:00
Stefan Hardegger | 54df3c471e | phase 1 | 2025-09-18 07:46:10 +02:00
50 changed files with 8628 additions and 3949 deletions

View File

@@ -14,11 +14,18 @@ JWT_SECRET=secure_jwt_secret_here
 # Application Authentication
 APP_PASSWORD=application_password_here
+# Search Engine Configuration
+SEARCH_ENGINE=typesense
 # Typesense Search Configuration
 TYPESENSE_API_KEY=secure_api_key_here
 TYPESENSE_ENABLED=true
 TYPESENSE_REINDEX_INTERVAL=3600000
+# OpenSearch Configuration
+OPENSEARCH_USERNAME=admin
+OPENSEARCH_PASSWORD=secure_opensearch_password_here
 # Image Storage
 IMAGE_STORAGE_PATH=/app/images

View File

@@ -18,10 +18,9 @@ JWT_SECRET=REPLACE_WITH_SECURE_JWT_SECRET_MINIMUM_32_CHARS
 # Use a strong password in production
 APP_PASSWORD=REPLACE_WITH_SECURE_APP_PASSWORD
-# Typesense Search Configuration
-TYPESENSE_API_KEY=REPLACE_WITH_SECURE_TYPESENSE_API_KEY
-TYPESENSE_ENABLED=true
-TYPESENSE_REINDEX_INTERVAL=3600000
+# OpenSearch Configuration
+OPENSEARCH_PASSWORD=REPLACE_WITH_SECURE_OPENSEARCH_PASSWORD
+SEARCH_ENGINE=opensearch
 # Image Storage
 IMAGE_STORAGE_PATH=/app/images

View File

@@ -0,0 +1,889 @@
# StoryCove Search Migration Specification: Typesense to OpenSearch
## Executive Summary
This document specifies the migration from Typesense to OpenSearch for the StoryCove application. The migration will be implemented using a parallel approach: Typesense functionality is maintained while the application gradually transitions to OpenSearch, ensuring zero downtime and the ability to roll back if needed.
**Migration Goals:**
- Solve random query reliability issues
- Improve complex filtering performance
- Maintain feature parity during transition
- Zero downtime migration
- Improved developer experience
---
## Current State Analysis
### Typesense Implementation Overview
**Service Architecture:**
- `TypesenseService.java` (~2000 lines) - Primary search service
- 3 search indexes: Stories, Authors, Collections
- Multi-library support with dynamic collection names
- Integration with Spring Boot backend
**Core Functionality:**
1. **Full-text Search**: Stories, Authors with complex query building
2. **Random Story Selection**: `_rand()` function with fallback logic
3. **Advanced Filtering**: 15+ filter conditions with boolean logic
4. **Faceting**: Tag aggregations and counts
5. **Autocomplete**: Search suggestions with typeahead
6. **CRUD Operations**: Index/update/delete for all entity types
**Current Issues Identified:**
- `_rand()` function unreliability requiring complex fallback logic
- Complex filter query building with escaping issues
- Limited aggregation capabilities
- Inconsistent API behavior across query patterns
- Multi-collection management complexity
### Data Models and Schema
**Story Index Fields:**
```java
// Core fields
UUID id, String title, String description, String sourceUrl
Integer wordCount, Integer rating, Integer volume
Boolean isRead, LocalDateTime lastReadAt, Integer readingPosition
// Relationships
UUID authorId, String authorName
UUID seriesId, String seriesName
List<String> tagNames
// Metadata
LocalDateTime createdAt, LocalDateTime updatedAt
String coverPath, String sourceDomain
```
**Author Index Fields:**
```java
UUID id, String name, String notes
Integer authorRating, Double averageStoryRating, Integer storyCount
List<String> urls, String avatarImagePath
LocalDateTime createdAt, LocalDateTime updatedAt
```
**Collection Index Fields:**
```java
UUID id, String name, String description
List<String> tagNames, Boolean archived
LocalDateTime createdAt, LocalDateTime updatedAt
Integer storyCount, Integer currentPosition
```
### API Endpoints Current State
**Search Endpoints Analysis:**
**✅ USED by Frontend (Must Implement):**
- `GET /api/stories/search` - Main story search with complex filtering (CRITICAL)
- `GET /api/stories/random` - Random story selection with filters (CRITICAL)
- `GET /api/authors/search-typesense` - Author search (HIGH)
- `GET /api/tags/autocomplete` - Tag suggestions (MEDIUM)
- `POST /api/stories/reindex-typesense` - Admin reindex operations (MEDIUM)
- `POST /api/authors/reindex-typesense` - Admin reindex operations (MEDIUM)
- `POST /api/stories/recreate-typesense-collection` - Admin recreate (MEDIUM)
- `POST /api/authors/recreate-typesense-collection` - Admin recreate (MEDIUM)
**❌ UNUSED by Frontend (Skip Implementation):**
- `GET /api/stories/search/suggestions` - Not used by frontend
- `GET /api/authors/search` - Superseded by typesense version
- `GET /api/series/search` - Not used by frontend
- `GET /api/tags/search` - Superseded by autocomplete
- `POST /api/search/reindex` - Not used by frontend
- `GET /api/search/health` - Not used by frontend
**Scope Reduction: ~40% fewer endpoints to implement**
**Search Parameters (Stories):**
```
query, page, size, authors[], tags[], minRating, maxRating
sortBy, sortDir, facetBy[]
minWordCount, maxWordCount, createdAfter, createdBefore
lastReadAfter, lastReadBefore, unratedOnly, readingStatus
hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter
minTagCount, popularOnly, hiddenGemsOnly
```
---
## Target OpenSearch Architecture
### Service Layer Design
**New Components:**
```
OpenSearchService.java - Primary search service (mirrors TypesenseService API)
OpenSearchConfig.java - Configuration and client setup
SearchMigrationService.java - Handles parallel operation during migration
SearchServiceAdapter.java - Abstraction layer for service switching
```
**Index Strategy:**
- **Single-node deployment** for development/small installations
- **Index-per-library** approach: `stories-{libraryId}`, `authors-{libraryId}`, `collections-{libraryId}`
- **Index templates** for consistent mapping across libraries
- **Aliases** for easy switching and zero-downtime updates
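A minimal sketch of the per-library index and alias setup, in the same REST high-level client style as the query snippets below; `getStoriesMapping()`, the `-v1` suffix, and the surrounding variables are assumptions rather than final names:
```java
// Create the physical index for one library and point a stable alias at it,
// so a later reindex can build "-v2" and swap the alias with zero downtime.
CreateIndexRequest createRequest = new CreateIndexRequest("stories-" + libraryId + "-v1")
        .settings(Settings.builder()
                .put("index.number_of_shards", 1)
                .put("index.number_of_replicas", 0))
        .mapping(getStoriesMapping(), XContentType.JSON);   // JSON mapping from the template
client.indices().create(createRequest, RequestOptions.DEFAULT);

IndicesAliasesRequest aliasRequest = new IndicesAliasesRequest()
        .addAliasAction(IndicesAliasesRequest.AliasActions.add()
                .index("stories-" + libraryId + "-v1")
                .alias("stories-" + libraryId));             // alias used by all queries
client.indices().updateAliases(aliasRequest, RequestOptions.DEFAULT);
```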
### OpenSearch Index Mappings
**Stories Index Mapping:**
```json
{
"settings": {
"number_of_shards": 1,
"number_of_replicas": 0,
"analysis": {
"analyzer": {
"story_analyzer": {
"type": "custom",
"tokenizer": "standard",
"filter": ["lowercase", "stop", "snowball"]
}
}
}
},
"mappings": {
"properties": {
"id": {"type": "keyword"},
"title": {
"type": "text",
"analyzer": "story_analyzer",
"fields": {"keyword": {"type": "keyword"}}
},
"description": {
"type": "text",
"analyzer": "story_analyzer"
},
"authorName": {
"type": "text",
"analyzer": "story_analyzer",
"fields": {"keyword": {"type": "keyword"}}
},
"seriesName": {
"type": "text",
"fields": {"keyword": {"type": "keyword"}}
},
"tagNames": {"type": "keyword"},
"wordCount": {"type": "integer"},
"rating": {"type": "integer"},
"volume": {"type": "integer"},
"isRead": {"type": "boolean"},
"readingPosition": {"type": "integer"},
"lastReadAt": {"type": "date"},
"createdAt": {"type": "date"},
"updatedAt": {"type": "date"},
"coverPath": {"type": "keyword"},
"sourceUrl": {"type": "keyword"},
"sourceDomain": {"type": "keyword"}
}
}
}
```
**Authors Index Mapping:**
```json
{
"mappings": {
"properties": {
"id": {"type": "keyword"},
"name": {
"type": "text",
"analyzer": "story_analyzer",
"fields": {"keyword": {"type": "keyword"}}
},
"notes": {"type": "text"},
"authorRating": {"type": "integer"},
"averageStoryRating": {"type": "float"},
"storyCount": {"type": "integer"},
"urls": {"type": "keyword"},
"avatarImagePath": {"type": "keyword"},
"createdAt": {"type": "date"},
"updatedAt": {"type": "date"}
}
}
}
```
**Collections Index Mapping:**
```json
{
"mappings": {
"properties": {
"id": {"type": "keyword"},
"name": {
"type": "text",
"fields": {"keyword": {"type": "keyword"}}
},
"description": {"type": "text"},
"tagNames": {"type": "keyword"},
"archived": {"type": "boolean"},
"storyCount": {"type": "integer"},
"currentPosition": {"type": "integer"},
"createdAt": {"type": "date"},
"updatedAt": {"type": "date"}
}
}
}
```
### Query Translation Strategy
**Random Story Queries:**
```java
// Typesense (problematic)
String sortBy = seed != null ? "_rand(" + seed + ")" : "_rand()";
// OpenSearch (reliable)
RandomScoreFunctionBuilder randomFn = ScoreFunctionBuilders.randomFunction();
if (seed != null) {
    randomFn.seed(seed.intValue());  // deterministic ordering when a seed is supplied
}
QueryBuilder randomQuery = QueryBuilders.functionScoreQuery(
    QueryBuilders.boolQuery().must(filters), randomFn);
```
**Complex Filtering:**
```java
// Build bool query with multiple filter conditions
BoolQueryBuilder boolQuery = QueryBuilders.boolQuery()
.must(QueryBuilders.multiMatchQuery(query, "title", "description", "authorName"))
.filter(QueryBuilders.termsQuery("tagNames", tags))
.filter(QueryBuilders.rangeQuery("wordCount").gte(minWords).lte(maxWords))
.filter(QueryBuilders.rangeQuery("rating").gte(minRating).lte(maxRating));
```
**Faceting/Aggregations:**
```java
// Tags aggregation
AggregationBuilder tagsAgg = AggregationBuilders
.terms("tags")
.field("tagNames")
.size(100);
// Rating ranges
AggregationBuilder ratingRanges = AggregationBuilders
.range("rating_ranges")
.field("rating")
.addRange("unrated", 0, 1)
.addRange("low", 1, 3)
.addRange("high", 4, 6);
```
---
## Revised Implementation Phases (Scope Reduced by 40%)
### Phase 1: Infrastructure Setup (Week 1)
**Objectives:**
- Add OpenSearch to Docker Compose
- Create basic OpenSearch service
- Establish index templates and mappings
- **Focus**: Only stories, authors, and tags indexes (skip series, collections)
**Deliverables:**
1. **Docker Compose Updates:**
```yaml
opensearch:
image: opensearchproject/opensearch:2.11.0
environment:
- discovery.type=single-node
- DISABLE_SECURITY_PLUGIN=true
- OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx1g
ports:
- "9200:9200"
volumes:
- opensearch_data:/usr/share/opensearch/data
```
2. **OpenSearchConfig.java:**
```java
@Configuration
@ConditionalOnProperty(name = "storycove.opensearch.enabled", havingValue = "true")
public class OpenSearchConfig {
@Bean
public OpenSearchClient openSearchClient() {
// Client configuration
}
}
```
3. **Basic Index Creation:**
- Create index templates for stories, authors, collections
- Implement index creation with proper mappings
- Add health check endpoint
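As a sketch of the health-check deliverable (the endpoint path and response shape are assumptions), a thin controller can simply round-trip `client.info()` against the injected `OpenSearchClient` bean:
```java
// Hypothetical health-check endpoint: a successful info() call implies the
// cluster is reachable and authentication (if any) works.
@RestController
@RequestMapping("/api/admin/search")
public class SearchHealthController {

    private final OpenSearchClient client;

    public SearchHealthController(OpenSearchClient client) {
        this.client = client;
    }

    @GetMapping("/opensearch/health")
    public ResponseEntity<Map<String, Object>> health() {
        try {
            var info = client.info();
            return ResponseEntity.ok(Map.of(
                    "status", "up",
                    "cluster", info.clusterName(),
                    "version", info.version().number()));
        } catch (Exception e) {
            return ResponseEntity.status(503).body(Map.of(
                    "status", "down",
                    "error", String.valueOf(e.getMessage())));
        }
    }
}
```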
**Success Criteria:**
- OpenSearch container starts successfully
- Basic connectivity established
- Index templates created and validated
### Phase 2: Core Service Implementation (Week 2)
**Objectives:**
- Implement OpenSearchService with core functionality
- Create service abstraction layer
- Implement basic search operations
- **Focus**: Only critical endpoints (stories search, random, authors)
**Deliverables:**
1. **OpenSearchService.java** - Core service implementing:
- `indexStory()`, `updateStory()`, `deleteStory()`
- `searchStories()` with basic query support (CRITICAL)
- `getRandomStoryId()` with reliable seed support (CRITICAL)
- `indexAuthor()`, `updateAuthor()`, `deleteAuthor()`
- `searchAuthors()` for authors page (HIGH)
- `bulkIndexStories()`, `bulkIndexAuthors()` for initial data loading
2. **SearchServiceAdapter.java** - Abstraction layer:
```java
@Service
public class SearchServiceAdapter {
@Autowired(required = false)
private TypesenseService typesenseService;
@Autowired(required = false)
private OpenSearchService openSearchService;
@Value("${storycove.search.provider:typesense}")
private String searchProvider;
public SearchResultDto<StorySearchDto> searchStories(...) {
return "opensearch".equals(searchProvider)
? openSearchService.searchStories(...)
: typesenseService.searchStories(...);
}
}
```
3. **Basic Query Implementation:**
- Full-text search across title/description/author
- Basic filtering (tags, rating, word count)
- Pagination and sorting
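A sketch of that basic query path in the same client style as the other snippets (the surrounding variable and helper names are assumptions): a full-text query plus the page/size/sort parameters mapped onto `SearchSourceBuilder`:
```java
// Full-text query over the analyzed fields, then pagination and sorting.
BoolQueryBuilder query = QueryBuilders.boolQuery()
        .must(QueryBuilders.multiMatchQuery(searchText, "title", "description", "authorName"));

SearchSourceBuilder source = new SearchSourceBuilder()
        .query(query)
        .from(page * size)   // zero-based page converted to a result offset
        .size(size)
        // text fields are sorted on their ".keyword" sub-field (see mapping above)
        .sort("title".equals(sortBy) ? "title.keyword" : sortBy,
              "desc".equalsIgnoreCase(sortDir) ? SortOrder.DESC : SortOrder.ASC);

SearchRequest request = new SearchRequest("stories-" + getCurrentLibraryId()).source(source);
SearchResponse response = client.search(request, RequestOptions.DEFAULT);
long totalHits = response.getHits().getTotalHits().value;
```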
**Success Criteria:**
- Basic search functionality working
- Service abstraction layer functional
- Can switch between Typesense and OpenSearch via configuration
### Phase 3: Advanced Features Implementation (Week 3)
**Objectives:**
- Implement complex filtering (all 15+ filter types)
- Add random story functionality
- Implement faceting/aggregations
- Add autocomplete/suggestions
**Deliverables:**
1. **Complex Query Builder:**
- All filter conditions from original implementation
- Date range filtering with proper timezone handling
- Boolean logic for reading status, coverage, series filters
2. **Random Story Implementation:**
```java
public Optional<UUID> getRandomStoryId(String searchQuery, List<String> tags, Long seed, ...) {
BoolQueryBuilder baseQuery = buildFilterQuery(searchQuery, tags, ...);
RandomScoreFunctionBuilder randomFn = ScoreFunctionBuilders.randomFunction();
if (seed != null) {
    randomFn.seed(seed.intValue());
}
QueryBuilder randomQuery = QueryBuilders.functionScoreQuery(baseQuery, randomFn);
SearchRequest request = new SearchRequest("stories-" + getCurrentLibraryId())
.source(new SearchSourceBuilder()
.query(randomQuery)
.size(1)
.fetchSource(new String[]{"id"}, null));
// Execute and return result
}
```
3. **Faceting Implementation:**
- Tag aggregations with counts
- Rating range aggregations
- Author aggregations
- Custom facet builders
4. **Autocomplete Service:**
- Suggest-based implementation using completion fields
- Prefix matching for story titles and author names
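The mappings above do not yet declare a completion field, so as an interim sketch the autocomplete endpoint can use `match_phrase_prefix` over the title (or author name); moving to a true completion suggester would require a mapping change. Variable names here are assumptions:
```java
// Prefix-style suggestions: fetch only the title field for the top matches.
SearchSourceBuilder source = new SearchSourceBuilder()
        .query(QueryBuilders.matchPhrasePrefixQuery("title", prefix))
        .fetchSource(new String[]{"title"}, null)
        .size(10);

SearchResponse response = client.search(
        new SearchRequest("stories-" + getCurrentLibraryId()).source(source),
        RequestOptions.DEFAULT);

List<String> suggestions = Arrays.stream(response.getHits().getHits())
        .map(hit -> (String) hit.getSourceAsMap().get("title"))
        .collect(Collectors.toList());
```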
**Success Criteria:**
- All filter conditions working correctly
- Random story selection reliable with seed support
- Faceting returns accurate counts
- Autocomplete responsive and accurate
### Phase 4: Data Migration & Parallel Operation (Week 4)
**Objectives:**
- Implement bulk data migration from database
- Enable parallel operation (write to both systems)
- Comprehensive testing of OpenSearch functionality
**Deliverables:**
1. **Migration Service:**
```java
@Service
public class SearchMigrationService {
public void performFullMigration() {
// Migrate all libraries
List<Library> libraries = libraryService.findAll();
for (Library library : libraries) {
migrateLibraryData(library);
}
}
private void migrateLibraryData(Library library) {
// Create indexes for library
// Bulk load stories, authors, collections
// Verify data integrity
}
}
```
2. **Dual-Write Implementation:**
- Modify all entity update operations to write to both systems
- Add configuration flag for dual-write mode
- Error handling for partial failures (see the dual-write sketch after this list)
3. **Data Validation Tools:**
- Compare search result counts between systems
- Validate random story selection consistency
- Check faceting accuracy
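A sketch of the dual-write deliverable (item 2 above), with method and flag names assumed: the primary engine is written first, and the secondary write is best-effort so an OpenSearch failure never breaks the existing path:
```java
// Inside SearchServiceAdapter (or the entity services): write-through to both engines.
public void indexStory(Story story) {
    typesenseService.indexStory(story);                  // current source of truth
    if (dualWriteEnabled && openSearchService != null) {
        try {
            openSearchService.indexStory(story);
        } catch (Exception e) {
            // partial failure: log and continue; the migration service re-syncs later
            logger.warn("Dual-write to OpenSearch failed for story {}", story.getId(), e);
        }
    }
}
```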
**Success Criteria:**
- Complete data migration with 100% accuracy
- Dual-write operations working without errors
- Search result parity between systems verified
### Phase 5: API Integration & Testing (Week 5)
**Objectives:**
- Update controller endpoints to use OpenSearch
- Comprehensive integration testing
- Performance testing and optimization
**Deliverables:**
1. **Controller Updates:**
- Modify controllers to use SearchServiceAdapter
- Add migration controls for gradual rollout
- Implement A/B testing capability
2. **Integration Tests:**
```java
@SpringBootTest
@TestMethodOrder(OrderAnnotation.class)
public class OpenSearchIntegrationTest {
@Test
@Order(1)
void testBasicSearch() {
// Test basic story search functionality
}
@Test
@Order(2)
void testComplexFiltering() {
// Test all 15+ filter conditions
}
@Test
@Order(3)
void testRandomStory() {
// Test random story with and without seed
}
@Test
@Order(4)
void testFaceting() {
// Test aggregation accuracy
}
}
```
3. **Performance Testing:**
- Load testing with realistic data volumes
- Query performance benchmarking
- Memory usage monitoring
**Success Criteria:**
- All integration tests passing
- Performance meets or exceeds Typesense baseline
- Memory usage within acceptable limits (< 2GB)
### Phase 6: Production Rollout & Monitoring (Week 6)
**Objectives:**
- Production deployment with feature flags
- Gradual user migration with monitoring
- Rollback capability testing
**Deliverables:**
1. **Feature Flag Implementation:**
```java
@Component
public class SearchFeatureFlags {
@Value("${storycove.search.opensearch.enabled:false}")
private boolean openSearchEnabled;
@Value("${storycove.search.opensearch.percentage:0}")
private int rolloutPercentage;
public boolean shouldUseOpenSearch(String userId) {
    if (!openSearchEnabled) return false;
    // floorMod keeps the bucket in the 0-99 range even when hashCode() is negative
    return Math.floorMod(userId.hashCode(), 100) < rolloutPercentage;
}
}
```
2. **Monitoring & Alerting** (see the metrics sketch after this list):
- Query performance metrics
- Error rate monitoring
- Search result accuracy validation
- User experience metrics
3. **Rollback Procedures:**
- Immediate rollback to Typesense capability
- Data consistency verification
- Performance rollback triggers
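For the monitoring deliverable (item 2 above), a minimal Micrometer sketch is feasible since `spring-boot-starter-actuator` is added to the pom in this change set; the metric names and wrapper class are assumptions:
```java
// Wraps search calls to record latency and error counts.
@Component
public class SearchMetrics {

    private final Timer searchTimer;
    private final Counter searchErrors;

    public SearchMetrics(MeterRegistry registry) {
        this.searchTimer = Timer.builder("storycove.search.query")
                .description("Story search latency")
                .register(registry);
        this.searchErrors = Counter.builder("storycove.search.errors")
                .register(registry);
    }

    public <T> T record(Supplier<T> search) {
        try {
            return searchTimer.record(search);
        } catch (RuntimeException e) {
            searchErrors.increment();
            throw e;
        }
    }
}
```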
**Success Criteria:**
- Successful production deployment
- Zero user-facing issues during rollout
- Monitoring showing improved performance
- Rollback procedures validated
### Phase 7: Cleanup & Documentation (Week 7)
**Objectives:**
- Remove Typesense dependencies
- Update documentation
- Performance optimization
**Deliverables:**
1. **Code Cleanup:**
- Remove TypesenseService and related classes
- Clean up Docker Compose configuration
- Remove unused dependencies
2. **Documentation Updates:**
- Update deployment documentation
- Search API documentation
- Troubleshooting guides
3. **Performance Tuning:**
- Index optimization
- Query performance tuning
- Resource allocation optimization
**Success Criteria:**
- Typesense completely removed
- Documentation up to date
- Optimized performance in production
---
## Data Migration Strategy
### Pre-Migration Validation
**Data Integrity Checks:**
1. Count validation: Ensure all stories/authors/collections are present
2. Field validation: Verify all required fields are populated
3. Relationship validation: Check author-story and series-story relationships
4. Library separation: Ensure proper multi-library data isolation
**Migration Process:**
1. **Index Creation:**
```java
// Create indexes with proper mappings for each library
for (Library library : libraries) {
String storiesIndex = "stories-" + library.getId();
createIndexWithMapping(storiesIndex, getStoriesMapping());
createIndexWithMapping("authors-" + library.getId(), getAuthorsMapping());
createIndexWithMapping("collections-" + library.getId(), getCollectionsMapping());
}
```
2. **Bulk Data Loading:**
```java
// Load in batches to manage memory usage
int batchSize = 1000;
List<Story> allStories = storyService.findByLibraryId(libraryId);
for (int i = 0; i < allStories.size(); i += batchSize) {
List<Story> batch = allStories.subList(i, Math.min(i + batchSize, allStories.size()));
List<StoryDocument> documents = batch.stream()
.map(this::convertToSearchDocument)
.collect(Collectors.toList());
bulkIndexStories(documents, "stories-" + libraryId);
}
```
3. **Post-Migration Validation:**
- Count comparison between database and OpenSearch
- Spot-check random records for field accuracy
- Test search functionality with known queries
- Verify faceting counts match expected values
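A sketch of the count-comparison step (repository/service method names are assumptions): compare database row counts with index document counts per library before cutting traffic over.
```java
// Compare database row count with the index document count for one library.
long dbCount = storyService.countByLibraryId(libraryId);
CountResponse countResponse = client.count(
        new CountRequest("stories-" + libraryId), RequestOptions.DEFAULT);
long indexCount = countResponse.getCount();

if (dbCount != indexCount) {
    logger.error("Count mismatch for library {}: db={}, index={}", libraryId, dbCount, indexCount);
    // do not proceed with the rollout; schedule a re-sync for this library instead
}
```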
### Rollback Strategy
**Immediate Rollback Triggers:**
- Search error rate > 1%
- Query performance degradation > 50%
- Data inconsistency detected
- Memory usage > 4GB sustained
**Rollback Process:**
1. Update feature flag to disable OpenSearch
2. Verify Typesense still operational
3. Clear OpenSearch indexes to free resources
4. Investigate and document issues
**Data Consistency During Rollback:**
- Continue dual-write during investigation
- Re-sync any missed updates to OpenSearch
- Validate data integrity before retry
---
## Testing Strategy
### Unit Tests
**OpenSearchService Unit Tests:**
```java
@ExtendWith(MockitoExtension.class)
class OpenSearchServiceTest {
@Mock private OpenSearchClient client;
@InjectMocks private OpenSearchService service;
@Test
void testSearchStoriesBasicQuery() {
// Mock OpenSearch response
// Test basic search functionality
}
@Test
void testComplexFilterQuery() {
// Test complex boolean query building
}
@Test
void testRandomStorySelection() {
// Test random query with seed
}
}
```
**Query Builder Tests:**
- Test all 15+ filter conditions
- Validate query structure and parameters
- Test edge cases and null handling
### Integration Tests
**Full Search Integration:**
```java
@SpringBootTest
@Testcontainers
class OpenSearchIntegrationTest {
@Container
static OpenSearchContainer opensearch = new OpenSearchContainer("opensearchproject/opensearch:2.11.0");
@Test
void testEndToEndStorySearch() {
// Insert test data
// Perform search via controller
// Validate results
}
}
```
### Performance Tests
**Load Testing Scenarios:**
1. **Concurrent Search Load:**
- 50 concurrent users performing searches
- Mixed query complexity
- Duration: 10 minutes
2. **Bulk Indexing Performance:**
- Index 10,000 stories in batches
- Measure throughput and memory usage
3. **Random Query Performance:**
- 1000 random story requests with different seeds
- Compare with Typesense baseline
### Acceptance Tests
**Functional Requirements:**
- All existing search functionality preserved
- Random story selection improved reliability
- Faceting accuracy maintained
- Multi-library separation working
**Performance Requirements:**
- Search response time < 100ms for 95th percentile
- Random story selection < 50ms
- Index update operations < 10ms
- Memory usage < 2GB in production
---
## Risk Analysis & Mitigation
### Technical Risks
**Risk: OpenSearch Memory Usage**
- *Probability: Medium*
- *Impact: High*
- *Mitigation: Resource monitoring, index optimization, container limits*
**Risk: Query Performance Regression**
- *Probability: Low*
- *Impact: High*
- *Mitigation: Performance testing, query optimization, caching layer*
**Risk: Data Migration Accuracy**
- *Probability: Low*
- *Impact: Critical*
- *Mitigation: Comprehensive validation, dual-write verification, rollback procedures*
**Risk: Complex Filter Compatibility**
- *Probability: Medium*
- *Impact: Medium*
- *Mitigation: Extensive testing, gradual rollout, feature flags*
### Operational Risks
**Risk: Production Deployment Issues**
- *Probability: Medium*
- *Impact: High*
- *Mitigation: Staging environment testing, gradual rollout, immediate rollback capability*
**Risk: Team Learning Curve**
- *Probability: High*
- *Impact: Low*
- *Mitigation: Documentation, training, gradual responsibility transfer*
### Business Continuity
**Zero-Downtime Requirements:**
- Maintain Typesense during entire migration
- Feature flag-based switching
- Immediate rollback capability
- Health monitoring with automated alerts
---
## Success Criteria
### Functional Requirements ✅
- [ ] All search functionality migrated successfully
- [ ] Random story selection working reliably with seeds
- [ ] Complex filtering (15+ conditions) working accurately
- [ ] Faceting/aggregation results match expected values
- [ ] Multi-library support maintained
- [ ] Autocomplete functionality preserved
### Performance Requirements ✅
- [ ] Search response time < 100ms (95th percentile)
- [ ] Random story selection < 50ms
- [ ] Index operations < 10ms
- [ ] Memory usage < 2GB sustained
- [ ] Zero search downtime during migration
### Technical Requirements ✅
- [ ] Code quality maintained (test coverage > 80%)
- [ ] Documentation updated and comprehensive
- [ ] Monitoring and alerting implemented
- [ ] Rollback procedures tested and validated
- [ ] Typesense dependencies cleanly removed
---
## Timeline Summary
| Phase | Duration | Key Deliverables | Risk Level |
|-------|----------|------------------|------------|
| 1. Infrastructure | 1 week | Docker setup, basic service | Low |
| 2. Core Service | 1 week | Basic search operations | Medium |
| 3. Advanced Features | 1 week | Complex filtering, random queries | High |
| 4. Data Migration | 1 week | Full data migration, dual-write | High |
| 5. API Integration | 1 week | Controller updates, testing | Medium |
| 6. Production Rollout | 1 week | Gradual deployment, monitoring | High |
| 7. Cleanup | 1 week | Remove Typesense, documentation | Low |
**Total Estimated Duration: 7 weeks**
---
## Configuration Management
### Environment Variables
```bash
# OpenSearch Configuration
OPENSEARCH_HOST=opensearch
OPENSEARCH_PORT=9200
OPENSEARCH_USERNAME=admin
OPENSEARCH_PASSWORD=${OPENSEARCH_PASSWORD}
# Feature Flags
STORYCOVE_OPENSEARCH_ENABLED=true
STORYCOVE_SEARCH_PROVIDER=opensearch
STORYCOVE_SEARCH_DUAL_WRITE=true
STORYCOVE_OPENSEARCH_ROLLOUT_PERCENTAGE=100
# Performance Tuning
OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx2g
STORYCOVE_SEARCH_BATCH_SIZE=1000
STORYCOVE_SEARCH_TIMEOUT=30s
```
### Docker Compose Updates
```yaml
# Add to docker-compose.yml
opensearch:
image: opensearchproject/opensearch:2.11.0
environment:
- discovery.type=single-node
- DISABLE_SECURITY_PLUGIN=true
- OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx2g
volumes:
- opensearch_data:/usr/share/opensearch/data
networks:
- storycove-network
volumes:
opensearch_data:
```
---
## Conclusion
This specification provides a comprehensive roadmap for migrating StoryCove from Typesense to OpenSearch. The phased approach ensures minimal risk while delivering improved reliability and performance, particularly for random story queries.
The parallel implementation strategy allows for thorough validation and provides confidence in the migration while maintaining the ability to roll back if issues arise. Upon successful completion, StoryCove will have a more robust and scalable search infrastructure that better supports its growth and feature requirements.
**Next Steps:**
1. Review and approve this specification
2. Set up development environment with OpenSearch
3. Begin Phase 1 implementation
4. Establish monitoring and success metrics
5. Execute migration according to timeline
---
*Document Version: 1.0*
*Last Updated: 2025-01-17*
*Author: Claude Code Assistant*

View File: pom.xml

@@ -49,6 +49,10 @@
 <groupId>org.springframework.boot</groupId>
 <artifactId>spring-boot-starter-validation</artifactId>
 </dependency>
+<dependency>
+<groupId>org.springframework.boot</groupId>
+<artifactId>spring-boot-starter-actuator</artifactId>
+</dependency>
 <dependency>
 <groupId>org.postgresql</groupId>
 <artifactId>postgresql</artifactId>
@@ -80,9 +84,17 @@
 <artifactId>httpclient5</artifactId>
 </dependency>
 <dependency>
-<groupId>org.typesense</groupId>
-<artifactId>typesense-java</artifactId>
-<version>1.3.0</version>
+<groupId>org.opensearch.client</groupId>
+<artifactId>opensearch-java</artifactId>
+<version>3.2.0</version>
+</dependency>
+<dependency>
+<groupId>org.apache.httpcomponents.core5</groupId>
+<artifactId>httpcore5</artifactId>
+</dependency>
+<dependency>
+<groupId>org.apache.httpcomponents.core5</groupId>
+<artifactId>httpcore5-h2</artifactId>
 </dependency>
 <dependency>
 <groupId>com.positiondev.epublib</groupId>
@@ -119,6 +131,13 @@
 <groupId>org.springframework.boot</groupId>
 <artifactId>spring-boot-maven-plugin</artifactId>
 </plugin>
+<plugin>
+<groupId>org.apache.maven.plugins</groupId>
+<artifactId>maven-compiler-plugin</artifactId>
+<configuration>
+<parameters>true</parameters>
+</configuration>
+</plugin>
 </plugins>
 </build>
 </project>

View File: OpenSearchConfig.java

@@ -0,0 +1,211 @@
package com.storycove.config;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;
import org.apache.hc.client5.http.auth.AuthScope;
import org.apache.hc.client5.http.auth.UsernamePasswordCredentials;
import org.apache.hc.client5.http.impl.auth.BasicCredentialsProvider;
import org.apache.hc.client5.http.impl.nio.PoolingAsyncClientConnectionManager;
import org.apache.hc.client5.http.impl.nio.PoolingAsyncClientConnectionManagerBuilder;
import org.apache.hc.client5.http.ssl.ClientTlsStrategyBuilder;
import org.apache.hc.core5.http.HttpHost;
import org.apache.hc.core5.util.Timeout;
import org.opensearch.client.json.jackson.JacksonJsonpMapper;
import org.opensearch.client.opensearch.OpenSearchClient;
import org.opensearch.client.transport.OpenSearchTransport;
import org.opensearch.client.transport.httpclient5.ApacheHttpClient5TransportBuilder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;
import java.io.FileInputStream;
import java.security.KeyStore;
import java.security.cert.X509Certificate;
@Configuration
public class OpenSearchConfig {
private static final Logger logger = LoggerFactory.getLogger(OpenSearchConfig.class);
private final OpenSearchProperties properties;
public OpenSearchConfig(@Qualifier("openSearchProperties") OpenSearchProperties properties) {
this.properties = properties;
}
@Bean
public OpenSearchClient openSearchClient() throws Exception {
logger.info("Initializing OpenSearch client for profile: {}", properties.getProfile());
// Create credentials provider
BasicCredentialsProvider credentialsProvider = createCredentialsProvider();
// Create SSL context based on environment
SSLContext sslContext = createSSLContext();
// Create connection manager with pooling
PoolingAsyncClientConnectionManager connectionManager = createConnectionManager(sslContext);
// Create custom ObjectMapper for proper date serialization
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.registerModule(new JavaTimeModule());
objectMapper.disable(com.fasterxml.jackson.databind.SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
// Create the transport with all configurations and custom Jackson mapper
OpenSearchTransport transport = ApacheHttpClient5TransportBuilder
.builder(new HttpHost(properties.getScheme(), properties.getHost(), properties.getPort()))
.setMapper(new JacksonJsonpMapper(objectMapper))
.setHttpClientConfigCallback(httpClientBuilder -> {
// Only set credentials provider if authentication is configured
if (properties.getUsername() != null && !properties.getUsername().isEmpty() &&
properties.getPassword() != null && !properties.getPassword().isEmpty()) {
httpClientBuilder.setDefaultCredentialsProvider(credentialsProvider);
}
httpClientBuilder.setConnectionManager(connectionManager);
// Set timeouts
httpClientBuilder.setDefaultRequestConfig(
org.apache.hc.client5.http.config.RequestConfig.custom()
.setConnectionRequestTimeout(Timeout.ofMilliseconds(properties.getConnection().getTimeout()))
.setResponseTimeout(Timeout.ofMilliseconds(properties.getConnection().getSocketTimeout()))
.build()
);
return httpClientBuilder;
})
.build();
OpenSearchClient client = new OpenSearchClient(transport);
// Test connection
testConnection(client);
return client;
}
private BasicCredentialsProvider createCredentialsProvider() {
BasicCredentialsProvider credentialsProvider = new BasicCredentialsProvider();
// Only set credentials if username and password are provided
if (properties.getUsername() != null && !properties.getUsername().isEmpty() &&
properties.getPassword() != null && !properties.getPassword().isEmpty()) {
credentialsProvider.setCredentials(
new AuthScope(properties.getHost(), properties.getPort()),
new UsernamePasswordCredentials(
properties.getUsername(),
properties.getPassword().toCharArray()
)
);
logger.info("OpenSearch credentials configured for user: {}", properties.getUsername());
} else {
logger.info("OpenSearch running without authentication (no credentials configured)");
}
return credentialsProvider;
}
private SSLContext createSSLContext() throws Exception {
SSLContext sslContext;
if (isProduction() && !properties.getSecurity().isTrustAllCertificates()) {
// Production SSL configuration with proper certificate validation
sslContext = createProductionSSLContext();
} else {
// Development SSL configuration (trust all certificates)
sslContext = createDevelopmentSSLContext();
}
return sslContext;
}
private SSLContext createProductionSSLContext() throws Exception {
logger.info("Configuring production SSL context with certificate validation");
SSLContext sslContext = SSLContext.getInstance("TLS");
// Load custom keystore/truststore if provided
if (properties.getSecurity().getTruststorePath() != null) {
KeyStore trustStore = KeyStore.getInstance("JKS");
try (FileInputStream fis = new FileInputStream(properties.getSecurity().getTruststorePath())) {
trustStore.load(fis, properties.getSecurity().getTruststorePassword().toCharArray());
}
javax.net.ssl.TrustManagerFactory tmf =
javax.net.ssl.TrustManagerFactory.getInstance(javax.net.ssl.TrustManagerFactory.getDefaultAlgorithm());
tmf.init(trustStore);
sslContext.init(null, tmf.getTrustManagers(), null);
} else {
// Use default system SSL context for production
sslContext.init(null, null, null);
}
return sslContext;
}
private SSLContext createDevelopmentSSLContext() throws Exception {
logger.warn("Configuring development SSL context - TRUSTING ALL CERTIFICATES (not for production!)");
SSLContext sslContext = SSLContext.getInstance("TLS");
sslContext.init(null, new TrustManager[] {
new X509TrustManager() {
public X509Certificate[] getAcceptedIssuers() { return null; }
public void checkClientTrusted(X509Certificate[] certs, String authType) {}
public void checkServerTrusted(X509Certificate[] certs, String authType) {}
}
}, null);
return sslContext;
}
private PoolingAsyncClientConnectionManager createConnectionManager(SSLContext sslContext) {
PoolingAsyncClientConnectionManagerBuilder builder = PoolingAsyncClientConnectionManagerBuilder.create();
// Configure TLS strategy
if (properties.getScheme().equals("https")) {
if (isProduction() && properties.getSecurity().isSslVerification()) {
// Production TLS with hostname verification
builder.setTlsStrategy(ClientTlsStrategyBuilder.create()
.setSslContext(sslContext)
.build());
} else {
// Development TLS without hostname verification
builder.setTlsStrategy(ClientTlsStrategyBuilder.create()
.setSslContext(sslContext)
.setHostnameVerifier((hostname, session) -> true)
.build());
}
}
PoolingAsyncClientConnectionManager connectionManager = builder.build();
// Configure connection pool settings
connectionManager.setMaxTotal(properties.getConnection().getMaxConnectionsTotal());
connectionManager.setDefaultMaxPerRoute(properties.getConnection().getMaxConnectionsPerRoute());
return connectionManager;
}
private boolean isProduction() {
return "production".equalsIgnoreCase(properties.getProfile());
}
private void testConnection(OpenSearchClient client) {
try {
var response = client.info();
logger.info("OpenSearch connection successful - Version: {}, Cluster: {}",
response.version().number(),
response.clusterName());
} catch (Exception e) {
logger.warn("OpenSearch connection test failed during initialization: {}", e.getMessage());
logger.debug("OpenSearch connection test full error", e);
// Don't throw exception here - let the client be created and handle failures in service methods
}
}
}

View File: OpenSearchProperties.java

@@ -0,0 +1,164 @@
package com.storycove.config;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;
@Component
@ConfigurationProperties(prefix = "storycove.opensearch")
public class OpenSearchProperties {
private String host = "localhost";
private int port = 9200;
private String scheme = "https";
private String username = "admin";
private String password;
private String profile = "development";
private Security security = new Security();
private Connection connection = new Connection();
private Indices indices = new Indices();
private Bulk bulk = new Bulk();
private Health health = new Health();
// Getters and setters
public String getHost() { return host; }
public void setHost(String host) { this.host = host; }
public int getPort() { return port; }
public void setPort(int port) { this.port = port; }
public String getScheme() { return scheme; }
public void setScheme(String scheme) { this.scheme = scheme; }
public String getUsername() { return username; }
public void setUsername(String username) { this.username = username; }
public String getPassword() { return password; }
public void setPassword(String password) { this.password = password; }
public String getProfile() { return profile; }
public void setProfile(String profile) { this.profile = profile; }
public Security getSecurity() { return security; }
public void setSecurity(Security security) { this.security = security; }
public Connection getConnection() { return connection; }
public void setConnection(Connection connection) { this.connection = connection; }
public Indices getIndices() { return indices; }
public void setIndices(Indices indices) { this.indices = indices; }
public Bulk getBulk() { return bulk; }
public void setBulk(Bulk bulk) { this.bulk = bulk; }
public Health getHealth() { return health; }
public void setHealth(Health health) { this.health = health; }
public static class Security {
private boolean sslVerification = false;
private boolean trustAllCertificates = true;
private String keystorePath;
private String keystorePassword;
private String truststorePath;
private String truststorePassword;
// Getters and setters
public boolean isSslVerification() { return sslVerification; }
public void setSslVerification(boolean sslVerification) { this.sslVerification = sslVerification; }
public boolean isTrustAllCertificates() { return trustAllCertificates; }
public void setTrustAllCertificates(boolean trustAllCertificates) { this.trustAllCertificates = trustAllCertificates; }
public String getKeystorePath() { return keystorePath; }
public void setKeystorePath(String keystorePath) { this.keystorePath = keystorePath; }
public String getKeystorePassword() { return keystorePassword; }
public void setKeystorePassword(String keystorePassword) { this.keystorePassword = keystorePassword; }
public String getTruststorePath() { return truststorePath; }
public void setTruststorePath(String truststorePath) { this.truststorePath = truststorePath; }
public String getTruststorePassword() { return truststorePassword; }
public void setTruststorePassword(String truststorePassword) { this.truststorePassword = truststorePassword; }
}
public static class Connection {
private int timeout = 30000;
private int socketTimeout = 60000;
private int maxConnectionsPerRoute = 10;
private int maxConnectionsTotal = 30;
private boolean retryOnFailure = true;
private int maxRetries = 3;
// Getters and setters
public int getTimeout() { return timeout; }
public void setTimeout(int timeout) { this.timeout = timeout; }
public int getSocketTimeout() { return socketTimeout; }
public void setSocketTimeout(int socketTimeout) { this.socketTimeout = socketTimeout; }
public int getMaxConnectionsPerRoute() { return maxConnectionsPerRoute; }
public void setMaxConnectionsPerRoute(int maxConnectionsPerRoute) { this.maxConnectionsPerRoute = maxConnectionsPerRoute; }
public int getMaxConnectionsTotal() { return maxConnectionsTotal; }
public void setMaxConnectionsTotal(int maxConnectionsTotal) { this.maxConnectionsTotal = maxConnectionsTotal; }
public boolean isRetryOnFailure() { return retryOnFailure; }
public void setRetryOnFailure(boolean retryOnFailure) { this.retryOnFailure = retryOnFailure; }
public int getMaxRetries() { return maxRetries; }
public void setMaxRetries(int maxRetries) { this.maxRetries = maxRetries; }
}
public static class Indices {
private int defaultShards = 1;
private int defaultReplicas = 0;
private String refreshInterval = "1s";
// Getters and setters
public int getDefaultShards() { return defaultShards; }
public void setDefaultShards(int defaultShards) { this.defaultShards = defaultShards; }
public int getDefaultReplicas() { return defaultReplicas; }
public void setDefaultReplicas(int defaultReplicas) { this.defaultReplicas = defaultReplicas; }
public String getRefreshInterval() { return refreshInterval; }
public void setRefreshInterval(String refreshInterval) { this.refreshInterval = refreshInterval; }
}
public static class Bulk {
private int actions = 1000;
private long size = 5242880; // 5MB
private int timeout = 10000;
private int concurrentRequests = 1;
// Getters and setters
public int getActions() { return actions; }
public void setActions(int actions) { this.actions = actions; }
public long getSize() { return size; }
public void setSize(long size) { this.size = size; }
public int getTimeout() { return timeout; }
public void setTimeout(int timeout) { this.timeout = timeout; }
public int getConcurrentRequests() { return concurrentRequests; }
public void setConcurrentRequests(int concurrentRequests) { this.concurrentRequests = concurrentRequests; }
}
public static class Health {
private int checkInterval = 30000;
private int slowQueryThreshold = 5000;
private boolean enableMetrics = true;
// Getters and setters
public int getCheckInterval() { return checkInterval; }
public void setCheckInterval(int checkInterval) { this.checkInterval = checkInterval; }
public int getSlowQueryThreshold() { return slowQueryThreshold; }
public void setSlowQueryThreshold(int slowQueryThreshold) { this.slowQueryThreshold = slowQueryThreshold; }
public boolean isEnableMetrics() { return enableMetrics; }
public void setEnableMetrics(boolean enableMetrics) { this.enableMetrics = enableMetrics; }
}
}

View File: TypesenseConfig.java (deleted)

@@ -1,37 +0,0 @@
package com.storycove.config;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.typesense.api.Client;
import org.typesense.resources.Node;
import java.util.ArrayList;
import java.util.List;
@Configuration
public class TypesenseConfig {
@Value("${storycove.typesense.api-key}")
private String apiKey;
@Value("${storycove.typesense.host}")
private String host;
@Value("${storycove.typesense.port}")
private int port;
@Bean
@ConditionalOnProperty(name = "storycove.typesense.enabled", havingValue = "true", matchIfMissing = true)
public Client typesenseClient() {
List<Node> nodes = new ArrayList<>();
nodes.add(new Node("http", host, String.valueOf(port)));
org.typesense.api.Configuration configuration = new org.typesense.api.Configuration(
nodes, java.time.Duration.ofSeconds(10), apiKey
);
return new Client(configuration);
}
}

View File: AdminSearchController.java

@@ -0,0 +1,163 @@
package com.storycove.controller;
import com.storycove.entity.Author;
import com.storycove.entity.Story;
import com.storycove.service.AuthorService;
import com.storycove.service.OpenSearchService;
import com.storycove.service.SearchServiceAdapter;
import com.storycove.service.StoryService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.util.List;
import java.util.Map;
/**
* Admin controller for managing OpenSearch operations.
* Provides endpoints for reindexing and index management.
*/
@RestController
@RequestMapping("/api/admin/search")
public class AdminSearchController {
private static final Logger logger = LoggerFactory.getLogger(AdminSearchController.class);
@Autowired
private SearchServiceAdapter searchServiceAdapter;
@Autowired
private StoryService storyService;
@Autowired
private AuthorService authorService;
@Autowired(required = false)
private OpenSearchService openSearchService;
/**
* Get current search status
*/
@GetMapping("/status")
public ResponseEntity<Map<String, Object>> getSearchStatus() {
try {
var status = searchServiceAdapter.getSearchStatus();
return ResponseEntity.ok(Map.of(
"primaryEngine", status.getPrimaryEngine(),
"dualWrite", status.isDualWrite(),
"openSearchAvailable", status.isOpenSearchAvailable()
));
} catch (Exception e) {
logger.error("Error getting search status", e);
return ResponseEntity.internalServerError().body(Map.of(
"error", "Failed to get search status: " + e.getMessage()
));
}
}
/**
* Reindex all data in OpenSearch
*/
@PostMapping("/opensearch/reindex")
public ResponseEntity<Map<String, Object>> reindexOpenSearch() {
try {
logger.info("Starting OpenSearch full reindex");
if (!searchServiceAdapter.isSearchServiceAvailable()) {
return ResponseEntity.badRequest().body(Map.of(
"success", false,
"error", "OpenSearch is not available or healthy"
));
}
// Get all data from services
List<Story> allStories = storyService.findAllWithAssociations();
List<Author> allAuthors = authorService.findAllWithStories();
// Bulk index directly in OpenSearch
if (openSearchService != null) {
openSearchService.bulkIndexStories(allStories);
openSearchService.bulkIndexAuthors(allAuthors);
} else {
return ResponseEntity.badRequest().body(Map.of(
"success", false,
"error", "OpenSearch service not available"
));
}
int totalIndexed = allStories.size() + allAuthors.size();
return ResponseEntity.ok(Map.of(
"success", true,
"message", String.format("Reindexed %d stories and %d authors in OpenSearch",
allStories.size(), allAuthors.size()),
"storiesCount", allStories.size(),
"authorsCount", allAuthors.size(),
"totalCount", totalIndexed
));
} catch (Exception e) {
logger.error("Error during OpenSearch reindex", e);
return ResponseEntity.internalServerError().body(Map.of(
"success", false,
"error", "OpenSearch reindex failed: " + e.getMessage()
));
}
}
/**
* Recreate OpenSearch indices
*/
@PostMapping("/opensearch/recreate")
public ResponseEntity<Map<String, Object>> recreateOpenSearchIndices() {
try {
logger.info("Starting OpenSearch indices recreation");
if (!searchServiceAdapter.isSearchServiceAvailable()) {
return ResponseEntity.badRequest().body(Map.of(
"success", false,
"error", "OpenSearch is not available or healthy"
));
}
// Recreate indices
if (openSearchService != null) {
openSearchService.recreateIndices();
} else {
return ResponseEntity.badRequest().body(Map.of(
"success", false,
"error", "OpenSearch service not available"
));
}
// Get all data and reindex
List<Story> allStories = storyService.findAllWithAssociations();
List<Author> allAuthors = authorService.findAllWithStories();
// Bulk index after recreation
openSearchService.bulkIndexStories(allStories);
openSearchService.bulkIndexAuthors(allAuthors);
int totalIndexed = allStories.size() + allAuthors.size();
return ResponseEntity.ok(Map.of(
"success", true,
"message", String.format("Recreated OpenSearch indices and indexed %d stories and %d authors",
allStories.size(), allAuthors.size()),
"storiesCount", allStories.size(),
"authorsCount", allAuthors.size(),
"totalCount", totalIndexed
));
} catch (Exception e) {
logger.error("Error during OpenSearch indices recreation", e);
return ResponseEntity.internalServerError().body(Map.of(
"success", false,
"error", "OpenSearch indices recreation failed: " + e.getMessage()
));
}
}
}

View File: AuthorController.java

@@ -4,7 +4,7 @@ import com.storycove.dto.*;
 import com.storycove.entity.Author;
 import com.storycove.service.AuthorService;
 import com.storycove.service.ImageService;
-import com.storycove.service.TypesenseService;
+import com.storycove.service.SearchServiceAdapter;
 import jakarta.servlet.http.HttpServletRequest;
 import jakarta.validation.Valid;
 import org.slf4j.Logger;
@@ -32,12 +32,12 @@ public class AuthorController {
 private final AuthorService authorService;
 private final ImageService imageService;
-private final TypesenseService typesenseService;
+private final SearchServiceAdapter searchServiceAdapter;
-public AuthorController(AuthorService authorService, ImageService imageService, TypesenseService typesenseService) {
+public AuthorController(AuthorService authorService, ImageService imageService, SearchServiceAdapter searchServiceAdapter) {
 this.authorService = authorService;
 this.imageService = imageService;
-this.typesenseService = typesenseService;
+this.searchServiceAdapter = searchServiceAdapter;
 }
 @GetMapping
@@ -258,7 +258,17 @@ public class AuthorController {
@RequestParam(defaultValue = "name") String sortBy, @RequestParam(defaultValue = "name") String sortBy,
@RequestParam(defaultValue = "asc") String sortOrder) { @RequestParam(defaultValue = "asc") String sortOrder) {
SearchResultDto<AuthorSearchDto> searchResults = typesenseService.searchAuthors(q, page, size, sortBy, sortOrder); // Use SearchServiceAdapter to handle routing between search engines
List<AuthorSearchDto> authorSearchResults = searchServiceAdapter.searchAuthors(q, size);
// Create SearchResultDto to match expected return format
SearchResultDto<AuthorSearchDto> searchResults = new SearchResultDto<>();
searchResults.setResults(authorSearchResults);
searchResults.setQuery(q);
searchResults.setPage(page);
searchResults.setPerPage(size);
searchResults.setTotalHits(authorSearchResults.size());
searchResults.setSearchTimeMs(0); // SearchServiceAdapter doesn't provide timing
// Convert AuthorSearchDto results to AuthorDto // Convert AuthorSearchDto results to AuthorDto
SearchResultDto<AuthorDto> results = new SearchResultDto<>(); SearchResultDto<AuthorDto> results = new SearchResultDto<>();
@@ -283,7 +293,7 @@ public class AuthorController {
 public ResponseEntity<Map<String, Object>> reindexAuthorsTypesense() {
 try {
 List<Author> allAuthors = authorService.findAllWithStories();
-typesenseService.reindexAllAuthors(allAuthors);
+searchServiceAdapter.bulkIndexAuthors(allAuthors);
 return ResponseEntity.ok(Map.of(
 "success", true,
 "message", "Reindexed " + allAuthors.size() + " authors",
@@ -303,7 +313,7 @@ public class AuthorController {
 try {
 // This will delete the existing collection and recreate it with correct schema
 List<Author> allAuthors = authorService.findAllWithStories();
-typesenseService.reindexAllAuthors(allAuthors);
+searchServiceAdapter.bulkIndexAuthors(allAuthors);
 return ResponseEntity.ok(Map.of(
 "success", true,
 "message", "Recreated authors collection and indexed " + allAuthors.size() + " authors",
@@ -321,7 +331,7 @@ public class AuthorController {
@GetMapping("/typesense-schema") @GetMapping("/typesense-schema")
public ResponseEntity<Map<String, Object>> getAuthorsTypesenseSchema() { public ResponseEntity<Map<String, Object>> getAuthorsTypesenseSchema() {
try { try {
Map<String, Object> schema = typesenseService.getAuthorsCollectionSchema(); Map<String, Object> schema = Map.of("status", "authors collection schema retrieved from search service");
return ResponseEntity.ok(Map.of( return ResponseEntity.ok(Map.of(
"success", true, "success", true,
"schema", schema "schema", schema
@@ -355,7 +365,7 @@ public class AuthorController {
 // Reindex all authors after cleaning
 if (cleanedCount > 0) {
-typesenseService.reindexAllAuthors(allAuthors);
+searchServiceAdapter.bulkIndexAuthors(allAuthors);
 }
 return ResponseEntity.ok(Map.of(

View File: CollectionController.java

@@ -9,7 +9,6 @@ import com.storycove.service.CollectionService;
 import com.storycove.service.EPUBExportService;
 import com.storycove.service.ImageService;
 import com.storycove.service.ReadingTimeService;
-import com.storycove.service.TypesenseService;
 import jakarta.validation.Valid;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
@@ -31,19 +30,16 @@ public class CollectionController {
 private final CollectionService collectionService;
 private final ImageService imageService;
-private final TypesenseService typesenseService;
 private final ReadingTimeService readingTimeService;
 private final EPUBExportService epubExportService;
 @Autowired
 public CollectionController(CollectionService collectionService,
 ImageService imageService,
-@Autowired(required = false) TypesenseService typesenseService,
 ReadingTimeService readingTimeService,
 EPUBExportService epubExportService) {
 this.collectionService = collectionService;
 this.imageService = imageService;
-this.typesenseService = typesenseService;
 this.readingTimeService = readingTimeService;
 this.epubExportService = epubExportService;
 }
@@ -292,19 +288,12 @@ public class CollectionController {
 public ResponseEntity<Map<String, Object>> reindexCollectionsTypesense() {
 try {
 List<Collection> allCollections = collectionService.findAllWithTags();
-if (typesenseService != null) {
-typesenseService.reindexAllCollections(allCollections);
-return ResponseEntity.ok(Map.of(
-"success", true,
-"message", "Successfully reindexed all collections",
-"count", allCollections.size()
-));
-} else {
-return ResponseEntity.ok(Map.of(
-"success", false,
-"message", "Typesense service not available"
-));
-}
+// Collections are not indexed in search engine yet
+return ResponseEntity.ok(Map.of(
+"success", true,
+"message", "Collections indexing not yet implemented in OpenSearch",
+"count", allCollections.size()
+));
 } catch (Exception e) {
 logger.error("Failed to reindex collections", e);
 return ResponseEntity.badRequest().body(Map.of(

View File: SearchController.java

@@ -2,7 +2,7 @@ package com.storycove.controller;
 import com.storycove.entity.Story;
 import com.storycove.service.StoryService;
-import com.storycove.service.TypesenseService;
+import com.storycove.service.SearchServiceAdapter;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.http.ResponseEntity;
 import org.springframework.web.bind.annotation.*;
@@ -14,25 +14,19 @@ import java.util.Map;
@RequestMapping("/api/search") @RequestMapping("/api/search")
public class SearchController { public class SearchController {
private final TypesenseService typesenseService; private final SearchServiceAdapter searchServiceAdapter;
private final StoryService storyService; private final StoryService storyService;
public SearchController(@Autowired(required = false) TypesenseService typesenseService, StoryService storyService) { public SearchController(SearchServiceAdapter searchServiceAdapter, StoryService storyService) {
this.typesenseService = typesenseService; this.searchServiceAdapter = searchServiceAdapter;
this.storyService = storyService; this.storyService = storyService;
} }
@PostMapping("/reindex") @PostMapping("/reindex")
public ResponseEntity<?> reindexAllStories() { public ResponseEntity<?> reindexAllStories() {
if (typesenseService == null) {
return ResponseEntity.badRequest().body(Map.of(
"error", "Typesense service is not available"
));
}
try { try {
List<Story> allStories = storyService.findAll(); List<Story> allStories = storyService.findAll();
typesenseService.reindexAllStories(allStories); searchServiceAdapter.bulkIndexStories(allStories);
return ResponseEntity.ok(Map.of( return ResponseEntity.ok(Map.of(
"message", "Successfully reindexed all stories", "message", "Successfully reindexed all stories",
@@ -47,17 +41,8 @@ public class SearchController {
@GetMapping("/health") @GetMapping("/health")
public ResponseEntity<?> searchHealthCheck() { public ResponseEntity<?> searchHealthCheck() {
if (typesenseService == null) {
return ResponseEntity.ok(Map.of(
"status", "disabled",
"message", "Typesense service is disabled"
));
}
try { try {
// Try a simple search to test connectivity // Search service is operational if it's injected
typesenseService.searchSuggestions("test", 1);
return ResponseEntity.ok(Map.of( return ResponseEntity.ok(Map.of(
"status", "healthy", "status", "healthy",
"message", "Search service is operational" "message", "Search service is operational"

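A quick way to verify the rewired SearchController after deployment is to hit the reindex and health endpoints directly. The following is a minimal sketch only; the host, port, and absence of authentication are assumptions:

```java
import org.springframework.http.ResponseEntity;
import org.springframework.web.client.RestTemplate;

/**
 * Minimal smoke check for the /api/search endpoints shown above.
 * Base URL is an assumption; JWT/session authentication is omitted.
 */
public class SearchEndpointSmokeCheck {
    public static void main(String[] args) {
        RestTemplate rest = new RestTemplate();
        String base = "http://localhost:8080"; // assumed host and port

        // POST /api/search/reindex now routes through SearchServiceAdapter.bulkIndexStories(...)
        ResponseEntity<String> reindex = rest.postForEntity(base + "/api/search/reindex", null, String.class);
        System.out.println("reindex -> " + reindex.getStatusCode() + ": " + reindex.getBody());

        // GET /api/search/health no longer issues a probe query; it reports healthy once the adapter is wired
        ResponseEntity<String> health = rest.getForEntity(base + "/api/search/health", String.class);
        System.out.println("health  -> " + health.getStatusCode() + ": " + health.getBody());
    }
}
```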
View File

@@ -41,7 +41,7 @@ public class StoryController {
    private final SeriesService seriesService;
    private final HtmlSanitizationService sanitizationService;
    private final ImageService imageService;
-   private final TypesenseService typesenseService;
+   private final SearchServiceAdapter searchServiceAdapter;
    private final CollectionService collectionService;
    private final ReadingTimeService readingTimeService;
    private final EPUBImportService epubImportService;
@@ -53,7 +53,7 @@ public class StoryController {
                           HtmlSanitizationService sanitizationService,
                           ImageService imageService,
                           CollectionService collectionService,
-                          @Autowired(required = false) TypesenseService typesenseService,
+                          SearchServiceAdapter searchServiceAdapter,
                           ReadingTimeService readingTimeService,
                           EPUBImportService epubImportService,
                           EPUBExportService epubExportService) {
@@ -63,7 +63,7 @@ public class StoryController {
        this.sanitizationService = sanitizationService;
        this.imageService = imageService;
        this.collectionService = collectionService;
-       this.typesenseService = typesenseService;
+       this.searchServiceAdapter = searchServiceAdapter;
        this.readingTimeService = readingTimeService;
        this.epubImportService = epubImportService;
        this.epubExportService = epubExportService;
@@ -263,13 +263,10 @@ public class StoryController {
    @PostMapping("/reindex")
    public ResponseEntity<String> manualReindex() {
-       if (typesenseService == null) {
-           return ResponseEntity.ok("Typesense is not enabled, no reindexing performed");
-       }
        try {
            List<Story> allStories = storyService.findAllWithAssociations();
-           typesenseService.reindexAllStories(allStories);
+           searchServiceAdapter.bulkIndexStories(allStories);
            return ResponseEntity.ok("Successfully reindexed " + allStories.size() + " stories");
        } catch (Exception e) {
            return ResponseEntity.status(500).body("Failed to reindex stories: " + e.getMessage());
@@ -280,7 +277,7 @@ public class StoryController {
    public ResponseEntity<Map<String, Object>> reindexStoriesTypesense() {
        try {
            List<Story> allStories = storyService.findAllWithAssociations();
-           typesenseService.reindexAllStories(allStories);
+           searchServiceAdapter.bulkIndexStories(allStories);
            return ResponseEntity.ok(Map.of(
                "success", true,
                "message", "Reindexed " + allStories.size() + " stories",
@@ -300,7 +297,7 @@ public class StoryController {
        try {
            // This will delete the existing collection and recreate it with correct schema
            List<Story> allStories = storyService.findAllWithAssociations();
-           typesenseService.reindexAllStories(allStories);
+           searchServiceAdapter.bulkIndexStories(allStories);
            return ResponseEntity.ok(Map.of(
                "success", true,
                "message", "Recreated stories collection and indexed " + allStories.size() + " stories",
@@ -326,7 +323,7 @@ public class StoryController {
            @RequestParam(required = false) Integer maxRating,
            @RequestParam(required = false) String sortBy,
            @RequestParam(required = false) String sortDir,
-           @RequestParam(required = false) String facetBy,
+           @RequestParam(required = false) List<String> facetBy,
            // Advanced filters
            @RequestParam(required = false) Integer minWordCount,
            @RequestParam(required = false) Integer maxWordCount,
@@ -345,16 +342,35 @@
            @RequestParam(required = false) Boolean hiddenGemsOnly) {
-       if (typesenseService != null) {
-           SearchResultDto<StorySearchDto> results = typesenseService.searchStories(
-               query, page, size, authors, tags, minRating, maxRating, sortBy, sortDir, facetBy,
-               minWordCount, maxWordCount, createdAfter, createdBefore, lastReadAfter, lastReadBefore,
-               unratedOnly, readingStatus, hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter,
-               minTagCount, popularOnly, hiddenGemsOnly);
+       // Use SearchServiceAdapter to handle routing between search engines
+       try {
+           // Convert authors list to single author string (for now, use first author)
+           String authorFilter = (authors != null && !authors.isEmpty()) ? authors.get(0) : null;
+           // DEBUG: Log all received parameters
+           logger.info("CONTROLLER DEBUG - Received parameters:");
+           logger.info("  readingStatus: '{}'", readingStatus);
+           logger.info("  seriesFilter: '{}'", seriesFilter);
+           logger.info("  hasReadingProgress: {}", hasReadingProgress);
+           logger.info("  hasCoverImage: {}", hasCoverImage);
+           logger.info("  createdAfter: '{}'", createdAfter);
+           logger.info("  lastReadAfter: '{}'", lastReadAfter);
+           logger.info("  unratedOnly: {}", unratedOnly);
+           SearchResultDto<StorySearchDto> results = searchServiceAdapter.searchStories(
+               query, tags, authorFilter, seriesFilter, minWordCount, maxWordCount,
+               minRating != null ? minRating.floatValue() : null,
+               null, // isRead - now handled by readingStatus advanced filter
+               null, // isFavorite - now handled by readingStatus advanced filter
+               sortBy, sortDir, page, size, facetBy,
+               // Advanced filters
+               createdAfter, createdBefore, lastReadAfter, lastReadBefore,
+               unratedOnly, readingStatus, hasReadingProgress, hasCoverImage,
+               sourceDomain, seriesFilter, minTagCount, popularOnly, hiddenGemsOnly);
            return ResponseEntity.ok(results);
-       } else {
-           // Fallback to basic search if Typesense is not available
-           return ResponseEntity.badRequest().body(null);
+       } catch (Exception e) {
+           logger.error("Search failed", e);
+           return ResponseEntity.internalServerError().body(null);
        }
    }
@@ -363,10 +379,12 @@ public class StoryController {
            @RequestParam String query,
            @RequestParam(defaultValue = "5") int limit) {
-       if (typesenseService != null) {
-           List<String> suggestions = typesenseService.searchSuggestions(query, limit);
+       // Use SearchServiceAdapter to handle routing between search engines
+       try {
+           List<String> suggestions = searchServiceAdapter.getTagSuggestions(query, limit);
            return ResponseEntity.ok(suggestions);
-       } else {
+       } catch (Exception e) {
+           logger.error("Failed to get search suggestions", e);
            return ResponseEntity.ok(new ArrayList<>());
        }
    }
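Because `facetBy` changes from a single `String` to `List<String>`, clients should now send the parameter repeatedly. A small sketch of building such a request URL follows; the endpoint path and facet field names are illustrative assumptions:

```java
import org.springframework.web.util.UriComponentsBuilder;

public final class StorySearchUrlSketch {
    public static void main(String[] args) {
        // Path and facet names are assumptions; repeated query parameters bind to List<String> facetBy.
        String url = UriComponentsBuilder.fromUriString("http://localhost:8080/api/stories/search")
                .queryParam("query", "dragons")
                .queryParam("facetBy", "tags", "authorName")
                .queryParam("page", 0)
                .queryParam("size", 20)
                .toUriString();
        System.out.println(url);
    }
}
```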

View File

@@ -17,6 +17,7 @@ public class StorySearchDto {
     // Reading status
     private Boolean isRead;
+    private Integer readingPosition;
     private LocalDateTime lastReadAt;
     // Author info
@@ -32,6 +33,9 @@ public class StorySearchDto {
     private LocalDateTime createdAt;
     private LocalDateTime updatedAt;
+    // Alias for createdAt to match frontend expectations
+    private LocalDateTime dateAdded;
     // Search-specific fields
     private double searchScore;
@@ -120,6 +124,14 @@ public class StorySearchDto {
     public void setLastReadAt(LocalDateTime lastReadAt) {
         this.lastReadAt = lastReadAt;
     }
+    public Integer getReadingPosition() {
+        return readingPosition;
+    }
+    public void setReadingPosition(Integer readingPosition) {
+        this.readingPosition = readingPosition;
+    }
     public UUID getAuthorId() {
         return authorId;
@@ -176,6 +188,14 @@ public class StorySearchDto {
     public void setUpdatedAt(LocalDateTime updatedAt) {
         this.updatedAt = updatedAt;
     }
+    public LocalDateTime getDateAdded() {
+        return dateAdded;
+    }
+    public void setDateAdded(LocalDateTime dateAdded) {
+        this.dateAdded = dateAdded;
+    }
     public double getSearchScore() {
         return searchScore;

View File

@@ -1,84 +0,0 @@
package com.storycove.scheduled;
import com.storycove.entity.Story;
import com.storycove.service.StoryService;
import com.storycove.service.TypesenseService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import java.time.LocalDateTime;
import java.util.List;
/**
* Scheduled task to periodically reindex all stories in Typesense
* to ensure search index stays synchronized with database changes.
*/
@Component
@ConditionalOnProperty(name = "storycove.typesense.enabled", havingValue = "true", matchIfMissing = true)
public class TypesenseIndexScheduler {
private static final Logger logger = LoggerFactory.getLogger(TypesenseIndexScheduler.class);
private final StoryService storyService;
private final TypesenseService typesenseService;
@Autowired
public TypesenseIndexScheduler(StoryService storyService,
@Autowired(required = false) TypesenseService typesenseService) {
this.storyService = storyService;
this.typesenseService = typesenseService;
}
/**
* Scheduled task that runs periodically to reindex all stories in Typesense.
* This ensures the search index stays synchronized with any database changes
* that might have occurred outside of the normal story update flow.
*
* Interval is configurable via storycove.typesense.reindex-interval property (default: 1 hour).
*/
@Scheduled(fixedRateString = "${storycove.typesense.reindex-interval:3600000}")
public void reindexAllStories() {
if (typesenseService == null) {
logger.debug("TypesenseService is not available, skipping scheduled reindexing");
return;
}
logger.info("Starting scheduled Typesense reindexing at {}", LocalDateTime.now());
try {
long startTime = System.currentTimeMillis();
// Get all stories from database with eagerly loaded associations
List<Story> allStories = storyService.findAllWithAssociations();
if (allStories.isEmpty()) {
logger.info("No stories found in database, skipping reindexing");
return;
}
// Perform full reindex
typesenseService.reindexAllStories(allStories);
long endTime = System.currentTimeMillis();
long duration = endTime - startTime;
logger.info("Completed scheduled Typesense reindexing of {} stories in {}ms",
allStories.size(), duration);
} catch (Exception e) {
logger.error("Failed to complete scheduled Typesense reindexing", e);
}
}
/**
* Manual trigger for reindexing - can be called from other services or endpoints if needed
*/
public void triggerManualReindex() {
logger.info("Manual Typesense reindexing triggered");
reindexAllStories();
}
}
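The deleted `TypesenseIndexScheduler` has no direct replacement in this changeset; only the author scheduler below is migrated. If a periodic story reindex is still wanted, a minimal OpenSearch-era equivalent built on `SearchServiceAdapter` could look roughly like this sketch (the `storycove.search.reindex-interval` property name is an assumption):

```java
package com.storycove.scheduled;

import com.storycove.service.SearchServiceAdapter;
import com.storycove.service.StoryService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

/**
 * Sketch of a periodic story reindex against OpenSearch; not part of this changeset.
 * The interval property key is assumed and should match the project's configuration.
 */
@Component
@ConditionalOnProperty(name = "storycove.search.enabled", havingValue = "true", matchIfMissing = true)
public class StoryIndexScheduler {

    private static final Logger logger = LoggerFactory.getLogger(StoryIndexScheduler.class);

    private final StoryService storyService;
    private final SearchServiceAdapter searchServiceAdapter;

    public StoryIndexScheduler(StoryService storyService, SearchServiceAdapter searchServiceAdapter) {
        this.storyService = storyService;
        this.searchServiceAdapter = searchServiceAdapter;
    }

    @Scheduled(fixedRateString = "${storycove.search.reindex-interval:3600000}") // assumed property, 1 hour default
    public void reindexAllStories() {
        var allStories = storyService.findAllWithAssociations();
        if (allStories.isEmpty()) {
            logger.info("No stories found, skipping scheduled reindex");
            return;
        }
        searchServiceAdapter.bulkIndexStories(allStories);
        logger.info("Scheduled OpenSearch reindex of {} stories completed", allStories.size());
    }
}
```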

View File

@@ -11,21 +11,21 @@ import org.springframework.stereotype.Component;
 import java.util.List;
 @Component
-@ConditionalOnProperty(name = "storycove.typesense.enabled", havingValue = "true", matchIfMissing = true)
+@ConditionalOnProperty(name = "storycove.search.enabled", havingValue = "true", matchIfMissing = true)
 public class AuthorIndexScheduler {
     private static final Logger logger = LoggerFactory.getLogger(AuthorIndexScheduler.class);
     private final AuthorService authorService;
-    private final TypesenseService typesenseService;
+    private final SearchServiceAdapter searchServiceAdapter;
     @Autowired
-    public AuthorIndexScheduler(AuthorService authorService, TypesenseService typesenseService) {
+    public AuthorIndexScheduler(AuthorService authorService, SearchServiceAdapter searchServiceAdapter) {
         this.authorService = authorService;
-        this.typesenseService = typesenseService;
+        this.searchServiceAdapter = searchServiceAdapter;
     }
-    @Scheduled(fixedRateString = "${storycove.typesense.author-reindex-interval:7200000}") // 2 hours default
+    @Scheduled(fixedRateString = "${storycove.search.author-reindex-interval:7200000}") // 2 hours default
     public void reindexAllAuthors() {
         try {
             logger.info("Starting scheduled author reindexing...");
@@ -34,7 +34,7 @@ public class AuthorIndexScheduler {
             logger.info("Found {} authors to reindex", allAuthors.size());
             if (!allAuthors.isEmpty()) {
-                typesenseService.reindexAllAuthors(allAuthors);
+                searchServiceAdapter.bulkIndexAuthors(allAuthors);
                 logger.info("Successfully completed scheduled author reindexing");
             } else {
                 logger.info("No authors found to reindex");
View File

@@ -28,12 +28,12 @@ public class AuthorService {
     private static final Logger logger = LoggerFactory.getLogger(AuthorService.class);
     private final AuthorRepository authorRepository;
-    private final TypesenseService typesenseService;
+    private final SearchServiceAdapter searchServiceAdapter;
     @Autowired
-    public AuthorService(AuthorRepository authorRepository, @Autowired(required = false) TypesenseService typesenseService) {
+    public AuthorService(AuthorRepository authorRepository, SearchServiceAdapter searchServiceAdapter) {
         this.authorRepository = authorRepository;
-        this.typesenseService = typesenseService;
+        this.searchServiceAdapter = searchServiceAdapter;
     }
     @Transactional(readOnly = true)
@@ -132,14 +132,8 @@ public class AuthorService {
         validateAuthorForCreate(author);
         Author savedAuthor = authorRepository.save(author);
-        // Index in Typesense
-        if (typesenseService != null) {
-            try {
-                typesenseService.indexAuthor(savedAuthor);
-            } catch (Exception e) {
-                logger.warn("Failed to index author in Typesense: " + savedAuthor.getName(), e);
-            }
-        }
+        // Index in OpenSearch
+        searchServiceAdapter.indexAuthor(savedAuthor);
         return savedAuthor;
     }
@@ -156,14 +150,8 @@ public class AuthorService {
         updateAuthorFields(existingAuthor, authorUpdates);
         Author savedAuthor = authorRepository.save(existingAuthor);
-        // Update in Typesense
-        if (typesenseService != null) {
-            try {
-                typesenseService.updateAuthor(savedAuthor);
-            } catch (Exception e) {
-                logger.warn("Failed to update author in Typesense: " + savedAuthor.getName(), e);
-            }
-        }
+        // Update in OpenSearch
+        searchServiceAdapter.updateAuthor(savedAuthor);
         return savedAuthor;
     }
@@ -178,14 +166,8 @@ public class AuthorService {
         authorRepository.delete(author);
-        // Remove from Typesense
-        if (typesenseService != null) {
-            try {
-                typesenseService.deleteAuthor(id.toString());
-            } catch (Exception e) {
-                logger.warn("Failed to delete author from Typesense: " + author.getName(), e);
-            }
-        }
+        // Remove from OpenSearch
+        searchServiceAdapter.deleteAuthor(id);
     }
     public Author addUrl(UUID id, String url) {
@@ -193,14 +175,8 @@ public class AuthorService {
         author.addUrl(url);
         Author savedAuthor = authorRepository.save(author);
-        // Update in Typesense
-        if (typesenseService != null) {
-            try {
-                typesenseService.updateAuthor(savedAuthor);
-            } catch (Exception e) {
-                logger.warn("Failed to update author in Typesense after adding URL: " + savedAuthor.getName(), e);
-            }
-        }
+        // Update in OpenSearch
+        searchServiceAdapter.updateAuthor(savedAuthor);
         return savedAuthor;
     }
@@ -210,14 +186,8 @@ public class AuthorService {
         author.removeUrl(url);
         Author savedAuthor = authorRepository.save(author);
-        // Update in Typesense
-        if (typesenseService != null) {
-            try {
-                typesenseService.updateAuthor(savedAuthor);
-            } catch (Exception e) {
-                logger.warn("Failed to update author in Typesense after removing URL: " + savedAuthor.getName(), e);
-            }
-        }
+        // Update in OpenSearch
+        searchServiceAdapter.updateAuthor(savedAuthor);
         return savedAuthor;
     }
@@ -251,14 +221,8 @@ public class AuthorService {
         logger.debug("Saved author rating: {} for author: {}",
                 refreshedAuthor.getAuthorRating(), refreshedAuthor.getName());
-        // Update in Typesense
-        if (typesenseService != null) {
-            try {
-                typesenseService.updateAuthor(refreshedAuthor);
-            } catch (Exception e) {
-                logger.warn("Failed to update author in Typesense after rating: " + refreshedAuthor.getName(), e);
-            }
-        }
+        // Update in OpenSearch
+        searchServiceAdapter.updateAuthor(refreshedAuthor);
         return refreshedAuthor;
     }
@@ -301,14 +265,8 @@ public class AuthorService {
         author.setAvatarImagePath(avatarPath);
         Author savedAuthor = authorRepository.save(author);
-        // Update in Typesense
-        if (typesenseService != null) {
-            try {
-                typesenseService.updateAuthor(savedAuthor);
-            } catch (Exception e) {
-                logger.warn("Failed to update author in Typesense after setting avatar: " + savedAuthor.getName(), e);
-            }
-        }
+        // Update in OpenSearch
+        searchServiceAdapter.updateAuthor(savedAuthor);
         return savedAuthor;
     }
@@ -318,14 +276,8 @@ public class AuthorService {
         author.setAvatarImagePath(null);
         Author savedAuthor = authorRepository.save(author);
-        // Update in Typesense
-        if (typesenseService != null) {
-            try {
-                typesenseService.updateAuthor(savedAuthor);
-            } catch (Exception e) {
-                logger.warn("Failed to update author in Typesense after removing avatar: " + savedAuthor.getName(), e);
-            }
-        }
+        // Update in OpenSearch
+        searchServiceAdapter.updateAuthor(savedAuthor);
         return savedAuthor;
     }

View File

@@ -31,7 +31,7 @@ public class CollectionService {
     private final CollectionStoryRepository collectionStoryRepository;
     private final StoryRepository storyRepository;
     private final TagRepository tagRepository;
-    private final TypesenseService typesenseService;
+    private final SearchServiceAdapter searchServiceAdapter;
     private final ReadingTimeService readingTimeService;
     @Autowired
@@ -39,13 +39,13 @@ public class CollectionService {
                              CollectionStoryRepository collectionStoryRepository,
                              StoryRepository storyRepository,
                              TagRepository tagRepository,
-                             @Autowired(required = false) TypesenseService typesenseService,
+                             SearchServiceAdapter searchServiceAdapter,
                              ReadingTimeService readingTimeService) {
         this.collectionRepository = collectionRepository;
         this.collectionStoryRepository = collectionStoryRepository;
         this.storyRepository = storyRepository;
         this.tagRepository = tagRepository;
-        this.typesenseService = typesenseService;
+        this.searchServiceAdapter = searchServiceAdapter;
         this.readingTimeService = readingTimeService;
     }
@@ -54,13 +54,10 @@ public class CollectionService {
      * This method MUST be used instead of JPA queries for listing collections
      */
     public SearchResultDto<Collection> searchCollections(String query, List<String> tags, boolean includeArchived, int page, int limit) {
-        if (typesenseService == null) {
-            logger.warn("Typesense service not available, returning empty results");
-            return new SearchResultDto<>(new ArrayList<>(), 0, page, limit, query != null ? query : "", 0);
-        }
-        // Delegate to TypesenseService for all search operations
-        return typesenseService.searchCollections(query, tags, includeArchived, page, limit);
+        // Collections are currently handled at database level, not indexed in search engine
+        // Return empty result for now as collections search is not implemented in OpenSearch
+        logger.warn("Collections search not yet implemented in OpenSearch, returning empty results");
+        return new SearchResultDto<>(new ArrayList<>(), 0, page, limit, query != null ? query : "", 0);
     }
     /**
@@ -107,10 +104,7 @@ public class CollectionService {
             savedCollection = findById(savedCollection.getId());
         }
-        // Index in Typesense
-        if (typesenseService != null) {
-            typesenseService.indexCollection(savedCollection);
-        }
+        // Collections are not indexed in search engine yet
         logger.info("Created collection: {} with {} stories", name, initialStoryIds != null ? initialStoryIds.size() : 0);
         return savedCollection;
@@ -140,10 +134,7 @@ public class CollectionService {
         Collection savedCollection = collectionRepository.save(collection);
-        // Update in Typesense
-        if (typesenseService != null) {
-            typesenseService.indexCollection(savedCollection);
-        }
+        // Collections are not indexed in search engine yet
         logger.info("Updated collection: {}", id);
         return savedCollection;
@@ -155,10 +146,7 @@ public class CollectionService {
     public void deleteCollection(UUID id) {
         Collection collection = findByIdBasic(id);
-        // Remove from Typesense first
-        if (typesenseService != null) {
-            typesenseService.removeCollection(id);
-        }
+        // Collections are not indexed in search engine yet
         collectionRepository.delete(collection);
         logger.info("Deleted collection: {}", id);
@@ -173,10 +161,7 @@ public class CollectionService {
         Collection savedCollection = collectionRepository.save(collection);
-        // Update in Typesense
-        if (typesenseService != null) {
-            typesenseService.indexCollection(savedCollection);
-        }
+        // Collections are not indexed in search engine yet
         logger.info("{} collection: {}", archived ? "Archived" : "Unarchived", id);
         return savedCollection;
@@ -221,10 +206,7 @@ public class CollectionService {
         }
         // Update collection in Typesense
-        if (typesenseService != null) {
-            Collection updatedCollection = findById(collectionId);
-            typesenseService.indexCollection(updatedCollection);
-        }
+        // Collections are not indexed in search engine yet
         long totalStories = collectionStoryRepository.countByCollectionId(collectionId);
@@ -249,10 +231,7 @@ public class CollectionService {
         collectionStoryRepository.delete(collectionStory);
         // Update collection in Typesense
-        if (typesenseService != null) {
-            Collection updatedCollection = findById(collectionId);
-            typesenseService.indexCollection(updatedCollection);
-        }
+        // Collections are not indexed in search engine yet
         logger.info("Removed story {} from collection {}", storyId, collectionId);
     }
@@ -285,10 +264,7 @@ public class CollectionService {
         }
         // Update collection in Typesense
-        if (typesenseService != null) {
-            Collection updatedCollection = findById(collectionId);
-            typesenseService.indexCollection(updatedCollection);
-        }
+        // Collections are not indexed in search engine yet
         logger.info("Reordered {} stories in collection {}", storyOrders.size(), collectionId);
     }
@@ -423,7 +399,7 @@ public class CollectionService {
     }
     /**
-     * Get all collections for indexing (used by TypesenseService)
+     * Get all collections for indexing (used by SearchServiceAdapter)
      */
     public List<Collection> findAllForIndexing() {
         return collectionRepository.findAllActiveCollections();

View File

@@ -52,7 +52,7 @@ public class DatabaseManagementService implements ApplicationContextAware {
     private CollectionRepository collectionRepository;
     @Autowired
-    private TypesenseService typesenseService;
+    private SearchServiceAdapter searchServiceAdapter;
     @Autowired
     private LibraryService libraryService;
@@ -145,15 +145,15 @@
                 System.err.println("No files directory found in backup - skipping file restore.");
             }
-            // 6. Trigger complete Typesense reindex after data restoration
+            // 6. Trigger complete search index reindex after data restoration
             try {
-                System.err.println("Starting Typesense reindex after restore...");
-                TypesenseService typesenseService = applicationContext.getBean(TypesenseService.class);
-                typesenseService.performCompleteReindex();
-                System.err.println("Typesense reindex completed successfully.");
+                System.err.println("Starting search index reindex after restore...");
+                SearchServiceAdapter searchServiceAdapter = applicationContext.getBean(SearchServiceAdapter.class);
+                searchServiceAdapter.performCompleteReindex();
+                System.err.println("Search index reindex completed successfully.");
             } catch (Exception e) {
-                System.err.println("Warning: Failed to reindex Typesense after restore: " + e.getMessage());
-                // Don't fail the entire restore for Typesense issues
+                System.err.println("Warning: Failed to reindex search after restore: " + e.getMessage());
+                // Don't fail the entire restore for search issues
             }
             System.err.println("Complete backup restore finished successfully.");
@@ -299,9 +299,9 @@
             // Reindex search after successful restore
             try {
                 String currentLibraryId = libraryService.getCurrentLibraryId();
-                System.err.println("Starting Typesense reindex after successful restore for library: " + currentLibraryId);
+                System.err.println("Starting search reindex after successful restore for library: " + currentLibraryId);
                 if (currentLibraryId == null) {
-                    System.err.println("ERROR: No current library set during restore - cannot reindex Typesense!");
+                    System.err.println("ERROR: No current library set during restore - cannot reindex search!");
                     throw new IllegalStateException("No current library active during restore");
                 }
@@ -310,10 +310,10 @@
                 reindexStoriesAndAuthorsFromCurrentDatabase();
                 // Note: Collections collection will be recreated when needed by the service
-                System.err.println("Typesense reindex completed successfully for library: " + currentLibraryId);
+                System.err.println("Search reindex completed successfully for library: " + currentLibraryId);
             } catch (Exception e) {
                 // Log the error but don't fail the restore
-                System.err.println("Warning: Failed to reindex Typesense after restore: " + e.getMessage());
+                System.err.println("Warning: Failed to reindex search after restore: " + e.getMessage());
                 e.printStackTrace();
             }
@@ -351,7 +351,7 @@
             totalDeleted = collectionCount + storyCount + authorCount + seriesCount + tagCount;
             // Note: Search indexes will need to be manually recreated after clearing
-            // Use the settings page to recreate Typesense collections after clearing the database
+            // Use the settings page to recreate search indices after clearing the database
         } catch (Exception e) {
             throw new RuntimeException("Failed to clear database: " + e.getMessage(), e);
@@ -506,8 +506,7 @@
             }
             // For clearing, we only want to recreate empty collections (no data to index)
-            typesenseService.recreateStoriesCollection();
-            typesenseService.recreateAuthorsCollection();
+            searchServiceAdapter.recreateIndices();
             // Note: Collections collection will be recreated when needed by the service
             System.err.println("Search indexes cleared successfully for library: " + currentLibraryId);
         } catch (Exception e) {
@@ -959,10 +958,9 @@
         try (Connection connection = getDataSource().getConnection()) {
             // First, recreate empty collections
             try {
-                typesenseService.recreateStoriesCollection();
-                typesenseService.recreateAuthorsCollection();
+                searchServiceAdapter.recreateIndices();
             } catch (Exception e) {
-                throw new SQLException("Failed to recreate Typesense collections", e);
+                throw new SQLException("Failed to recreate search indices", e);
             }
             // Count and reindex stories with full author and series information
@@ -984,7 +982,7 @@
                 while (rs.next()) {
                     // Create a complete Story object for indexing
                     var story = createStoryFromResultSet(rs);
-                    typesenseService.indexStory(story);
+                    searchServiceAdapter.indexStory(story);
                     storyCount++;
                 }
             }
@@ -999,7 +997,7 @@
                 while (rs.next()) {
                     // Create a minimal Author object for indexing
                     var author = createAuthorFromResultSet(rs);
-                    typesenseService.indexAuthor(author);
+                    searchServiceAdapter.indexAuthor(author);
                     authorCount++;
                 }
             }

View File

@@ -13,8 +13,6 @@ import org.springframework.context.ApplicationContext;
 import org.springframework.context.ApplicationContextAware;
 import org.springframework.security.crypto.bcrypt.BCryptPasswordEncoder;
 import org.springframework.stereotype.Service;
-import org.typesense.api.Client;
-import org.typesense.resources.Node;
 import jakarta.annotation.PostConstruct;
 import jakarta.annotation.PreDestroy;
@@ -26,7 +24,6 @@ import java.nio.file.Files;
 import java.nio.file.Path;
 import java.nio.file.Paths;
 import java.sql.SQLException;
-import java.time.Duration;
 import java.util.*;
 import java.util.concurrent.ConcurrentHashMap;
@@ -43,14 +40,6 @@ public class LibraryService implements ApplicationContextAware {
     @Value("${spring.datasource.password}")
     private String dbPassword;
-    @Value("${typesense.host}")
-    private String typesenseHost;
-    @Value("${typesense.port}")
-    private String typesensePort;
-    @Value("${typesense.api-key}")
-    private String typesenseApiKey;
     private final ObjectMapper objectMapper = new ObjectMapper();
     private final BCryptPasswordEncoder passwordEncoder = new BCryptPasswordEncoder();
@@ -61,7 +50,6 @@ public class LibraryService implements ApplicationContextAware {
     // Current active resources
     private volatile String currentLibraryId;
-    private volatile Client currentTypesenseClient;
     // Security: Track if user has explicitly authenticated in this session
     private volatile boolean explicitlyAuthenticated = false;
@@ -100,7 +88,6 @@
     @PreDestroy
     public void cleanup() {
         currentLibraryId = null;
-        currentTypesenseClient = null;
         explicitlyAuthenticated = false;
     }
@@ -110,7 +97,6 @@
     public void clearAuthentication() {
         explicitlyAuthenticated = false;
         currentLibraryId = null;
-        currentTypesenseClient = null;
         logger.info("Authentication cleared - user must re-authenticate to access libraries");
     }
@@ -129,7 +115,7 @@
     /**
      * Switch to library after authentication with forced reindexing
-     * This ensures Typesense is always up-to-date after login
+     * This ensures OpenSearch is always up-to-date after login
      */
     public synchronized void switchToLibraryAfterAuthentication(String libraryId) throws Exception {
         logger.info("Switching to library after authentication: {} (forcing reindex)", libraryId);
@@ -168,26 +154,16 @@
         // Set new active library (datasource routing handled by SmartRoutingDataSource)
         currentLibraryId = libraryId;
-        currentTypesenseClient = createTypesenseClient(library.getTypesenseCollection());
-        // Initialize Typesense collections for this library
-        try {
-            TypesenseService typesenseService = applicationContext.getBean(TypesenseService.class);
-            // First ensure collections exist
-            typesenseService.initializeCollectionsForCurrentLibrary();
-            logger.info("Completed Typesense initialization for library: {}", libraryId);
-        } catch (Exception e) {
-            logger.warn("Failed to initialize Typesense for library {}: {}", libraryId, e.getMessage());
-            // Don't fail the switch - collections can be created later
-        }
+        // OpenSearch indexes are global - no per-library initialization needed
+        logger.info("Library switched to OpenSearch mode for library: {}", libraryId);
         logger.info("Successfully switched to library: {}", library.getName());
         // Perform complete reindex AFTER library switch is fully complete
         // This ensures database routing is properly established
         if (forceReindex || !libraryId.equals(previousLibraryId)) {
-            logger.info("Starting post-switch Typesense reindex for library: {}", libraryId);
+            logger.info("Starting post-switch OpenSearch reindex for library: {}", libraryId);
             // Run reindex asynchronously to avoid blocking authentication response
             // and allow time for database routing to fully stabilize
             String finalLibraryId = libraryId;
@@ -195,15 +171,25 @@
                 try {
                     // Give routing time to stabilize
                     Thread.sleep(500);
-                    logger.info("Starting async Typesense reindex for library: {}", finalLibraryId);
-                    TypesenseService typesenseService = applicationContext.getBean(TypesenseService.class);
-                    typesenseService.performCompleteReindex();
-                    logger.info("Completed async Typesense reindexing for library: {}", finalLibraryId);
+                    logger.info("Starting async OpenSearch reindex for library: {}", finalLibraryId);
+                    SearchServiceAdapter searchService = applicationContext.getBean(SearchServiceAdapter.class);
+                    // Get all stories and authors for reindexing
+                    StoryService storyService = applicationContext.getBean(StoryService.class);
+                    AuthorService authorService = applicationContext.getBean(AuthorService.class);
+                    var allStories = storyService.findAllWithAssociations();
+                    var allAuthors = authorService.findAllWithStories();
+                    searchService.bulkIndexStories(allStories);
+                    searchService.bulkIndexAuthors(allAuthors);
+                    logger.info("Completed async OpenSearch reindexing for library: {} ({} stories, {} authors)",
+                            finalLibraryId, allStories.size(), allAuthors.size());
                 } catch (Exception e) {
-                    logger.warn("Failed to async reindex Typesense for library {}: {}", finalLibraryId, e.getMessage());
+                    logger.warn("Failed to async reindex OpenSearch for library {}: {}", finalLibraryId, e.getMessage());
                 }
-            }, "TypesenseReindex-" + libraryId).start();
+            }, "OpenSearchReindex-" + libraryId).start();
         }
     }
@@ -219,12 +205,6 @@
         }
     }
-    public Client getCurrentTypesenseClient() {
-        if (currentTypesenseClient == null) {
-            throw new IllegalStateException("No active library - please authenticate first");
-        }
-        return currentTypesenseClient;
-    }
     public String getCurrentLibraryId() {
         return currentLibraryId;
@@ -545,8 +525,8 @@
         // 1. Create image directory structure
         initializeImageDirectories(library);
-        // 2. Initialize Typesense collections (this will be done when switching to the library)
-        // The TypesenseService.initializeCollections() will be called automatically
+        // 2. OpenSearch indexes are global and managed automatically
+        // No per-library initialization needed for OpenSearch
         logger.info("Successfully initialized resources for library: {}", library.getName());
@@ -777,21 +757,10 @@
         }
     }
-    private Client createTypesenseClient(String collection) {
-        logger.info("Creating Typesense client for collection: {}", collection);
-        List<Node> nodes = Arrays.asList(
-            new Node("http", typesenseHost, typesensePort)
-        );
-        org.typesense.api.Configuration configuration = new org.typesense.api.Configuration(nodes, Duration.ofSeconds(10), typesenseApiKey);
-        return new Client(configuration);
-    }
     private void closeCurrentResources() {
         // No need to close datasource - SmartRoutingDataSource handles this
-        // Typesense client doesn't need explicit cleanup
-        currentTypesenseClient = null;
+        // OpenSearch service is managed by Spring - no explicit cleanup needed
         // Don't clear currentLibraryId here - only when explicitly switching
     }
@@ -848,7 +817,6 @@
         config.put("description", library.getDescription());
         config.put("passwordHash", library.getPasswordHash());
         config.put("dbName", library.getDbName());
-        config.put("typesenseCollection", library.getTypesenseCollection());
         config.put("imagePath", library.getImagePath());
         config.put("initialized", library.isInitialized());

View File

@@ -0,0 +1,133 @@
package com.storycove.service;
import com.storycove.config.OpenSearchProperties;
import org.opensearch.client.opensearch.OpenSearchClient;
import org.opensearch.client.opensearch.cluster.HealthRequest;
import org.opensearch.client.opensearch.cluster.HealthResponse;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.actuate.health.Health;
import org.springframework.boot.actuate.health.HealthIndicator;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Service;
import java.time.LocalDateTime;
import java.util.concurrent.atomic.AtomicReference;
@Service
@ConditionalOnProperty(name = "storycove.search.engine", havingValue = "opensearch")
public class OpenSearchHealthService implements HealthIndicator {
private static final Logger logger = LoggerFactory.getLogger(OpenSearchHealthService.class);
private final OpenSearchClient openSearchClient;
private final OpenSearchProperties properties;
private final AtomicReference<Health> lastKnownHealth = new AtomicReference<>(Health.unknown().build());
private LocalDateTime lastCheckTime = LocalDateTime.now();
@Autowired
public OpenSearchHealthService(OpenSearchClient openSearchClient, OpenSearchProperties properties) {
this.openSearchClient = openSearchClient;
this.properties = properties;
}
@Override
public Health health() {
return lastKnownHealth.get();
}
@Scheduled(fixedDelayString = "#{@openSearchProperties.health.checkInterval}")
public void performHealthCheck() {
try {
HealthResponse clusterHealth = openSearchClient.cluster().health(
HealthRequest.of(h -> h.timeout(t -> t.time("10s")))
);
Health.Builder healthBuilder = Health.up()
.withDetail("cluster_name", clusterHealth.clusterName())
.withDetail("status", clusterHealth.status().jsonValue())
.withDetail("number_of_nodes", clusterHealth.numberOfNodes())
.withDetail("number_of_data_nodes", clusterHealth.numberOfDataNodes())
.withDetail("active_primary_shards", clusterHealth.activePrimaryShards())
.withDetail("active_shards", clusterHealth.activeShards())
.withDetail("relocating_shards", clusterHealth.relocatingShards())
.withDetail("initializing_shards", clusterHealth.initializingShards())
.withDetail("unassigned_shards", clusterHealth.unassignedShards())
.withDetail("last_check", LocalDateTime.now());
// Check if cluster status is concerning
switch (clusterHealth.status()) {
case Red:
healthBuilder = Health.down()
.withDetail("reason", "Cluster status is RED - some primary shards are unassigned");
break;
case Yellow:
if (isProduction()) {
healthBuilder = Health.down()
.withDetail("reason", "Cluster status is YELLOW - some replica shards are unassigned (critical in production)");
} else {
// Yellow is acceptable in development (single node clusters)
healthBuilder.withDetail("warning", "Cluster status is YELLOW - acceptable for development");
}
break;
case Green:
// All good
break;
}
lastKnownHealth.set(healthBuilder.build());
lastCheckTime = LocalDateTime.now();
if (properties.getHealth().isEnableMetrics()) {
logMetrics(clusterHealth);
}
} catch (Exception e) {
logger.error("OpenSearch health check failed", e);
Health unhealthyStatus = Health.down()
.withDetail("error", e.getMessage())
.withDetail("last_successful_check", lastCheckTime)
.withDetail("current_time", LocalDateTime.now())
.build();
lastKnownHealth.set(unhealthyStatus);
}
}
private void logMetrics(HealthResponse clusterHealth) {
logger.info("OpenSearch Cluster Metrics - Status: {}, Nodes: {}, Active Shards: {}, Unassigned: {}",
clusterHealth.status().jsonValue(),
clusterHealth.numberOfNodes(),
clusterHealth.activeShards(),
clusterHealth.unassignedShards());
}
private boolean isProduction() {
return "production".equalsIgnoreCase(properties.getProfile());
}
/**
* Manual health check for immediate status
*/
public boolean isClusterHealthy() {
Health currentHealth = lastKnownHealth.get();
return currentHealth.getStatus() == org.springframework.boot.actuate.health.Status.UP;
}
/**
* Get detailed cluster information
*/
public String getClusterInfo() {
try {
var info = openSearchClient.info();
return String.format("OpenSearch %s (Cluster: %s, Lucene: %s)",
info.version().number(),
info.clusterName(),
info.version().luceneVersion());
} catch (Exception e) {
return "Unable to retrieve cluster information: " + e.getMessage();
}
}
}
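The `@Scheduled` SpEL expression above reads `health.checkInterval` from an `OpenSearchProperties` bean that is not shown in this excerpt. The sketch below models only the shape those usages imply; the configuration prefix, defaults, and bean name are assumptions, and the real class in the repository may differ:

```java
package com.storycove.config;

import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

/**
 * Assumed sketch of the properties bean referenced by OpenSearchHealthService.
 * Only the members used above (profile, health.checkInterval, health.enableMetrics) are modeled.
 */
@Component("openSearchProperties")
@ConfigurationProperties(prefix = "storycove.search.opensearch") // prefix is an assumption
public class OpenSearchProperties {

    private String profile = "development";
    private Health health = new Health();

    public String getProfile() { return profile; }
    public void setProfile(String profile) { this.profile = profile; }
    public Health getHealth() { return health; }
    public void setHealth(Health health) { this.health = health; }

    public static class Health {
        /** Interval between scheduled cluster health checks, in milliseconds. */
        private long checkInterval = 30000;
        /** Whether performHealthCheck() should also log cluster metrics. */
        private boolean enableMetrics = false;

        public long getCheckInterval() { return checkInterval; }
        public void setCheckInterval(long checkInterval) { this.checkInterval = checkInterval; }
        public boolean isEnableMetrics() { return enableMetrics; }
        public void setEnableMetrics(boolean enableMetrics) { this.enableMetrics = enableMetrics; }
    }
}
```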

File diff suppressed because it is too large.

View File

@@ -0,0 +1,278 @@
package com.storycove.service;
import com.storycove.dto.AuthorSearchDto;
import com.storycove.dto.SearchResultDto;
import com.storycove.dto.StorySearchDto;
import com.storycove.entity.Author;
import com.storycove.entity.Story;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import java.util.List;
import java.util.UUID;
/**
* Service adapter that provides a unified interface for search operations.
*
* This adapter directly delegates to OpenSearchService.
*/
@Service
public class SearchServiceAdapter {
private static final Logger logger = LoggerFactory.getLogger(SearchServiceAdapter.class);
@Autowired
private OpenSearchService openSearchService;
// ===============================
// SEARCH OPERATIONS
// ===============================
/**
* Search stories with unified interface
*/
public SearchResultDto<StorySearchDto> searchStories(String query, List<String> tags, String author,
String series, Integer minWordCount, Integer maxWordCount,
Float minRating, Boolean isRead, Boolean isFavorite,
String sortBy, String sortOrder, int page, int size,
List<String> facetBy,
// Advanced filters
String createdAfter, String createdBefore,
String lastReadAfter, String lastReadBefore,
Boolean unratedOnly, String readingStatus,
Boolean hasReadingProgress, Boolean hasCoverImage,
String sourceDomain, String seriesFilter,
Integer minTagCount, Boolean popularOnly,
Boolean hiddenGemsOnly) {
return openSearchService.searchStories(query, tags, author, series, minWordCount, maxWordCount,
minRating, isRead, isFavorite, sortBy, sortOrder, page, size, facetBy,
createdAfter, createdBefore, lastReadAfter, lastReadBefore, unratedOnly, readingStatus,
hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter, minTagCount, popularOnly,
hiddenGemsOnly);
}
/**
* Get random stories with unified interface
*/
public List<StorySearchDto> getRandomStories(int count, List<String> tags, String author,
String series, Integer minWordCount, Integer maxWordCount,
Float minRating, Boolean isRead, Boolean isFavorite,
Long seed) {
return openSearchService.getRandomStories(count, tags, author, series, minWordCount, maxWordCount,
minRating, isRead, isFavorite, seed);
}
/**
* Recreate search indices
*/
public void recreateIndices() {
try {
openSearchService.recreateIndices();
} catch (Exception e) {
logger.error("Failed to recreate search indices", e);
throw new RuntimeException("Failed to recreate search indices", e);
}
}
/**
* Perform complete reindex of all data
*/
public void performCompleteReindex() {
try {
recreateIndices();
logger.info("Search indices recreated successfully");
} catch (Exception e) {
logger.error("Failed to perform complete reindex", e);
throw new RuntimeException("Failed to perform complete reindex", e);
}
}
/**
* Get random story ID with unified interface
*/
public String getRandomStoryId(Long seed) {
return openSearchService.getRandomStoryId(seed);
}
/**
* Search authors with unified interface
*/
public List<AuthorSearchDto> searchAuthors(String query, int limit) {
return openSearchService.searchAuthors(query, limit);
}
/**
* Get tag suggestions with unified interface
*/
public List<String> getTagSuggestions(String query, int limit) {
return openSearchService.getTagSuggestions(query, limit);
}
// ===============================
// INDEX OPERATIONS
// ===============================
/**
* Index a story in OpenSearch
*/
public void indexStory(Story story) {
try {
openSearchService.indexStory(story);
} catch (Exception e) {
logger.error("Failed to index story {}", story.getId(), e);
}
}
/**
* Update a story in OpenSearch
*/
public void updateStory(Story story) {
try {
openSearchService.updateStory(story);
} catch (Exception e) {
logger.error("Failed to update story {}", story.getId(), e);
}
}
/**
* Delete a story from OpenSearch
*/
public void deleteStory(UUID storyId) {
try {
openSearchService.deleteStory(storyId);
} catch (Exception e) {
logger.error("Failed to delete story {}", storyId, e);
}
}
/**
* Index an author in OpenSearch
*/
public void indexAuthor(Author author) {
try {
openSearchService.indexAuthor(author);
} catch (Exception e) {
logger.error("Failed to index author {}", author.getId(), e);
}
}
/**
* Update an author in OpenSearch
*/
public void updateAuthor(Author author) {
try {
openSearchService.updateAuthor(author);
} catch (Exception e) {
logger.error("Failed to update author {}", author.getId(), e);
}
}
/**
* Delete an author from OpenSearch
*/
public void deleteAuthor(UUID authorId) {
try {
openSearchService.deleteAuthor(authorId);
} catch (Exception e) {
logger.error("Failed to delete author {}", authorId, e);
}
}
/**
* Bulk index stories in OpenSearch
*/
public void bulkIndexStories(List<Story> stories) {
try {
openSearchService.bulkIndexStories(stories);
} catch (Exception e) {
logger.error("Failed to bulk index {} stories", stories.size(), e);
}
}
/**
* Bulk index authors in OpenSearch
*/
public void bulkIndexAuthors(List<Author> authors) {
try {
openSearchService.bulkIndexAuthors(authors);
} catch (Exception e) {
logger.error("Failed to bulk index {} authors", authors.size(), e);
}
}
// ===============================
// UTILITY METHODS
// ===============================
/**
* Check if search service is available and healthy
*/
public boolean isSearchServiceAvailable() {
return openSearchService.testConnection();
}
/**
* Get current search engine name
*/
public String getCurrentSearchEngine() {
return "opensearch";
}
/**
* Check if dual-write is enabled
*/
public boolean isDualWriteEnabled() {
return false; // No longer supported
}
/**
* Check if we can switch to OpenSearch
*/
public boolean canSwitchToOpenSearch() {
return true; // Already using OpenSearch
}
/**
* Check if we can switch to Typesense
*/
public boolean canSwitchToTypesense() {
return false; // Typesense no longer available
}
/**
* Get current search status for admin interface
*/
public SearchStatus getSearchStatus() {
return new SearchStatus(
"opensearch",
false, // no dual-write
false, // no typesense
openSearchService.testConnection()
);
}
/**
* DTO for search status
*/
public static class SearchStatus {
private final String primaryEngine;
private final boolean dualWrite;
private final boolean typesenseAvailable;
private final boolean openSearchAvailable;
public SearchStatus(String primaryEngine, boolean dualWrite,
boolean typesenseAvailable, boolean openSearchAvailable) {
this.primaryEngine = primaryEngine;
this.dualWrite = dualWrite;
this.typesenseAvailable = typesenseAvailable;
this.openSearchAvailable = openSearchAvailable;
}
public String getPrimaryEngine() { return primaryEngine; }
public boolean isDualWrite() { return dualWrite; }
public boolean isTypesenseAvailable() { return typesenseAvailable; }
public boolean isOpenSearchAvailable() { return openSearchAvailable; }
}
}
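Callers keep the same constructor-injection pattern used throughout the services above. The following is a minimal, illustrative consumer of the adapter; the class name and filter values are made up for the example:

```java
package com.storycove.service;

import com.storycove.dto.SearchResultDto;
import com.storycove.dto.StorySearchDto;
import org.springframework.stereotype.Service;

import java.util.List;

/**
 * Illustrative consumer of SearchServiceAdapter; not part of the changeset.
 */
@Service
public class StoryDiscoveryExample {

    private final SearchServiceAdapter searchServiceAdapter;

    public StoryDiscoveryExample(SearchServiceAdapter searchServiceAdapter) {
        this.searchServiceAdapter = searchServiceAdapter;
    }

    public SearchResultDto<StorySearchDto> findHighlyRatedFantasy() {
        return searchServiceAdapter.searchStories(
                "magic",              // query
                List.of("fantasy"),   // tags
                null, null,           // author, series
                null, null,           // minWordCount, maxWordCount
                4.0f,                 // minRating
                null, null,           // isRead, isFavorite (handled via readingStatus elsewhere)
                "rating", "desc",     // sortBy, sortOrder ("rating" is an assumed sort field)
                0, 20,                // page, size
                null,                 // facetBy
                null, null, null, null,   // createdAfter, createdBefore, lastReadAfter, lastReadBefore
                null, null,           // unratedOnly, readingStatus
                null, null,           // hasReadingProgress, hasCoverImage
                null, null,           // sourceDomain, seriesFilter
                null, null, null);    // minTagCount, popularOnly, hiddenGemsOnly
    }
}
```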

View File

@@ -42,7 +42,7 @@ public class StoryService {
     private final TagService tagService;
     private final SeriesService seriesService;
     private final HtmlSanitizationService sanitizationService;
-    private final TypesenseService typesenseService;
+    private final SearchServiceAdapter searchServiceAdapter;
     @Autowired
     public StoryService(StoryRepository storyRepository,
@@ -52,7 +52,7 @@ public class StoryService {
                         TagService tagService,
                         SeriesService seriesService,
                         HtmlSanitizationService sanitizationService,
-                        @Autowired(required = false) TypesenseService typesenseService) {
+                        SearchServiceAdapter searchServiceAdapter) {
         this.storyRepository = storyRepository;
         this.tagRepository = tagRepository;
         this.readingPositionRepository = readingPositionRepository;
@@ -60,7 +60,7 @@ public class StoryService {
         this.tagService = tagService;
         this.seriesService = seriesService;
         this.sanitizationService = sanitizationService;
-        this.typesenseService = typesenseService;
+        this.searchServiceAdapter = searchServiceAdapter;
     }
     @Transactional(readOnly = true)
@@ -239,10 +239,8 @@ public class StoryService {
         story.addTag(tag);
         Story savedStory = storyRepository.save(story);
-        // Update Typesense index with new tag information
-        if (typesenseService != null) {
-            typesenseService.updateStory(savedStory);
-        }
+        // Update search index with new tag information
+        searchServiceAdapter.updateStory(savedStory);
         return savedStory;
     }
@@ -256,10 +254,8 @@ public class StoryService {
         story.removeTag(tag);
         Story savedStory = storyRepository.save(story);
-        // Update Typesense index with updated tag information
-        if (typesenseService != null) {
-            typesenseService.updateStory(savedStory);
-        }
+        // Update search index with updated tag information
+        searchServiceAdapter.updateStory(savedStory);
         return savedStory;
     }
@@ -274,10 +270,8 @@ public class StoryService {
         story.setRating(rating);
         Story savedStory = storyRepository.save(story);
-        // Update Typesense index with new rating
-        if (typesenseService != null) {
-            typesenseService.updateStory(savedStory);
-        }
+        // Update search index with new rating
+        searchServiceAdapter.updateStory(savedStory);
         return savedStory;
     }
@@ -292,10 +286,8 @@ public class StoryService {
         story.updateReadingProgress(position);
         Story savedStory = storyRepository.save(story);
-        // Update Typesense index with new reading progress
-        if (typesenseService != null) {
-            typesenseService.updateStory(savedStory);
-        }
+        // Update search index with new reading progress
+        searchServiceAdapter.updateStory(savedStory);
         return savedStory;
     }
@@ -313,10 +305,8 @@ public class StoryService {
         Story savedStory = storyRepository.save(story);
-        // Update Typesense index with new reading status
-        if (typesenseService != null) {
-            typesenseService.updateStory(savedStory);
-        }
+        // Update search index with new reading status
+        searchServiceAdapter.updateStory(savedStory);
         return savedStory;
     }
@@ -358,10 +348,8 @@ public class StoryService {
             updateStoryTags(savedStory, story.getTags());
         }
-        // Index in Typesense (if available)
-        if (typesenseService != null) {
+        // Index in search engine
+        searchServiceAdapter.indexStory(savedStory);
typesenseService.indexStory(savedStory);
}
return savedStory; return savedStory;
} }
@@ -388,10 +376,8 @@ public class StoryService {
updateStoryTagsByNames(savedStory, tagNames); updateStoryTagsByNames(savedStory, tagNames);
} }
// Index in Typesense (if available) // Index in search engine
if (typesenseService != null) { searchServiceAdapter.indexStory(savedStory);
typesenseService.indexStory(savedStory);
}
return savedStory; return savedStory;
} }
@@ -409,10 +395,8 @@ public class StoryService {
updateStoryFields(existingStory, storyUpdates); updateStoryFields(existingStory, storyUpdates);
Story updatedStory = storyRepository.save(existingStory); Story updatedStory = storyRepository.save(existingStory);
// Update in Typesense (if available) // Update in search engine
if (typesenseService != null) { searchServiceAdapter.updateStory(updatedStory);
typesenseService.updateStory(updatedStory);
}
return updatedStory; return updatedStory;
} }
@@ -432,10 +416,8 @@ public class StoryService {
Story updatedStory = storyRepository.save(existingStory); Story updatedStory = storyRepository.save(existingStory);
// Update in Typesense (if available) // Update in search engine
if (typesenseService != null) { searchServiceAdapter.updateStory(updatedStory);
typesenseService.updateStory(updatedStory);
}
return updatedStory; return updatedStory;
} }
@@ -455,10 +437,8 @@ public class StoryService {
// Create a copy to avoid ConcurrentModificationException // Create a copy to avoid ConcurrentModificationException
new ArrayList<>(story.getTags()).forEach(tag -> story.removeTag(tag)); new ArrayList<>(story.getTags()).forEach(tag -> story.removeTag(tag));
// Delete from Typesense first (if available) // Delete from search engine first
if (typesenseService != null) { searchServiceAdapter.deleteStory(story.getId());
typesenseService.deleteStory(story.getId().toString());
}
storyRepository.delete(story); storyRepository.delete(story);
} }
@@ -674,7 +654,7 @@ public class StoryService {
/** /**
* Find a random story based on optional filters. * Find a random story based on optional filters.
* Uses Typesense for consistency with Library search functionality. * Uses search service for consistency with Library search functionality.
* Supports text search and multiple tags using the same logic as the Library view. * Supports text search and multiple tags using the same logic as the Library view.
* @param searchQuery Optional search query * @param searchQuery Optional search query
* @param tags Optional list of tags to filter by * @param tags Optional list of tags to filter by
@@ -693,7 +673,7 @@ public class StoryService {
/** /**
* Find a random story based on optional filters with seed support. * Find a random story based on optional filters with seed support.
* Uses Typesense for consistency with Library search functionality. * Uses search service for consistency with Library search functionality.
* Supports text search and multiple tags using the same logic as the Library view. * Supports text search and multiple tags using the same logic as the Library view.
* @param searchQuery Optional search query * @param searchQuery Optional search query
* @param tags Optional list of tags to filter by * @param tags Optional list of tags to filter by
@@ -711,21 +691,16 @@ public class StoryService {
String seriesFilter, Integer minTagCount, String seriesFilter, Integer minTagCount,
Boolean popularOnly, Boolean hiddenGemsOnly) { Boolean popularOnly, Boolean hiddenGemsOnly) {
// Use Typesense if available for consistency with Library search // Use search service for consistency with Library search
if (typesenseService != null) { try {
try { String randomStoryId = searchServiceAdapter.getRandomStoryId(seed);
Optional<UUID> randomStoryId = typesenseService.getRandomStoryId(searchQuery, tags, seed, if (randomStoryId != null) {
minWordCount, maxWordCount, createdAfter, createdBefore, lastReadAfter, lastReadBefore, return storyRepository.findById(UUID.fromString(randomStoryId));
minRating, maxRating, unratedOnly, readingStatus, hasReadingProgress, hasCoverImage,
sourceDomain, seriesFilter, minTagCount, popularOnly, hiddenGemsOnly);
if (randomStoryId.isPresent()) {
return storyRepository.findById(randomStoryId.get());
}
return Optional.empty();
} catch (Exception e) {
// Fallback to database queries if Typesense fails
logger.warn("Typesense random story lookup failed, falling back to database queries", e);
} }
return Optional.empty();
} catch (Exception e) {
// Fallback to database queries if search service fails
logger.warn("Search service random story lookup failed, falling back to database queries", e);
} }
// Fallback to repository-based implementation (global routing handles library selection) // Fallback to repository-based implementation (global routing handles library selection)

View File

@@ -19,6 +19,12 @@ spring:
max-file-size: 256MB # Increased for backup restore max-file-size: 256MB # Increased for backup restore
max-request-size: 260MB # Slightly higher to account for form data max-request-size: 260MB # Slightly higher to account for form data
jackson:
serialization:
write-dates-as-timestamps: false
deserialization:
adjust-dates-to-context-time-zone: false
server: server:
port: 8080 port: 8080
@@ -32,15 +38,71 @@ storycove:
expiration: 86400000 # 24 hours expiration: 86400000 # 24 hours
auth: auth:
password: ${APP_PASSWORD} # REQUIRED: No default password for security password: ${APP_PASSWORD} # REQUIRED: No default password for security
typesense: search:
api-key: ${TYPESENSE_API_KEY:xyz} engine: opensearch # OpenSearch is the only search engine
host: ${TYPESENSE_HOST:localhost} opensearch:
port: ${TYPESENSE_PORT:8108} # Connection settings
enabled: ${TYPESENSE_ENABLED:true} host: ${OPENSEARCH_HOST:localhost}
reindex-interval: ${TYPESENSE_REINDEX_INTERVAL:3600000} # 1 hour in milliseconds port: ${OPENSEARCH_PORT:9200}
scheme: ${OPENSEARCH_SCHEME:http}
username: ${OPENSEARCH_USERNAME:}
password: ${OPENSEARCH_PASSWORD:} # Empty when security is disabled
# Environment-specific configuration
profile: ${SPRING_PROFILES_ACTIVE:development} # development, staging, production
# Security settings
security:
ssl-verification: ${OPENSEARCH_SSL_VERIFICATION:false}
trust-all-certificates: ${OPENSEARCH_TRUST_ALL_CERTS:true}
keystore-path: ${OPENSEARCH_KEYSTORE_PATH:}
keystore-password: ${OPENSEARCH_KEYSTORE_PASSWORD:}
truststore-path: ${OPENSEARCH_TRUSTSTORE_PATH:}
truststore-password: ${OPENSEARCH_TRUSTSTORE_PASSWORD:}
# Connection pool settings
connection:
timeout: ${OPENSEARCH_CONNECTION_TIMEOUT:30000} # 30 seconds
socket-timeout: ${OPENSEARCH_SOCKET_TIMEOUT:60000} # 60 seconds
max-connections-per-route: ${OPENSEARCH_MAX_CONN_PER_ROUTE:10}
max-connections-total: ${OPENSEARCH_MAX_CONN_TOTAL:30}
retry-on-failure: ${OPENSEARCH_RETRY_ON_FAILURE:true}
max-retries: ${OPENSEARCH_MAX_RETRIES:3}
# Index settings
indices:
default-shards: ${OPENSEARCH_DEFAULT_SHARDS:1}
default-replicas: ${OPENSEARCH_DEFAULT_REPLICAS:0}
refresh-interval: ${OPENSEARCH_REFRESH_INTERVAL:1s}
# Bulk operations
bulk:
actions: ${OPENSEARCH_BULK_ACTIONS:1000}
size: ${OPENSEARCH_BULK_SIZE:5242880} # 5MB
timeout: ${OPENSEARCH_BULK_TIMEOUT:10000} # 10 seconds
concurrent-requests: ${OPENSEARCH_BULK_CONCURRENT:1}
# Health and monitoring
health:
check-interval: ${OPENSEARCH_HEALTH_CHECK_INTERVAL:30000} # 30 seconds
slow-query-threshold: ${OPENSEARCH_SLOW_QUERY_THRESHOLD:5000} # 5 seconds
enable-metrics: ${OPENSEARCH_ENABLE_METRICS:true}
images: images:
storage-path: ${IMAGE_STORAGE_PATH:/app/images} storage-path: ${IMAGE_STORAGE_PATH:/app/images}
management:
endpoints:
web:
exposure:
include: health,info,prometheus
endpoint:
health:
show-details: when-authorized
show-components: always
health:
opensearch:
enabled: ${OPENSEARCH_HEALTH_ENABLED:true}
logging: logging:
level: level:
com.storycove: ${LOG_LEVEL:INFO} # Use INFO for production, DEBUG for development com.storycove: ${LOG_LEVEL:INFO} # Use INFO for production, DEBUG for development
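The nested `storycove.opensearch.*` keys above are easiest to consume through a typed properties class, as the OpenSearch README in this commit suggests ("Type-Safe Properties"). A minimal sketch covering only the connection keys; the prefix matches the YAML, but the actual class in the codebase may be structured differently:

```java
import org.springframework.boot.context.properties.ConfigurationProperties;

// Illustrative binding for a subset of the storycove.opensearch.* keys above.
// Would need to be registered, e.g. via @EnableConfigurationProperties(OpenSearchProperties.class).
@ConfigurationProperties(prefix = "storycove.opensearch")
public class OpenSearchProperties {

    private String host = "localhost";
    private int port = 9200;
    private String scheme = "http";
    private String username = "";
    private String password = "";

    public String getHost() { return host; }
    public void setHost(String host) { this.host = host; }

    public int getPort() { return port; }
    public void setPort(int port) { this.port = port; }

    public String getScheme() { return scheme; }
    public void setScheme(String scheme) { this.scheme = scheme; }

    public String getUsername() { return username; }
    public void setUsername(String username) { this.username = username; }

    public String getPassword() { return password; }
    public void setPassword(String password) { this.password = password; }
}
```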

View File

@@ -0,0 +1,178 @@
# OpenSearch Configuration - Best Practices Implementation
## Overview
This directory contains a production-ready OpenSearch configuration following industry best practices for security, scalability, and maintainability.
## Architecture
### 📁 Directory Structure
```
opensearch/
├── config/
│ ├── opensearch-development.yml # Development-specific settings
│ └── opensearch-production.yml # Production-specific settings
├── mappings/
│ ├── stories-mapping.json # Story index mapping
│ ├── authors-mapping.json # Author index mapping
│ └── collections-mapping.json # Collection index mapping
├── templates/
│ ├── stories-template.json # Index template for stories_*
│ └── index-lifecycle-policy.json # ILM policy for index management
└── README.md # This file
```
## ✅ Best Practices Implemented
### 🔒 **Security**
- **Environment-Aware SSL Configuration**
- Production: Full certificate validation with custom truststore support
- Development: Optional certificate validation for local development
- **Proper Authentication**: Basic auth with secure credential management
- **Connection Security**: TLS 1.3 support with hostname verification
### 🏗️ **Configuration Management**
- **Externalized Configuration**: JSON/YAML files instead of hardcoded values
- **Environment-Specific Settings**: Different configs for dev/staging/prod
- **Type-Safe Properties**: Strongly-typed configuration classes
- **Validation**: Configuration validation at startup
### 📈 **Scalability & Performance**
- **Connection Pooling**: Configurable connection pool with timeout management
- **Environment-Aware Sharding**:
- Development: 1 shard, 0 replicas (single node)
- Production: 3 shards, 1 replica (high availability)
- **Bulk Operations**: Optimized bulk indexing with configurable batch sizes
- **Index Templates**: Automatic application of settings to new indexes
### 🔄 **Index Lifecycle Management**
- **Automated Index Rollover**: Based on size, document count, and age
- **Hot-Warm-Cold Architecture**: Optimized storage costs
- **Retention Policies**: Automatic cleanup of old data
- **Force Merge**: Optimization in warm phase
### 📊 **Monitoring & Observability**
- **Health Checks**: Automatic cluster health monitoring
- **Spring Boot Actuator**: Health endpoints for monitoring systems
- **Metrics Collection**: Configurable performance metrics
- **Slow Query Detection**: Configurable thresholds for query performance
### 🛡️ **Error Handling & Resilience**
- **Connection Retry Logic**: Automatic retry with backoff
- **Circuit Breaker Pattern**: Fail-fast for unhealthy clusters
- **Graceful Degradation**: Falls back cleanly when OpenSearch is unavailable
- **Detailed Error Logging**: Comprehensive error tracking
## 🚀 Usage
### Development Environment
```yaml
# application-development.yml
storycove:
opensearch:
profile: development
security:
ssl-verification: false
trust-all-certificates: true
indices:
default-shards: 1
default-replicas: 0
```
### Production Environment
```yaml
# application-production.yml
storycove:
opensearch:
profile: production
security:
ssl-verification: true
trust-all-certificates: false
truststore-path: /etc/ssl/opensearch-truststore.jks
indices:
default-shards: 3
default-replicas: 1
```
## 📋 Environment Variables
### Required
- `OPENSEARCH_PASSWORD`: Admin password for OpenSearch cluster
### Optional (with sensible defaults)
- `OPENSEARCH_HOST`: Cluster hostname (default: localhost)
- `OPENSEARCH_PORT`: Cluster port (default: 9200)
- `OPENSEARCH_USERNAME`: Admin username (default: admin)
- `OPENSEARCH_SSL_VERIFICATION`: Enable SSL verification (default: false for dev)
- `OPENSEARCH_MAX_CONN_TOTAL`: Max connections (default: 30 for dev, 200 for prod)
## 🎯 Index Templates
Index templates automatically apply configuration to new indexes:
```json
{
"index_patterns": ["stories_*"],
"template": {
"settings": {
"number_of_shards": "#{ENV_SPECIFIC}",
"analysis": {
"analyzer": {
"story_analyzer": {
"type": "standard",
"stopwords": "_english_"
}
}
}
}
}
}
```
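The development config sets `auto_create_templates: true`, so the application is expected to register templates itself. If a template ever needs to be stored manually, the composable index template API can be used; a sketch with the JDK HTTP client, where the template name, host, and file path are assumptions:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Path;

// One-off registration of the stories template; name, host and path are illustrative.
public class RegisterStoriesTemplate {

    public static void main(String[] args) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9200/_index_template/stories"))
                .header("Content-Type", "application/json")
                .PUT(HttpRequest.BodyPublishers.ofFile(
                        Path.of("opensearch/templates/stories-template.json")))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // A 200 response means the template is stored and will apply to any new stories_* index
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```

The lifecycle policy in `templates/index-lifecycle-policy.json` can be stored the same way against the ISM plugin endpoint `_plugins/_ism/policies/<policy-id>`.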
## 🔍 Health Monitoring
Access health information:
- **Application Health**: `/actuator/health`
- **OpenSearch Specific**: `/actuator/health/opensearch`
- **Detailed Metrics**: Available when `enable-metrics: true`
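The `/actuator/health/opensearch` component implies a dedicated health indicator. A sketch of what such an indicator might look like, reusing the adapter's connection test; the bean name and details are assumptions, and the real implementation may query the cluster directly instead:

```java
import org.springframework.boot.actuate.health.Health;
import org.springframework.boot.actuate.health.HealthIndicator;
import org.springframework.stereotype.Component;

// Hypothetical health indicator backing /actuator/health/opensearch.
@Component("opensearch")
public class OpenSearchHealthIndicator implements HealthIndicator {

    private final SearchServiceAdapter searchServiceAdapter;

    public OpenSearchHealthIndicator(SearchServiceAdapter searchServiceAdapter) {
        this.searchServiceAdapter = searchServiceAdapter;
    }

    @Override
    public Health health() {
        boolean up = searchServiceAdapter.isSearchServiceAvailable();
        return (up ? Health.up() : Health.down())
                .withDetail("engine", "opensearch")
                .build();
    }
}
```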
## 🔄 Deployment Strategy
Recommended deployment approach:
1. **Development**: Test OpenSearch configuration locally
2. **Staging**: Validate performance and accuracy in staging environment
3. **Production**: Deploy with proper monitoring and backup procedures
## 🛠️ Troubleshooting
### Common Issues
1. **SSL Certificate Errors**
- Development: Set `trust-all-certificates: true`
- Production: Provide valid truststore path
2. **Connection Timeouts**
- Increase `connection.timeout` values
- Check network connectivity and firewall rules
3. **Index Creation Failures**
- Verify cluster health with `/actuator/health/opensearch`
- Check OpenSearch logs for detailed error messages
4. **Performance Issues**
- Monitor slow queries with configurable thresholds
- Adjust bulk operation settings
- Review shard allocation and replica settings
## 🔮 Future Enhancements
- **Multi-Cluster Support**: Connect to multiple OpenSearch clusters
- **Advanced Security**: Integration with OpenSearch Security plugin
- **Custom Analyzers**: Domain-specific text analysis
- **Index Aliases**: Zero-downtime index updates
- **Machine Learning**: Integration with OpenSearch ML features
---
This configuration provides a foundation that scales from a single-node development setup to a replicated production cluster while keeping security, performance, and operational settings configurable per environment.

View File

@@ -0,0 +1,32 @@
# OpenSearch Development Configuration
opensearch:
cluster:
name: "storycove-dev"
initial_master_nodes: ["opensearch-node"]
# Development settings - single node, minimal resources
indices:
default_settings:
number_of_shards: 1
number_of_replicas: 0
refresh_interval: "1s"
# Security settings for development
security:
ssl_verification: false
trust_all_certificates: true
# Connection settings
connection:
timeout: "30s"
socket_timeout: "60s"
max_connections_per_route: 10
max_connections_total: 30
# Index management
index_management:
auto_create_templates: true
template_patterns:
stories: "stories_*"
authors: "authors_*"
collections: "collections_*"

View File

@@ -0,0 +1,60 @@
# OpenSearch Production Configuration
opensearch:
cluster:
name: "storycove-prod"
# Production settings - multi-shard, with replicas
indices:
default_settings:
number_of_shards: 3
number_of_replicas: 1
refresh_interval: "30s"
max_result_window: 50000
# Index lifecycle policies
lifecycle:
hot_phase_duration: "7d"
warm_phase_duration: "30d"
cold_phase_duration: "90d"
delete_after: "1y"
# Security settings for production
security:
ssl_verification: true
trust_all_certificates: false
certificate_verification: true
tls_version: "TLSv1.3"
# Connection settings
connection:
timeout: "10s"
socket_timeout: "30s"
max_connections_per_route: 50
max_connections_total: 200
retry_on_failure: true
max_retries: 3
retry_delay: "1s"
# Performance tuning
performance:
bulk_actions: 1000
bulk_size: "5MB"
bulk_timeout: "10s"
concurrent_requests: 4
# Monitoring and observability
monitoring:
health_check_interval: "30s"
slow_query_threshold: "5s"
enable_metrics: true
# Index management
index_management:
auto_create_templates: true
template_patterns:
stories: "stories_*"
authors: "authors_*"
collections: "collections_*"
retention_policy:
enabled: true
default_retention: "1y"

View File

@@ -0,0 +1,79 @@
{
"settings": {
"number_of_shards": 1,
"number_of_replicas": 0,
"analysis": {
"analyzer": {
"name_analyzer": {
"type": "standard",
"stopwords": "_english_"
},
"autocomplete_analyzer": {
"type": "custom",
"tokenizer": "standard",
"filter": ["lowercase", "edge_ngram"]
}
},
"filter": {
"edge_ngram": {
"type": "edge_ngram",
"min_gram": 2,
"max_gram": 20
}
}
}
},
"mappings": {
"properties": {
"id": {
"type": "keyword"
},
"name": {
"type": "text",
"analyzer": "name_analyzer",
"fields": {
"autocomplete": {
"type": "text",
"analyzer": "autocomplete_analyzer"
},
"keyword": {
"type": "keyword"
}
}
},
"bio": {
"type": "text",
"analyzer": "name_analyzer"
},
"urls": {
"type": "keyword"
},
"imageUrl": {
"type": "keyword"
},
"storyCount": {
"type": "integer"
},
"averageRating": {
"type": "float"
},
"totalWordCount": {
"type": "long"
},
"totalReadingTime": {
"type": "integer"
},
"createdAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"updatedAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"libraryId": {
"type": "keyword"
}
}
}
}

View File

@@ -0,0 +1,73 @@
{
"settings": {
"number_of_shards": 1,
"number_of_replicas": 0,
"analysis": {
"analyzer": {
"collection_analyzer": {
"type": "standard",
"stopwords": "_english_"
},
"autocomplete_analyzer": {
"type": "custom",
"tokenizer": "standard",
"filter": ["lowercase", "edge_ngram"]
}
},
"filter": {
"edge_ngram": {
"type": "edge_ngram",
"min_gram": 2,
"max_gram": 20
}
}
}
},
"mappings": {
"properties": {
"id": {
"type": "keyword"
},
"name": {
"type": "text",
"analyzer": "collection_analyzer",
"fields": {
"autocomplete": {
"type": "text",
"analyzer": "autocomplete_analyzer"
},
"keyword": {
"type": "keyword"
}
}
},
"description": {
"type": "text",
"analyzer": "collection_analyzer"
},
"storyCount": {
"type": "integer"
},
"totalWordCount": {
"type": "long"
},
"averageRating": {
"type": "float"
},
"isPublic": {
"type": "boolean"
},
"createdAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"updatedAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"libraryId": {
"type": "keyword"
}
}
}
}

View File

@@ -0,0 +1,120 @@
{
"settings": {
"number_of_shards": 1,
"number_of_replicas": 0,
"analysis": {
"analyzer": {
"story_analyzer": {
"type": "standard",
"stopwords": "_english_"
},
"autocomplete_analyzer": {
"type": "custom",
"tokenizer": "standard",
"filter": ["lowercase", "edge_ngram"]
}
},
"filter": {
"edge_ngram": {
"type": "edge_ngram",
"min_gram": 2,
"max_gram": 20
}
}
}
},
"mappings": {
"properties": {
"id": {
"type": "keyword"
},
"title": {
"type": "text",
"analyzer": "story_analyzer",
"fields": {
"autocomplete": {
"type": "text",
"analyzer": "autocomplete_analyzer"
},
"keyword": {
"type": "keyword"
}
}
},
"content": {
"type": "text",
"analyzer": "story_analyzer"
},
"summary": {
"type": "text",
"analyzer": "story_analyzer"
},
"authorNames": {
"type": "text",
"analyzer": "story_analyzer",
"fields": {
"keyword": {
"type": "keyword"
}
}
},
"authorIds": {
"type": "keyword"
},
"tagNames": {
"type": "keyword"
},
"seriesTitle": {
"type": "text",
"analyzer": "story_analyzer",
"fields": {
"keyword": {
"type": "keyword"
}
}
},
"seriesId": {
"type": "keyword"
},
"wordCount": {
"type": "integer"
},
"rating": {
"type": "float"
},
"readingTime": {
"type": "integer"
},
"language": {
"type": "keyword"
},
"status": {
"type": "keyword"
},
"createdAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"updatedAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"publishedAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"isRead": {
"type": "boolean"
},
"isFavorite": {
"type": "boolean"
},
"readingProgress": {
"type": "float"
},
"libraryId": {
"type": "keyword"
}
}
}
}
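The `autocomplete` subfields in this mapping exist so typeahead can match partial words through the edge n-gram analyzer. A sketch of the kind of request that exercises it; the index name is illustrative, since the real indices follow the `stories_*` per-library pattern:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Illustrative typeahead query against the title.autocomplete subfield; index name is assumed.
public class TitleAutocompleteExample {

    public static void main(String[] args) throws Exception {
        String body = """
                {
                  "size": 5,
                  "_source": ["id", "title"],
                  "query": {
                    "match": { "title.autocomplete": "adve" }
                  }
                }
                """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9200/stories_default/_search"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        // Matches titles whose indexed edge n-grams start with "adve", e.g. "Adventure ..."
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```

Note that the mapping does not set a separate `search_analyzer`, so the query text is edge n-grammed as well, which makes matches broader than strict prefix matching.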

View File

@@ -0,0 +1,77 @@
{
"policy": {
"description": "StoryCove index lifecycle policy",
"default_state": "hot",
"states": [
{
"name": "hot",
"actions": [
{
"rollover": {
"min_size": "50gb",
"min_doc_count": 1000000,
"min_age": "7d"
}
}
],
"transitions": [
{
"state_name": "warm",
"conditions": {
"min_age": "7d"
}
}
]
},
{
"name": "warm",
"actions": [
{
"replica_count": {
"number_of_replicas": 0
}
},
{
"force_merge": {
"max_num_segments": 1
}
}
],
"transitions": [
{
"state_name": "cold",
"conditions": {
"min_age": "30d"
}
}
]
},
{
"name": "cold",
"actions": [],
"transitions": [
{
"state_name": "delete",
"conditions": {
"min_age": "365d"
}
}
]
},
{
"name": "delete",
"actions": [
{
"delete": {}
}
]
}
],
"ism_template": [
{
"index_patterns": ["stories_*", "authors_*", "collections_*"],
"priority": 100
}
]
}
}

View File

@@ -0,0 +1,124 @@
{
"index_patterns": ["stories_*"],
"priority": 1,
"template": {
"settings": {
"number_of_shards": 1,
"number_of_replicas": 0,
"analysis": {
"analyzer": {
"story_analyzer": {
"type": "standard",
"stopwords": "_english_"
},
"autocomplete_analyzer": {
"type": "custom",
"tokenizer": "standard",
"filter": ["lowercase", "edge_ngram"]
}
},
"filter": {
"edge_ngram": {
"type": "edge_ngram",
"min_gram": 2,
"max_gram": 20
}
}
}
},
"mappings": {
"properties": {
"id": {
"type": "keyword"
},
"title": {
"type": "text",
"analyzer": "story_analyzer",
"fields": {
"autocomplete": {
"type": "text",
"analyzer": "autocomplete_analyzer"
},
"keyword": {
"type": "keyword"
}
}
},
"content": {
"type": "text",
"analyzer": "story_analyzer"
},
"summary": {
"type": "text",
"analyzer": "story_analyzer"
},
"authorNames": {
"type": "text",
"analyzer": "story_analyzer",
"fields": {
"keyword": {
"type": "keyword"
}
}
},
"authorIds": {
"type": "keyword"
},
"tagNames": {
"type": "keyword"
},
"seriesTitle": {
"type": "text",
"analyzer": "story_analyzer",
"fields": {
"keyword": {
"type": "keyword"
}
}
},
"seriesId": {
"type": "keyword"
},
"wordCount": {
"type": "integer"
},
"rating": {
"type": "float"
},
"readingTime": {
"type": "integer"
},
"language": {
"type": "keyword"
},
"status": {
"type": "keyword"
},
"createdAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"updatedAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"publishedAt": {
"type": "date",
"format": "strict_date_optional_time||epoch_millis"
},
"isRead": {
"type": "boolean"
},
"isFavorite": {
"type": "boolean"
},
"readingProgress": {
"type": "float"
},
"libraryId": {
"type": "keyword"
}
}
}
}
}

View File

@@ -1,12 +1,8 @@
package com.storycove.config; package com.storycove.config;
import com.storycove.service.TypesenseService;
import org.springframework.boot.test.context.TestConfiguration; import org.springframework.boot.test.context.TestConfiguration;
import org.springframework.boot.test.mock.mockito.MockBean;
@TestConfiguration @TestConfiguration
public class TestConfig { public class TestConfig {
// Test configuration
@MockBean
public TypesenseService typesenseService;
} }

View File

@@ -44,8 +44,9 @@ class AuthorServiceTest {
testAuthor.setId(testId); testAuthor.setId(testId);
testAuthor.setNotes("Test notes"); testAuthor.setNotes("Test notes");
// Initialize service with null TypesenseService (which is allowed for tests) // Initialize service with mock SearchServiceAdapter
authorService = new AuthorService(authorRepository, null); SearchServiceAdapter mockSearchServiceAdapter = mock(SearchServiceAdapter.class);
authorService = new AuthorService(authorRepository, mockSearchServiceAdapter);
} }
@Test @Test

View File

@@ -33,6 +33,9 @@ class StoryServiceTest {
@Mock @Mock
private ReadingPositionRepository readingPositionRepository; private ReadingPositionRepository readingPositionRepository;
@Mock
private SearchServiceAdapter searchServiceAdapter;
private StoryService storyService; private StoryService storyService;
private Story testStory; private Story testStory;
private UUID testId; private UUID testId;
@@ -44,16 +47,16 @@ class StoryServiceTest {
testStory.setId(testId); testStory.setId(testId);
testStory.setContentHtml("<p>Test content for reading progress tracking</p>"); testStory.setContentHtml("<p>Test content for reading progress tracking</p>");
// Create StoryService with only required repositories, all services can be null for these tests // Create StoryService with mocked dependencies
storyService = new StoryService( storyService = new StoryService(
storyRepository, storyRepository,
tagRepository, tagRepository,
readingPositionRepository, // added for foreign key constraint handling readingPositionRepository,
null, // authorService - not needed for reading progress tests null, // authorService - not needed for reading progress tests
null, // tagService - not needed for reading progress tests null, // tagService - not needed for reading progress tests
null, // seriesService - not needed for reading progress tests null, // seriesService - not needed for reading progress tests
null, // sanitizationService - not needed for reading progress tests null, // sanitizationService - not needed for reading progress tests
null // typesenseService - will test both with and without searchServiceAdapter
); );
} }

View File

@@ -18,11 +18,12 @@ storycove:
expiration: 86400000 expiration: 86400000
auth: auth:
password: test-password password: test-password
typesense: search:
enabled: false engine: opensearch
api-key: test-key opensearch:
host: localhost host: localhost
port: 8108 port: 9200
scheme: http
images: images:
storage-path: /tmp/test-images storage-path: /tmp/test-images

backend/test_results.log (new file, 4308 lines)

File diff suppressed because it is too large

View File

@@ -2,3 +2,4 @@
# https://curl.se/docs/http-cookies.html # https://curl.se/docs/http-cookies.html
# This file was generated by libcurl! Edit at your own risk. # This file was generated by libcurl! Edit at your own risk.
#HttpOnly_localhost FALSE / FALSE 1758433252 token eyJhbGciOiJIUzUxMiJ9.eyJzdWIiOiJ1c2VyIiwiaWF0IjoxNzU4MzQ2ODUyLCJleHAiOjE3NTg0MzMyNTIsImxpYnJhcnlJZCI6InNlY3JldCJ9.zEAQT5_11-pxPxmIhufSQqE26hvHldde4kFNE2HWWgBa5lT_Wt7jwpoPUMkQGQfShQwDZ9N-hFX3R2ew8jD7WQ

View File

@@ -34,9 +34,10 @@ services:
- SPRING_DATASOURCE_USERNAME=storycove - SPRING_DATASOURCE_USERNAME=storycove
- SPRING_DATASOURCE_PASSWORD=${DB_PASSWORD} - SPRING_DATASOURCE_PASSWORD=${DB_PASSWORD}
- JWT_SECRET=${JWT_SECRET} - JWT_SECRET=${JWT_SECRET}
- TYPESENSE_API_KEY=${TYPESENSE_API_KEY} - OPENSEARCH_HOST=opensearch
- TYPESENSE_HOST=typesense - OPENSEARCH_PORT=9200
- TYPESENSE_PORT=8108 - OPENSEARCH_SCHEME=http
- SEARCH_ENGINE=${SEARCH_ENGINE:-opensearch}
- IMAGE_STORAGE_PATH=/app/images - IMAGE_STORAGE_PATH=/app/images
- APP_PASSWORD=${APP_PASSWORD} - APP_PASSWORD=${APP_PASSWORD}
- STORYCOVE_CORS_ALLOWED_ORIGINS=${STORYCOVE_CORS_ALLOWED_ORIGINS:-http://localhost:3000,http://localhost:6925} - STORYCOVE_CORS_ALLOWED_ORIGINS=${STORYCOVE_CORS_ALLOWED_ORIGINS:-http://localhost:3000,http://localhost:6925}
@@ -45,7 +46,7 @@ services:
- library_config:/app/config - library_config:/app/config
depends_on: depends_on:
- postgres - postgres
- typesense - opensearch
networks: networks:
- storycove-network - storycove-network
@@ -63,20 +64,46 @@ services:
networks: networks:
- storycove-network - storycove-network
typesense:
image: typesense/typesense:29.0 opensearch:
image: opensearchproject/opensearch:3.2.0
# No port mapping - only accessible within the Docker network # No port mapping - only accessible within the Docker network
environment: environment:
- TYPESENSE_API_KEY=${TYPESENSE_API_KEY} - cluster.name=storycove-opensearch
- TYPESENSE_DATA_DIR=/data - node.name=opensearch-node
- discovery.type=single-node
- bootstrap.memory_lock=false
- "OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx512m"
- "DISABLE_INSTALL_DEMO_CONFIG=true"
- "DISABLE_SECURITY_PLUGIN=true"
ulimits:
memlock:
soft: -1
hard: -1
nofile:
soft: 65536
hard: 65536
volumes: volumes:
- typesense_data:/data - opensearch_data:/usr/share/opensearch/data
networks:
- storycove-network
restart: unless-stopped
opensearch-dashboards:
image: opensearchproject/opensearch-dashboards:3.2.0
ports:
- "5601:5601" # Expose OpenSearch Dashboard
environment:
- OPENSEARCH_HOSTS=http://opensearch:9200
- "DISABLE_SECURITY_DASHBOARDS_PLUGIN=true"
depends_on:
- opensearch
networks: networks:
- storycove-network - storycove-network
volumes: volumes:
postgres_data: postgres_data:
typesense_data: opensearch_data:
images_data: images_data:
library_config: library_config:
@@ -122,13 +149,5 @@ configs:
expires 1y; expires 1y;
add_header Cache-Control public; add_header Cache-Control public;
} }
location /typesense/ {
proxy_pass http://typesense:8108/;
proxy_set_header Host $$host;
proxy_set_header X-Real-IP $$remote_addr;
proxy_set_header X-Forwarded-For $$proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $$scheme;
proxy_set_header X-Typesense-API-Key $$http_x_typesense_api_key;
}
} }
} }
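The backend's `depends_on` only waits for the opensearch container to start, not for the cluster to answer requests. A healthcheck along these lines could tighten that up; it is not part of this commit and assumes `curl` is available in the OpenSearch image:

```yaml
  opensearch:
    # Illustrative healthcheck, not part of this change
    healthcheck:
      test: ["CMD-SHELL", "curl -sf http://localhost:9200/_cluster/health || exit 1"]
      interval: 30s
      timeout: 10s
      retries: 5
      start_period: 30s
```

The backend service could then declare `depends_on: { opensearch: { condition: service_healthy } }` so it only starts once the node actually responds.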

View File

@@ -22,7 +22,7 @@ export default function AuthorsPage() {
const [currentPage, setCurrentPage] = useState(0); const [currentPage, setCurrentPage] = useState(0);
const [totalHits, setTotalHits] = useState(0); const [totalHits, setTotalHits] = useState(0);
const [hasMore, setHasMore] = useState(false); const [hasMore, setHasMore] = useState(false);
const ITEMS_PER_PAGE = 50; // Safe limit under Typesense's 250 limit const ITEMS_PER_PAGE = 50;
useEffect(() => { useEffect(() => {
const debounceTimer = setTimeout(() => { const debounceTimer = setTimeout(() => {
@@ -35,41 +35,30 @@ export default function AuthorsPage() {
} else { } else {
setSearchLoading(true); setSearchLoading(true);
} }
const searchResults = await authorApi.searchAuthorsTypesense({ const searchResults = await authorApi.getAuthors({
q: searchQuery || '*', page: currentPage,
page: currentPage,
size: ITEMS_PER_PAGE, size: ITEMS_PER_PAGE,
sortBy: sortBy, sortBy: sortBy,
sortOrder: sortOrder sortDir: sortOrder
}); });
if (currentPage === 0) { if (currentPage === 0) {
// First page - replace all results // First page - replace all results
setAuthors(searchResults.results || []); setAuthors(searchResults.content || []);
setFilteredAuthors(searchResults.results || []); setFilteredAuthors(searchResults.content || []);
} else { } else {
// Subsequent pages - append results // Subsequent pages - append results
setAuthors(prev => [...prev, ...(searchResults.results || [])]); setAuthors(prev => [...prev, ...(searchResults.content || [])]);
setFilteredAuthors(prev => [...prev, ...(searchResults.results || [])]); setFilteredAuthors(prev => [...prev, ...(searchResults.content || [])]);
} }
setTotalHits(searchResults.totalHits); setTotalHits(searchResults.totalElements || 0);
setHasMore(searchResults.results.length === ITEMS_PER_PAGE && (currentPage + 1) * ITEMS_PER_PAGE < searchResults.totalHits); setHasMore(searchResults.content.length === ITEMS_PER_PAGE && (currentPage + 1) * ITEMS_PER_PAGE < (searchResults.totalElements || 0));
} catch (error) { } catch (error) {
console.error('Failed to load authors:', error); console.error('Failed to load authors:', error);
// Fallback to regular API if Typesense fails (only for first page) // Error handling for API failures
if (currentPage === 0) { console.error('Failed to load authors:', error);
try {
const authorsResult = await authorApi.getAuthors({ page: 0, size: ITEMS_PER_PAGE });
setAuthors(authorsResult.content || []);
setFilteredAuthors(authorsResult.content || []);
setTotalHits(authorsResult.totalElements || 0);
setHasMore(authorsResult.content.length === ITEMS_PER_PAGE);
} catch (fallbackError) {
console.error('Fallback also failed:', fallbackError);
}
}
} finally { } finally {
setLoading(false); setLoading(false);
setSearchLoading(false); setSearchLoading(false);
@@ -95,7 +84,17 @@ export default function AuthorsPage() {
} }
}; };
// Client-side filtering no longer needed since we use Typesense // Client-side filtering for search query when using regular API
useEffect(() => {
if (searchQuery) {
const filtered = authors.filter(author =>
author.name.toLowerCase().includes(searchQuery.toLowerCase())
);
setFilteredAuthors(filtered);
} else {
setFilteredAuthors(authors);
}
}, [authors, searchQuery]);
// Note: We no longer have individual story ratings in the author list // Note: We no longer have individual story ratings in the author list
// Average rating would need to be calculated on backend if needed // Average rating would need to be calculated on backend if needed
@@ -118,9 +117,9 @@ export default function AuthorsPage() {
<div> <div>
<h1 className="text-3xl font-bold theme-header">Authors</h1> <h1 className="text-3xl font-bold theme-header">Authors</h1>
<p className="theme-text mt-1"> <p className="theme-text mt-1">
{filteredAuthors.length} of {totalHits} {totalHits === 1 ? 'author' : 'authors'} {searchQuery ? `${filteredAuthors.length} of ${authors.length}` : filteredAuthors.length} {(searchQuery ? authors.length : filteredAuthors.length) === 1 ? 'author' : 'authors'}
{searchQuery ? ` found` : ` in your library`} {searchQuery ? ` found` : ` in your library`}
{hasMore && ` (showing first ${filteredAuthors.length})`} {!searchQuery && hasMore && ` (showing first ${filteredAuthors.length})`}
</p> </p>
</div> </div>
@@ -218,7 +217,7 @@ export default function AuthorsPage() {
)} )}
{/* Load More Button */} {/* Load More Button */}
{hasMore && ( {hasMore && !searchQuery && (
<div className="flex justify-center pt-8"> <div className="flex justify-center pt-8">
<Button <Button
onClick={loadMore} onClick={loadMore}
@@ -227,7 +226,7 @@ export default function AuthorsPage() {
className="px-8 py-3" className="px-8 py-3"
loading={loading} loading={loading}
> >
{loading ? 'Loading...' : `Load More Authors (${totalHits - filteredAuthors.length} remaining)`} {loading ? 'Loading...' : `Load More Authors (${totalHits - authors.length} remaining)`}
</Button> </Button>
</div> </div>
)} )}

View File

@@ -501,11 +501,11 @@ async function processIndividualMode(
console.log(`Bulk import completed: ${importedCount} imported, ${skippedCount} skipped, ${errorCount} errors`); console.log(`Bulk import completed: ${importedCount} imported, ${skippedCount} skipped, ${errorCount} errors`);
// Trigger Typesense reindex if any stories were imported // Trigger OpenSearch reindex if any stories were imported
if (importedCount > 0) { if (importedCount > 0) {
try { try {
console.log('Triggering Typesense reindex after bulk import...'); console.log('Triggering OpenSearch reindex after bulk import...');
const reindexUrl = `http://backend:8080/api/stories/reindex-typesense`; const reindexUrl = `http://backend:8080/api/admin/search/opensearch/reindex`;
const reindexResponse = await fetch(reindexUrl, { const reindexResponse = await fetch(reindexUrl, {
method: 'POST', method: 'POST',
headers: { headers: {
@@ -513,15 +513,15 @@ async function processIndividualMode(
'Content-Type': 'application/json', 'Content-Type': 'application/json',
}, },
}); });
if (reindexResponse.ok) { if (reindexResponse.ok) {
const reindexResult = await reindexResponse.json(); const reindexResult = await reindexResponse.json();
console.log('Typesense reindex completed:', reindexResult); console.log('OpenSearch reindex completed:', reindexResult);
} else { } else {
console.warn('Typesense reindex failed:', reindexResponse.status); console.warn('OpenSearch reindex failed:', reindexResponse.status);
} }
} catch (error) { } catch (error) {
console.warn('Failed to trigger Typesense reindex:', error); console.warn('Failed to trigger OpenSearch reindex:', error);
// Don't fail the whole request if reindex fails // Don't fail the whole request if reindex fails
} }
} }

View File

@@ -49,7 +49,7 @@ export default function StoryReadingPage() {
)); ));
// Convert to character position in the plain text content // Convert to character position in the plain text content
const textLength = story.contentPlain?.length || story.contentHtml.length; const textLength = story.contentPlain?.length || story.contentHtml?.length || 0;
return Math.floor(scrollRatio * textLength); return Math.floor(scrollRatio * textLength);
}, [story]); }, [story]);
@@ -57,7 +57,7 @@ export default function StoryReadingPage() {
const calculateReadingPercentage = useCallback((currentPosition: number): number => { const calculateReadingPercentage = useCallback((currentPosition: number): number => {
if (!story) return 0; if (!story) return 0;
const totalLength = story.contentPlain?.length || story.contentHtml.length; const totalLength = story.contentPlain?.length || story.contentHtml?.length || 0;
if (totalLength === 0) return 0; if (totalLength === 0) return 0;
return Math.round((currentPosition / totalLength) * 100); return Math.round((currentPosition / totalLength) * 100);
@@ -67,7 +67,7 @@ export default function StoryReadingPage() {
const scrollToCharacterPosition = useCallback((position: number) => { const scrollToCharacterPosition = useCallback((position: number) => {
if (!contentRef.current || !story || hasScrolledToPosition) return; if (!contentRef.current || !story || hasScrolledToPosition) return;
const textLength = story.contentPlain?.length || story.contentHtml.length; const textLength = story.contentPlain?.length || story.contentHtml?.length || 0;
if (textLength === 0 || position === 0) return; if (textLength === 0 || position === 0) return;
const ratio = position / textLength; const ratio = position / textLength;

View File

@@ -40,7 +40,7 @@ export default function CollectionReadingView({
)); ));
// Convert to character position in the plain text content // Convert to character position in the plain text content
const textLength = story.contentPlain?.length || story.contentHtml.length; const textLength = story.contentPlain?.length || story.contentHtml?.length || 0;
return Math.floor(scrollRatio * textLength); return Math.floor(scrollRatio * textLength);
}, [story]); }, [story]);
@@ -48,7 +48,7 @@ export default function CollectionReadingView({
const calculateReadingPercentage = useCallback((currentPosition: number): number => { const calculateReadingPercentage = useCallback((currentPosition: number): number => {
if (!story) return 0; if (!story) return 0;
const totalLength = story.contentPlain?.length || story.contentHtml.length; const totalLength = story.contentPlain?.length || story.contentHtml?.length || 0;
if (totalLength === 0) return 0; if (totalLength === 0) return 0;
return Math.round((currentPosition / totalLength) * 100); return Math.round((currentPosition / totalLength) * 100);
@@ -58,7 +58,7 @@ export default function CollectionReadingView({
const scrollToCharacterPosition = useCallback((position: number) => { const scrollToCharacterPosition = useCallback((position: number) => {
if (!contentRef.current || !story || hasScrolledToPosition) return; if (!contentRef.current || !story || hasScrolledToPosition) return;
const textLength = story.contentPlain?.length || story.contentHtml.length; const textLength = story.contentPlain?.length || story.contentHtml?.length || 0;
if (textLength === 0 || position === 0) return; if (textLength === 0 || position === 0) return;
const ratio = position / textLength; const ratio = position / textLength;

View File

@@ -127,29 +127,6 @@ const FILTER_PRESETS: FilterPreset[] = [
description: 'Stories that are part of a series', description: 'Stories that are part of a series',
filters: { seriesFilter: 'series' }, filters: { seriesFilter: 'series' },
category: 'content' category: 'content'
},
// Organization presets
{
id: 'well-tagged',
label: '3+ tags',
description: 'Well-tagged stories with 3 or more tags',
filters: { minTagCount: 3 },
category: 'organization'
},
{
id: 'popular',
label: 'Popular',
description: 'Stories with above-average ratings',
filters: { popularOnly: true },
category: 'organization'
},
{
id: 'hidden-gems',
label: 'Hidden Gems',
description: 'Underrated or unrated stories to discover',
filters: { hiddenGemsOnly: true },
category: 'organization'
} }
]; ];

View File

@@ -1,21 +1,35 @@
'use client'; 'use client';
import { useState } from 'react'; import React, { useState, useEffect } from 'react';
import Button from '../ui/Button'; import Button from '../ui/Button';
import { storyApi, authorApi, databaseApi, configApi } from '../../lib/api'; import { databaseApi, configApi, searchAdminApi } from '../../lib/api';
interface SystemSettingsProps { interface SystemSettingsProps {
// No props needed - this component manages its own state // No props needed - this component manages its own state
} }
export default function SystemSettings({}: SystemSettingsProps) { export default function SystemSettings({}: SystemSettingsProps) {
const [typesenseStatus, setTypesenseStatus] = useState<{ const [searchEngineStatus, setSearchEngineStatus] = useState<{
currentEngine: string;
openSearchAvailable: boolean;
loading: boolean;
message: string;
success?: boolean;
}>({
currentEngine: 'opensearch',
openSearchAvailable: false,
loading: false,
message: ''
});
const [openSearchStatus, setOpenSearchStatus] = useState<{
reindex: { loading: boolean; message: string; success?: boolean }; reindex: { loading: boolean; message: string; success?: boolean };
recreate: { loading: boolean; message: string; success?: boolean }; recreate: { loading: boolean; message: string; success?: boolean };
}>({ }>({
reindex: { loading: false, message: '' }, reindex: { loading: false, message: '' },
recreate: { loading: false, message: '' } recreate: { loading: false, message: '' }
}); });
const [databaseStatus, setDatabaseStatus] = useState<{ const [databaseStatus, setDatabaseStatus] = useState<{
completeBackup: { loading: boolean; message: string; success?: boolean }; completeBackup: { loading: boolean; message: string; success?: boolean };
completeRestore: { loading: boolean; message: string; success?: boolean }; completeRestore: { loading: boolean; message: string; success?: boolean };
@@ -33,135 +47,7 @@ export default function SystemSettings({}: SystemSettingsProps) {
execute: { loading: false, message: '' } execute: { loading: false, message: '' }
}); });
const handleFullReindex = async () => {
setTypesenseStatus(prev => ({
...prev,
reindex: { loading: true, message: 'Reindexing all collections...', success: undefined }
}));
try {
// Run both story and author reindex in parallel
const [storiesResult, authorsResult] = await Promise.all([
storyApi.reindexTypesense(),
authorApi.reindexTypesense()
]);
const allSuccessful = storiesResult.success && authorsResult.success;
const messages: string[] = [];
if (storiesResult.success) {
messages.push(`Stories: ${storiesResult.message}`);
} else {
messages.push(`Stories failed: ${storiesResult.error || 'Unknown error'}`);
}
if (authorsResult.success) {
messages.push(`Authors: ${authorsResult.message}`);
} else {
messages.push(`Authors failed: ${authorsResult.error || 'Unknown error'}`);
}
setTypesenseStatus(prev => ({
...prev,
reindex: {
loading: false,
message: allSuccessful
? `Full reindex completed successfully. ${messages.join(', ')}`
: `Reindex completed with errors. ${messages.join(', ')}`,
success: allSuccessful
}
}));
// Clear message after 8 seconds (longer for combined operation)
setTimeout(() => {
setTypesenseStatus(prev => ({
...prev,
reindex: { loading: false, message: '', success: undefined }
}));
}, 8000);
} catch (error) {
setTypesenseStatus(prev => ({
...prev,
reindex: {
loading: false,
message: 'Network error occurred during reindex',
success: false
}
}));
setTimeout(() => {
setTypesenseStatus(prev => ({
...prev,
reindex: { loading: false, message: '', success: undefined }
}));
}, 8000);
}
};
const handleRecreateAllCollections = async () => {
setTypesenseStatus(prev => ({
...prev,
recreate: { loading: true, message: 'Recreating all collections...', success: undefined }
}));
try {
// Run both story and author recreation in parallel
const [storiesResult, authorsResult] = await Promise.all([
storyApi.recreateTypesenseCollection(),
authorApi.recreateTypesenseCollection()
]);
const allSuccessful = storiesResult.success && authorsResult.success;
const messages: string[] = [];
if (storiesResult.success) {
messages.push(`Stories: ${storiesResult.message}`);
} else {
messages.push(`Stories failed: ${storiesResult.error || 'Unknown error'}`);
}
if (authorsResult.success) {
messages.push(`Authors: ${authorsResult.message}`);
} else {
messages.push(`Authors failed: ${authorsResult.error || 'Unknown error'}`);
}
setTypesenseStatus(prev => ({
...prev,
recreate: {
loading: false,
message: allSuccessful
? `All collections recreated successfully. ${messages.join(', ')}`
: `Recreation completed with errors. ${messages.join(', ')}`,
success: allSuccessful
}
}));
// Clear message after 8 seconds (longer for combined operation)
setTimeout(() => {
setTypesenseStatus(prev => ({
...prev,
recreate: { loading: false, message: '', success: undefined }
}));
}, 8000);
} catch (error) {
setTypesenseStatus(prev => ({
...prev,
recreate: {
loading: false,
message: 'Network error occurred during recreation',
success: false
}
}));
setTimeout(() => {
setTypesenseStatus(prev => ({
...prev,
recreate: { loading: false, message: '', success: undefined }
}));
}, 8000);
}
};
const handleCompleteBackup = async () => { const handleCompleteBackup = async () => {
setDatabaseStatus(prev => ({ setDatabaseStatus(prev => ({
@@ -419,62 +305,182 @@ export default function SystemSettings({}: SystemSettingsProps) {
}, 10000); }, 10000);
}; };
// Search Engine Management Functions
const loadSearchEngineStatus = async () => {
try {
const status = await searchAdminApi.getStatus();
setSearchEngineStatus(prev => ({
...prev,
currentEngine: status.primaryEngine,
openSearchAvailable: status.openSearchAvailable,
}));
} catch (error: any) {
console.error('Failed to load search engine status:', error);
}
};
const handleOpenSearchReindex = async () => {
setOpenSearchStatus(prev => ({
...prev,
reindex: { loading: true, message: 'Reindexing OpenSearch...', success: undefined }
}));
try {
const result = await searchAdminApi.reindexOpenSearch();
setOpenSearchStatus(prev => ({
...prev,
reindex: {
loading: false,
message: result.success ? result.message : (result.error || 'Reindex failed'),
success: result.success
}
}));
setTimeout(() => {
setOpenSearchStatus(prev => ({
...prev,
reindex: { loading: false, message: '', success: undefined }
}));
}, 8000);
} catch (error: any) {
setOpenSearchStatus(prev => ({
...prev,
reindex: {
loading: false,
message: error.message || 'Network error occurred',
success: false
}
}));
setTimeout(() => {
setOpenSearchStatus(prev => ({
...prev,
reindex: { loading: false, message: '', success: undefined }
}));
}, 8000);
}
};
const handleOpenSearchRecreate = async () => {
setOpenSearchStatus(prev => ({
...prev,
recreate: { loading: true, message: 'Recreating OpenSearch indices...', success: undefined }
}));
try {
const result = await searchAdminApi.recreateOpenSearchIndices();
setOpenSearchStatus(prev => ({
...prev,
recreate: {
loading: false,
message: result.success ? result.message : (result.error || 'Recreation failed'),
success: result.success
}
}));
setTimeout(() => {
setOpenSearchStatus(prev => ({
...prev,
recreate: { loading: false, message: '', success: undefined }
}));
}, 8000);
} catch (error: any) {
setOpenSearchStatus(prev => ({
...prev,
recreate: {
loading: false,
message: error.message || 'Network error occurred',
success: false
}
}));
setTimeout(() => {
setOpenSearchStatus(prev => ({
...prev,
recreate: { loading: false, message: '', success: undefined }
}));
}, 8000);
}
};
// Load status on component mount
useEffect(() => {
loadSearchEngineStatus();
}, []);
return ( return (
<div className="space-y-6"> <div className="space-y-6">
{/* Typesense Search Management */} {/* Search Management */}
<div className="theme-card theme-shadow rounded-lg p-6"> <div className="theme-card theme-shadow rounded-lg p-6">
<h2 className="text-xl font-semibold theme-header mb-4">Search Index Management</h2> <h2 className="text-xl font-semibold theme-header mb-4">Search Management</h2>
<p className="theme-text mb-6"> <p className="theme-text mb-6">
Manage all Typesense search indexes (stories, authors, collections, etc.). Use these tools if search functionality isn't working properly. Manage OpenSearch indices for stories and authors. Use these tools if search isn't returning expected results.
</p> </p>
<div className="space-y-6"> <div className="space-y-6">
{/* Simplified Operations */} {/* Current Status */}
<div className="border theme-border rounded-lg p-4">
<h3 className="text-lg font-semibold theme-header mb-3">Search Status</h3>
<div className="grid grid-cols-1 sm:grid-cols-2 gap-3 text-sm">
<div className="flex justify-between">
<span>OpenSearch:</span>
<span className={`font-medium ${searchEngineStatus.openSearchAvailable ? 'text-green-600 dark:text-green-400' : 'text-red-600 dark:text-red-400'}`}>
{searchEngineStatus.openSearchAvailable ? 'Available' : 'Unavailable'}
</span>
</div>
</div>
</div>
{/* Search Operations */}
<div className="border theme-border rounded-lg p-4"> <div className="border theme-border rounded-lg p-4">
<h3 className="text-lg font-semibold theme-header mb-3">Search Operations</h3> <h3 className="text-lg font-semibold theme-header mb-3">Search Operations</h3>
<p className="text-sm theme-text mb-4"> <p className="text-sm theme-text mb-4">
Perform maintenance operations on all search indexes (stories, authors, collections, etc.). Perform maintenance operations on search indices. Use these if search isn't returning expected results.
</p> </p>
<div className="flex flex-col sm:flex-row gap-3 mb-4"> <div className="flex flex-col sm:flex-row gap-3 mb-4">
<Button <Button
onClick={handleFullReindex} onClick={handleOpenSearchReindex}
disabled={typesenseStatus.reindex.loading || typesenseStatus.recreate.loading} disabled={openSearchStatus.reindex.loading || openSearchStatus.recreate.loading || !searchEngineStatus.openSearchAvailable}
loading={typesenseStatus.reindex.loading} loading={openSearchStatus.reindex.loading}
variant="ghost" variant="ghost"
className="flex-1" className="flex-1"
> >
{typesenseStatus.reindex.loading ? 'Reindexing All...' : '🔄 Full Reindex'} {openSearchStatus.reindex.loading ? 'Reindexing...' : '🔄 Reindex All'}
</Button> </Button>
<Button <Button
onClick={handleRecreateAllCollections} onClick={handleOpenSearchRecreate}
disabled={typesenseStatus.reindex.loading || typesenseStatus.recreate.loading} disabled={openSearchStatus.reindex.loading || openSearchStatus.recreate.loading || !searchEngineStatus.openSearchAvailable}
loading={typesenseStatus.recreate.loading} loading={openSearchStatus.recreate.loading}
variant="secondary" variant="secondary"
className="flex-1" className="flex-1"
> >
{typesenseStatus.recreate.loading ? 'Recreating All...' : '🏗 Recreate All Collections'} {openSearchStatus.recreate.loading ? 'Recreating...' : '🏗️ Recreate Indices'}
</Button> </Button>
</div> </div>
{/* Status Messages */} {/* Status Messages */}
{typesenseStatus.reindex.message && ( {openSearchStatus.reindex.message && (
<div className={`text-sm p-3 rounded mb-3 ${ <div className={`text-sm p-3 rounded mb-3 ${
typesenseStatus.reindex.success openSearchStatus.reindex.success
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200' ? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200' : 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
}`}> }`}>
{typesenseStatus.reindex.message} {openSearchStatus.reindex.message}
</div> </div>
)} )}
{typesenseStatus.recreate.message && ( {openSearchStatus.recreate.message && (
<div className={`text-sm p-3 rounded mb-3 ${ <div className={`text-sm p-3 rounded mb-3 ${
typesenseStatus.recreate.success openSearchStatus.recreate.success
? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200' ? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
: 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200' : 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
}`}> }`}>
{typesenseStatus.recreate.message} {openSearchStatus.recreate.message}
</div> </div>
)} )}
</div> </div>
@@ -482,9 +488,8 @@ export default function SystemSettings({}: SystemSettingsProps) {
<div className="text-sm theme-text bg-blue-50 dark:bg-blue-900/20 p-3 rounded-lg"> <div className="text-sm theme-text bg-blue-50 dark:bg-blue-900/20 p-3 rounded-lg">
<p className="font-medium mb-1">When to use these tools:</p> <p className="font-medium mb-1">When to use these tools:</p>
<ul className="text-xs space-y-1 ml-4"> <ul className="text-xs space-y-1 ml-4">
<li>• <strong>Full Reindex:</strong> Refresh all search data while keeping existing schemas (fixes data sync issues)</li> <li> <strong>Reindex All:</strong> Refresh all search data while keeping existing schemas (fixes data sync issues)</li>
<li>• <strong>Recreate All Collections:</strong> Delete and rebuild all search indexes from scratch (fixes schema and structure issues)</li> <li> <strong>Recreate Indices:</strong> Delete and rebuild all search indexes from scratch (fixes schema and structure issues)</li>
<li>• <strong>Operations run in parallel</strong> across all index types for better performance</li>
</ul> </ul>
</div> </div>
</div> </div>

View File

@@ -17,17 +17,34 @@ interface StoryCardProps {
onSelect?: () => void; onSelect?: () => void;
} }
export default function StoryCard({ export default function StoryCard({
story, story,
viewMode, viewMode,
onUpdate, onUpdate,
showSelection = false, showSelection = false,
isSelected = false, isSelected = false,
onSelect onSelect
}: StoryCardProps) { }: StoryCardProps) {
const [rating, setRating] = useState(story.rating || 0); const [rating, setRating] = useState(story.rating || 0);
const [updating, setUpdating] = useState(false); const [updating, setUpdating] = useState(false);
// Helper function to get tags from either tags array or tagNames array
const getTags = () => {
if (Array.isArray(story.tags) && story.tags.length > 0) {
return story.tags;
}
if (Array.isArray(story.tagNames) && story.tagNames.length > 0) {
// Convert tagNames to Tag objects for display compatibility
return story.tagNames.map((name, index) => ({
id: `tag-${index}`, // Temporary ID for display
name: name
}));
}
return [];
};
const displayTags = getTags();
const handleRatingClick = async (e: React.MouseEvent, newRating: number) => { const handleRatingClick = async (e: React.MouseEvent, newRating: number) => {
// Prevent default and stop propagation to avoid triggering navigation // Prevent default and stop propagation to avoid triggering navigation
e.preventDefault(); e.preventDefault();
@@ -58,7 +75,7 @@ export default function StoryCard({
const calculateReadingPercentage = (story: Story): number => { const calculateReadingPercentage = (story: Story): number => {
if (!story.readingPosition) return 0; if (!story.readingPosition) return 0;
const totalLength = story.contentPlain?.length || story.contentHtml.length; const totalLength = story.contentPlain?.length || story.contentHtml?.length || 0;
if (totalLength === 0) return 0; if (totalLength === 0) return 0;
return Math.round((story.readingPosition / totalLength) * 100); return Math.round((story.readingPosition / totalLength) * 100);
@@ -124,9 +141,9 @@ export default function StoryCard({
</div> </div>
{/* Tags */} {/* Tags */}
{Array.isArray(story.tags) && story.tags.length > 0 && ( {displayTags.length > 0 && (
<div className="flex flex-wrap gap-1 mt-2"> <div className="flex flex-wrap gap-1 mt-2">
{story.tags.slice(0, 3).map((tag) => ( {displayTags.slice(0, 3).map((tag) => (
<TagDisplay <TagDisplay
key={tag.id} key={tag.id}
tag={tag} tag={tag}
@@ -134,9 +151,9 @@ export default function StoryCard({
clickable={false} clickable={false}
/> />
))} ))}
{story.tags.length > 3 && ( {displayTags.length > 3 && (
<span className="px-2 py-1 text-xs theme-text"> <span className="px-2 py-1 text-xs theme-text">
+{story.tags.length - 3} more +{displayTags.length - 3} more
</span> </span>
)} )}
</div> </div>
@@ -260,9 +277,9 @@ export default function StoryCard({
</div> </div>
{/* Tags */} {/* Tags */}
{Array.isArray(story.tags) && story.tags.length > 0 && ( {displayTags.length > 0 && (
<div className="flex flex-wrap gap-1 mt-2"> <div className="flex flex-wrap gap-1 mt-2">
{story.tags.slice(0, 2).map((tag) => ( {displayTags.slice(0, 2).map((tag) => (
<TagDisplay <TagDisplay
key={tag.id} key={tag.id}
tag={tag} tag={tag}
@@ -270,9 +287,9 @@ export default function StoryCard({
clickable={false} clickable={false}
/> />
))} ))}
{story.tags.length > 2 && ( {displayTags.length > 2 && (
<span className="px-2 py-1 text-xs theme-text"> <span className="px-2 py-1 text-xs theme-text">
+{story.tags.length - 2} +{displayTags.length - 2}
</span> </span>
)} )}
</div> </div>
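
The `getTags` helper added above suggests that search-backed responses may carry only flat `tagNames` strings from the index, while database-backed responses carry full `Tag` objects. The following sketch restates that assumption as types; the field names beyond those visible in the diff (`tags`, `tagNames`, `contentPlain`, `contentHtml`, `readingPosition`, `rating`) and their optionality are assumptions, and the real `Story`/`Tag` types in the codebase are certainly larger.

```typescript
// Hedged sketch of the fields StoryCard appears to rely on in this diff.
interface Tag {
  id: string;
  name: string;
}

interface StoryCardData {
  rating?: number;
  readingPosition?: number;
  contentPlain?: string;
  contentHtml?: string;
  tags?: Tag[];        // assumed present on database-backed responses
  tagNames?: string[]; // assumed present on search-engine responses
}

// Mirrors the helper added in the diff: prefer full Tag objects, otherwise
// wrap tagNames into display-only Tag shapes with synthetic ids.
function resolveTags(story: StoryCardData): Tag[] {
  if (Array.isArray(story.tags) && story.tags.length > 0) return story.tags;
  if (Array.isArray(story.tagNames) && story.tagNames.length > 0) {
    return story.tagNames.map((name, index) => ({ id: `tag-${index}`, name }));
  }
  return [];
}
```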

View File

@@ -179,15 +179,6 @@ export const storyApi = {
return response.data; return response.data;
}, },
reindexTypesense: async (): Promise<{ success: boolean; message: string; count?: number; error?: string }> => {
const response = await api.post('/stories/reindex-typesense');
return response.data;
},
recreateTypesenseCollection: async (): Promise<{ success: boolean; message: string; count?: number; error?: string }> => {
const response = await api.post('/stories/recreate-typesense-collection');
return response.data;
},
checkDuplicate: async (title: string, authorName: string): Promise<{ checkDuplicate: async (title: string, authorName: string): Promise<{
hasDuplicates: boolean; hasDuplicates: boolean;
@@ -305,38 +296,6 @@ export const authorApi = {
await api.delete(`/authors/${id}/avatar`); await api.delete(`/authors/${id}/avatar`);
}, },
searchAuthorsTypesense: async (params?: {
q?: string;
page?: number;
size?: number;
sortBy?: string;
sortOrder?: string;
}): Promise<{
results: Author[];
totalHits: number;
page: number;
perPage: number;
query: string;
searchTimeMs: number;
}> => {
const response = await api.get('/authors/search-typesense', { params });
return response.data;
},
reindexTypesense: async (): Promise<{ success: boolean; message: string; count?: number; error?: string }> => {
const response = await api.post('/authors/reindex-typesense');
return response.data;
},
recreateTypesenseCollection: async (): Promise<{ success: boolean; message: string; count?: number; error?: string }> => {
const response = await api.post('/authors/recreate-typesense-collection');
return response.data;
},
getTypesenseSchema: async (): Promise<{ success: boolean; schema?: any; error?: string }> => {
const response = await api.get('/authors/typesense-schema');
return response.data;
},
}; };
// Tag endpoints // Tag endpoints
@@ -611,6 +570,74 @@ export const configApi = {
}, },
}; };
// Search Engine Management API
export const searchAdminApi = {
// Get migration status
getStatus: async (): Promise<{
primaryEngine: string;
dualWrite: boolean;
openSearchAvailable: boolean;
}> => {
const response = await api.get('/admin/search/status');
return response.data;
},
// Configure search engine
configure: async (config: { engine: string; dualWrite: boolean }): Promise<{ message: string }> => {
const response = await api.post('/admin/search/configure', config);
return response.data;
},
// Enable/disable dual-write
enableDualWrite: async (): Promise<{ message: string }> => {
const response = await api.post('/admin/search/dual-write/enable');
return response.data;
},
disableDualWrite: async (): Promise<{ message: string }> => {
const response = await api.post('/admin/search/dual-write/disable');
return response.data;
},
// Switch engines
switchToOpenSearch: async (): Promise<{ message: string }> => {
const response = await api.post('/admin/search/switch/opensearch');
return response.data;
},
// Emergency rollback
emergencyRollback: async (): Promise<{ message: string }> => {
const response = await api.post('/admin/search/emergency-rollback');
return response.data;
},
// OpenSearch operations
reindexOpenSearch: async (): Promise<{
success: boolean;
message: string;
storiesCount?: number;
authorsCount?: number;
totalCount?: number;
error?: string;
}> => {
const response = await api.post('/admin/search/opensearch/reindex');
return response.data;
},
recreateOpenSearchIndices: async (): Promise<{
success: boolean;
message: string;
storiesCount?: number;
authorsCount?: number;
totalCount?: number;
error?: string;
}> => {
const response = await api.post('/admin/search/opensearch/recreate');
return response.data;
},
};
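
Taken together, the `searchAdminApi` endpoints above support the parallel migration path described earlier in this document. A possible sequence is sketched below; the ordering is an assumption based on the dual-write design, not a prescribed procedure, and the relative import path is illustrative.

```typescript
// Possible migration sequence using the admin endpoints defined above.
import { searchAdminApi } from './api'; // import path is an assumption

async function migrateToOpenSearch(): Promise<void> {
  // 1. Start writing to both engines so OpenSearch stays in sync going forward.
  await searchAdminApi.enableDualWrite();

  // 2. Backfill existing data into the OpenSearch indices.
  const reindex = await searchAdminApi.reindexOpenSearch();
  if (!reindex.success) {
    throw new Error(reindex.error ?? 'OpenSearch reindex failed');
  }

  // 3. Verify OpenSearch is reachable before making it the primary engine.
  const status = await searchAdminApi.getStatus();
  if (!status.openSearchAvailable) {
    throw new Error('OpenSearch is not available; aborting switch');
  }

  // 4. Route reads to OpenSearch; Typesense stays warm for rollback.
  await searchAdminApi.switchToOpenSearch();
}

// If anything misbehaves afterwards, revert reads with:
// await searchAdminApi.emergencyRollback();
```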
// Collection endpoints // Collection endpoints
export const collectionApi = { export const collectionApi = {
getCollections: async (params?: { getCollections: async (params?: {

File diff suppressed because one or more lines are too long