Full parallel implementation of Typesense and OpenSearch
CLEANUP_CHECKLIST.md (new file, 137 lines)
@@ -0,0 +1,137 @@
# Search Engine Migration Cleanup Checklist

**Use this checklist when removing Typesense and completing the migration to OpenSearch.**

## 🗑️ Files to DELETE Completely

- [ ] `SearchMigrationManager.java` - Temporary migration manager
- [ ] `AdminSearchController.java` - Temporary admin endpoints for migration
- [ ] `TypesenseService.java` - Old search service (if exists)
- [ ] `TypesenseConfig.java` - Old configuration (if exists)
- [ ] Any frontend migration UI components

## 🔧 Files to MODIFY

### SearchServiceAdapter.java
Replace delegation with direct OpenSearch calls:

```java
@Service
public class SearchServiceAdapter {

    @Autowired
    private OpenSearchService openSearchService; // Only this remains

    public void indexStory(Story story) {
        openSearchService.indexStory(story); // Direct call, no delegation
    }

    public SearchResultDto<StorySearchDto> searchStories(...) {
        return openSearchService.searchStories(...); // Direct call
    }

    // Remove all migration-related methods:
    // - isDualWriteEnabled()
    // - getMigrationStatus()
    // - etc.
}
```

### pom.xml
Remove Typesense dependency:
```xml
<!-- DELETE this dependency -->
<dependency>
    <groupId>org.typesense</groupId>
    <artifactId>typesense-java</artifactId>
    <version>1.3.0</version>
</dependency>
```

### application.yml
Remove migration configuration:
```yaml
storycove:
  search:
    # DELETE these lines:
    engine: opensearch
    dual-write: false
  # DELETE entire typesense section:
  typesense:
    api-key: xyz
    host: localhost
    port: 8108
    # ... etc
```

### docker-compose.yml
Remove Typesense service:
```yaml
# DELETE entire typesense service block
typesense:
  image: typesense/typesense:0.25.1
  # ... etc

# DELETE typesense volume
volumes:
  typesense_data:
```
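After removing the service and volume, a quick sanity check is to let Compose confirm that Typesense is gone (a minimal sketch, assuming a standard Docker Compose v2 setup; adjust to your environment):

```bash
# List the services that remain after removing the typesense block
docker compose config --services

# Recreate the stack and confirm only the expected containers are running
docker compose up -d
docker compose ps
```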
## 🌐 Environment Variables to REMOVE

- [ ] `TYPESENSE_API_KEY`
- [ ] `TYPESENSE_HOST`
- [ ] `TYPESENSE_PORT`
- [ ] `TYPESENSE_ENABLED`
- [ ] `SEARCH_ENGINE`
- [ ] `SEARCH_DUAL_WRITE`
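Before deleting these variables, it helps to confirm nothing in the repository still reads them. A minimal sketch (CI and deployment secrets have to be checked separately; the pattern list mirrors the checklist above):

```bash
# Find any remaining references to the Typesense/migration variables
grep -rn --exclude-dir=.git -E "TYPESENSE_(API_KEY|HOST|PORT|ENABLED)|SEARCH_ENGINE|SEARCH_DUAL_WRITE" .
```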
## ✅ Configuration to KEEP (OpenSearch only)

```yaml
storycove:
  opensearch:
    host: ${OPENSEARCH_HOST:localhost}
    port: ${OPENSEARCH_PORT:9200}
    # ... all OpenSearch config remains
```

## 🧪 Testing After Cleanup

- [ ] Compilation successful: `mvn compile`
- [ ] All tests pass: `mvn test`
- [ ] Application starts without errors
- [ ] Search functionality works correctly
- [ ] No references to Typesense in logs
- [ ] Docker containers start without Typesense
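A condensed sketch of the build-time checks above (the runtime checks still need to be verified by hand; the grep pattern is an assumption about where stray references would show up):

```bash
# Build and run the test suite
mvn compile && mvn test

# No Typesense references should remain in the source tree
grep -rni --exclude-dir=.git "typesense" src/ pom.xml docker-compose.yml || echo "No Typesense references found"
```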
## 📝 Estimated Cleanup Time

**Total: ~30 minutes**
- Delete files: 5 minutes
- Update SearchServiceAdapter: 10 minutes
- Remove configuration: 5 minutes
- Testing: 10 minutes

## 🚨 Rollback Plan (if needed)

If issues arise during cleanup:

1. **Immediate:** Restore the previous state from git: `git checkout HEAD~1`
2. **Verify:** Ensure the application works in the previous state
3. **Investigate:** Fix the issues in a separate branch
4. **Retry:** Complete the cleanup once the issues are resolved

## ✨ Post-Cleanup Benefits

- **Simpler codebase:** No dual-engine complexity
- **Reduced dependencies:** Smaller build artifacts
- **Better performance:** No dual-write overhead
- **Easier maintenance:** Single search engine to manage
- **Cleaner configuration:** Fewer environment variables

---

**Created:** 2025-09-18
**Purpose:** Temporary migration assistance
**Delete this file:** After cleanup is complete
DUAL_WRITE_USAGE.md (new file, 188 lines)
@@ -0,0 +1,188 @@
# Dual-Write Search Engine Usage Guide

This guide explains how to use the dual-write functionality during the search engine migration.

## 🎛️ Configuration Options

### Environment Variables

```bash
# Search engine selection
SEARCH_ENGINE=typesense      # or 'opensearch'
SEARCH_DUAL_WRITE=false      # or 'true'

# OpenSearch connection (required when using OpenSearch)
OPENSEARCH_PASSWORD=your_password
```

### Migration Flow

#### Phase 1: Initial Setup
```bash
SEARCH_ENGINE=typesense      # Keep using Typesense
SEARCH_DUAL_WRITE=false      # Single-write to Typesense only
```

#### Phase 2: Enable Dual-Write
```bash
SEARCH_ENGINE=typesense      # Still reading from Typesense
SEARCH_DUAL_WRITE=true       # Now writing to BOTH engines
```

#### Phase 3: Switch to OpenSearch
```bash
SEARCH_ENGINE=opensearch     # Now reading from OpenSearch
SEARCH_DUAL_WRITE=true       # Still writing to both (safety)
```

#### Phase 4: Complete Migration
```bash
SEARCH_ENGINE=opensearch     # Reading from OpenSearch
SEARCH_DUAL_WRITE=false      # Only writing to OpenSearch
```

## 🔧 Admin API Endpoints

### Check Current Status
```bash
GET /api/admin/search/status

Response:
{
  "primaryEngine": "typesense",
  "dualWrite": false,
  "typesenseAvailable": true,
  "openSearchAvailable": true
}
```

### Switch Configuration
```bash
POST /api/admin/search/configure
Content-Type: application/json

{
  "engine": "opensearch",
  "dualWrite": true
}
```

### Quick Actions
```bash
# Enable dual-write
POST /api/admin/search/dual-write/enable

# Disable dual-write
POST /api/admin/search/dual-write/disable

# Switch to OpenSearch
POST /api/admin/search/switch/opensearch

# Switch back to Typesense
POST /api/admin/search/switch/typesense

# Emergency rollback (Typesense only, no dual-write)
POST /api/admin/search/emergency-rollback
```

## 🚀 Migration Process Example

### Step 1: Verify OpenSearch is Ready
```bash
# Check if OpenSearch is available
curl http://localhost:8080/api/admin/search/status

# Should show openSearchAvailable: true
```

### Step 2: Enable Dual-Write
```bash
# Enable writing to both engines
curl -X POST http://localhost:8080/api/admin/search/dual-write/enable

# Or via configuration
curl -X POST http://localhost:8080/api/admin/search/configure \
  -H "Content-Type: application/json" \
  -d '{"engine": "typesense", "dualWrite": true}'
```

### Step 3: Populate OpenSearch
At this point, any new/updated stories will be written to both engines.
For existing data, you may want to trigger a reindex.
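The admin controller added in this commit exposes bulk endpoints for exactly this case, so the backfill can be triggered with a single call (same host/port as in the examples above):

```bash
# Rebuild the OpenSearch indices from the database while dual-write is active
curl -X POST http://localhost:8080/api/admin/search/opensearch/reindex

# If the index mappings changed, recreate the indices and repopulate instead
curl -X POST http://localhost:8080/api/admin/search/opensearch/recreate
```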
### Step 4: Switch to OpenSearch
```bash
# Switch primary engine to OpenSearch
curl -X POST http://localhost:8080/api/admin/search/switch/opensearch

# Check it worked
curl http://localhost:8080/api/admin/search/status
# Should show: primaryEngine: "opensearch", dualWrite: true
```

### Step 5: Test & Validate
- Test search functionality
- Verify results match expectations
- Monitor for any errors

### Step 6: Complete Migration
```bash
# Disable dual-write (OpenSearch only)
curl -X POST http://localhost:8080/api/admin/search/dual-write/disable
```

## 🚨 Emergency Procedures

### Immediate Rollback
If OpenSearch has issues:
```bash
curl -X POST http://localhost:8080/api/admin/search/emergency-rollback
```
This immediately switches to Typesense-only mode.

### Partial Rollback
Switch back to Typesense but keep dual-write:
```bash
curl -X POST http://localhost:8080/api/admin/search/switch/typesense
```

## 📊 Monitoring

### Check Current Configuration
```bash
curl http://localhost:8080/api/admin/search/status
```

### Application Logs
Watch for dual-write success/failure messages:
```bash
# Successful operations
2025-09-18 08:00:00 DEBUG SearchMigrationManager - Successfully indexed story 123 in OpenSearch
2025-09-18 08:00:00 DEBUG SearchMigrationManager - Successfully indexed story 123 in Typesense

# Failed operations (non-critical in dual-write mode)
2025-09-18 08:00:00 ERROR SearchMigrationManager - Failed to index story 123 in OpenSearch
```
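To follow these messages live, something like the following works; the service name `backend` and the use of Docker Compose are assumptions about the deployment, so adjust them to match yours:

```bash
# Tail the application log and keep only migration-manager lines
docker compose logs -f backend | grep SearchMigrationManager
```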
## ⚠️ Important Notes

1. **Dual-write errors are non-critical** - if one engine fails, the other continues
2. **Read operations only use the primary engine** - no dual-read
3. **Configuration updates take effect immediately** - no restart required
4. **Emergency rollback is always available** - safe to experiment
5. **Both engines must be available** for dual-write to work optimally

## 🔄 Typical Migration Timeline

| Step | Duration | Configuration | Purpose |
|------|----------|---------------|---------|
| 1 | - | typesense + dual:false | Current state |
| 2 | 1 day | typesense + dual:true | Populate OpenSearch |
| 3 | 1 day | opensearch + dual:true | Test OpenSearch |
| 4 | - | opensearch + dual:false | Complete migration |

**Total migration time: ~2 days of gradual transition**

---

**Note:** This dual-write mechanism is temporary and will be removed once the migration is complete. See `CLEANUP_CHECKLIST.md` for removal instructions.
pom.xml
@@ -136,6 +136,13 @@
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
        </plugin>
+       <plugin>
+           <groupId>org.apache.maven.plugins</groupId>
+           <artifactId>maven-compiler-plugin</artifactId>
+           <configuration>
+               <parameters>true</parameters>
+           </configuration>
+       </plugin>
    </plugins>
</build>
</project>
OpenSearchConfig.java
@@ -1,5 +1,7 @@
package com.storycove.config;

+import com.fasterxml.jackson.databind.ObjectMapper;
+import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;
import org.apache.hc.client5.http.auth.AuthScope;
import org.apache.hc.client5.http.auth.UsernamePasswordCredentials;
import org.apache.hc.client5.http.impl.auth.BasicCredentialsProvider;
@@ -8,13 +10,13 @@ import org.apache.hc.client5.http.impl.nio.PoolingAsyncClientConnectionManagerBu
import org.apache.hc.client5.http.ssl.ClientTlsStrategyBuilder;
import org.apache.hc.core5.http.HttpHost;
import org.apache.hc.core5.util.Timeout;
import org.opensearch.client.json.jackson.JacksonJsonpMapper;
import org.opensearch.client.opensearch.OpenSearchClient;
import org.opensearch.client.transport.OpenSearchTransport;
import org.opensearch.client.transport.httpclient5.ApacheHttpClient5TransportBuilder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@@ -26,19 +28,17 @@ import java.security.KeyStore;
import java.security.cert.X509Certificate;

@Configuration
@EnableConfigurationProperties(OpenSearchProperties.class)
public class OpenSearchConfig {

    private static final Logger logger = LoggerFactory.getLogger(OpenSearchConfig.class);

    private final OpenSearchProperties properties;

-   public OpenSearchConfig(OpenSearchProperties properties) {
+   public OpenSearchConfig(@Qualifier("openSearchProperties") OpenSearchProperties properties) {
        this.properties = properties;
    }

    @Bean
    @ConditionalOnProperty(name = "storycove.search.engine", havingValue = "opensearch")
    public OpenSearchClient openSearchClient() throws Exception {
        logger.info("Initializing OpenSearch client for profile: {}", properties.getProfile());

@@ -51,13 +51,23 @@ public class OpenSearchConfig {
        // Create connection manager with pooling
        PoolingAsyncClientConnectionManager connectionManager = createConnectionManager(sslContext);

-       // Create the transport with all configurations
+       // Create custom ObjectMapper for proper date serialization
+       ObjectMapper objectMapper = new ObjectMapper();
+       objectMapper.registerModule(new JavaTimeModule());
+       objectMapper.disable(com.fasterxml.jackson.databind.SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
+
+       // Create the transport with all configurations and custom Jackson mapper
        OpenSearchTransport transport = ApacheHttpClient5TransportBuilder
                .builder(new HttpHost(properties.getScheme(), properties.getHost(), properties.getPort()))
+               .setMapper(new JacksonJsonpMapper(objectMapper))
                .setHttpClientConfigCallback(httpClientBuilder -> {
-                   httpClientBuilder
-                           .setDefaultCredentialsProvider(credentialsProvider)
-                           .setConnectionManager(connectionManager);
+                   // Only set credentials provider if authentication is configured
+                   if (properties.getUsername() != null && !properties.getUsername().isEmpty() &&
+                       properties.getPassword() != null && !properties.getPassword().isEmpty()) {
+                       httpClientBuilder.setDefaultCredentialsProvider(credentialsProvider);
+                   }
+
+                   httpClientBuilder.setConnectionManager(connectionManager);

                    // Set timeouts
                    httpClientBuilder.setDefaultRequestConfig(
@@ -81,13 +91,22 @@ public class OpenSearchConfig {

    private BasicCredentialsProvider createCredentialsProvider() {
        BasicCredentialsProvider credentialsProvider = new BasicCredentialsProvider();

+       // Only set credentials if username and password are provided
+       if (properties.getUsername() != null && !properties.getUsername().isEmpty() &&
+           properties.getPassword() != null && !properties.getPassword().isEmpty()) {
            credentialsProvider.setCredentials(
                    new AuthScope(properties.getHost(), properties.getPort()),
                    new UsernamePasswordCredentials(
                            properties.getUsername(),
-                           properties.getPassword() != null ? properties.getPassword().toCharArray() : new char[0]
+                           properties.getPassword().toCharArray()
                    )
            );
+           logger.info("OpenSearch credentials configured for user: {}", properties.getUsername());
+       } else {
+           logger.info("OpenSearch running without authentication (no credentials configured)");
+       }

        return credentialsProvider;
    }

@@ -184,8 +203,9 @@ public class OpenSearchConfig {
                    response.version().number(),
                    response.clusterName());
        } catch (Exception e) {
-           logger.error("Failed to connect to OpenSearch cluster", e);
-           throw new RuntimeException("OpenSearch connection failed", e);
+           logger.warn("OpenSearch connection test failed during initialization: {}", e.getMessage());
+           logger.debug("OpenSearch connection test full error", e);
+           // Don't throw exception here - let the client be created and handle failures in service methods
        }
    }
}
AdminSearchController.java (new file, 296 lines)
@@ -0,0 +1,296 @@
package com.storycove.controller;

import com.storycove.entity.Author;
import com.storycove.entity.Story;
import com.storycove.service.AuthorService;
import com.storycove.service.OpenSearchService;
import com.storycove.service.SearchMigrationManager;
import com.storycove.service.StoryService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

import java.util.List;
import java.util.Map;

/**
 * TEMPORARY ADMIN CONTROLLER - DELETE THIS ENTIRE CLASS WHEN TYPESENSE IS REMOVED
 *
 * This controller provides admin endpoints for managing the search engine migration.
 * It allows real-time switching between engines and enabling/disabling dual-write.
 *
 * CLEANUP INSTRUCTIONS:
 * 1. Delete this entire file: AdminSearchController.java
 * 2. Remove any frontend components that call these endpoints
 */
@RestController
@RequestMapping("/api/admin/search")
public class AdminSearchController {

    private static final Logger logger = LoggerFactory.getLogger(AdminSearchController.class);

    @Autowired
    private SearchMigrationManager migrationManager;

    @Autowired(required = false)
    private OpenSearchService openSearchService;

    @Autowired
    private StoryService storyService;

    @Autowired
    private AuthorService authorService;

    /**
     * Get current search engine configuration status
     */
    @GetMapping("/status")
    public ResponseEntity<SearchMigrationManager.SearchMigrationStatus> getStatus() {
        try {
            SearchMigrationManager.SearchMigrationStatus status = migrationManager.getStatus();
            return ResponseEntity.ok(status);
        } catch (Exception e) {
            logger.error("Error getting search migration status", e);
            return ResponseEntity.internalServerError().build();
        }
    }

    /**
     * Update search engine configuration
     */
    @PostMapping("/configure")
    public ResponseEntity<String> configureSearchEngine(@RequestBody SearchEngineConfigRequest request) {
        try {
            logger.info("Updating search engine configuration: engine={}, dualWrite={}",
                    request.getEngine(), request.isDualWrite());

            // Validate engine
            if (!"typesense".equalsIgnoreCase(request.getEngine()) &&
                !"opensearch".equalsIgnoreCase(request.getEngine())) {
                return ResponseEntity.badRequest().body("Invalid engine. Must be 'typesense' or 'opensearch'");
            }

            // Update configuration
            migrationManager.updateConfiguration(request.getEngine(), request.isDualWrite());

            return ResponseEntity.ok("Search engine configuration updated successfully");

        } catch (Exception e) {
            logger.error("Error updating search engine configuration", e);
            return ResponseEntity.internalServerError().body("Failed to update configuration: " + e.getMessage());
        }
    }

    /**
     * Enable dual-write mode (writes to both engines)
     */
    @PostMapping("/dual-write/enable")
    public ResponseEntity<String> enableDualWrite() {
        try {
            String currentEngine = migrationManager.getCurrentSearchEngine();
            migrationManager.updateConfiguration(currentEngine, true);
            logger.info("Dual-write enabled for engine: {}", currentEngine);
            return ResponseEntity.ok("Dual-write enabled");
        } catch (Exception e) {
            logger.error("Error enabling dual-write", e);
            return ResponseEntity.internalServerError().body("Failed to enable dual-write: " + e.getMessage());
        }
    }

    /**
     * Disable dual-write mode
     */
    @PostMapping("/dual-write/disable")
    public ResponseEntity<String> disableDualWrite() {
        try {
            String currentEngine = migrationManager.getCurrentSearchEngine();
            migrationManager.updateConfiguration(currentEngine, false);
            logger.info("Dual-write disabled for engine: {}", currentEngine);
            return ResponseEntity.ok("Dual-write disabled");
        } catch (Exception e) {
            logger.error("Error disabling dual-write", e);
            return ResponseEntity.internalServerError().body("Failed to disable dual-write: " + e.getMessage());
        }
    }

    /**
     * Switch to OpenSearch engine
     */
    @PostMapping("/switch/opensearch")
    public ResponseEntity<String> switchToOpenSearch() {
        try {
            if (!migrationManager.canSwitchToOpenSearch()) {
                return ResponseEntity.badRequest().body("OpenSearch is not available or healthy");
            }

            boolean currentDualWrite = migrationManager.isDualWriteEnabled();
            migrationManager.updateConfiguration("opensearch", currentDualWrite);
            logger.info("Switched to OpenSearch with dual-write: {}", currentDualWrite);
            return ResponseEntity.ok("Switched to OpenSearch");
        } catch (Exception e) {
            logger.error("Error switching to OpenSearch", e);
            return ResponseEntity.internalServerError().body("Failed to switch to OpenSearch: " + e.getMessage());
        }
    }

    /**
     * Switch to Typesense engine (rollback)
     */
    @PostMapping("/switch/typesense")
    public ResponseEntity<String> switchToTypesense() {
        try {
            if (!migrationManager.canSwitchToTypesense()) {
                return ResponseEntity.badRequest().body("Typesense is not available");
            }

            boolean currentDualWrite = migrationManager.isDualWriteEnabled();
            migrationManager.updateConfiguration("typesense", currentDualWrite);
            logger.info("Switched to Typesense with dual-write: {}", currentDualWrite);
            return ResponseEntity.ok("Switched to Typesense");
        } catch (Exception e) {
            logger.error("Error switching to Typesense", e);
            return ResponseEntity.internalServerError().body("Failed to switch to Typesense: " + e.getMessage());
        }
    }

    /**
     * Emergency rollback to Typesense with dual-write disabled
     */
    @PostMapping("/emergency-rollback")
    public ResponseEntity<String> emergencyRollback() {
        try {
            migrationManager.updateConfiguration("typesense", false);
            logger.warn("Emergency rollback to Typesense executed");
            return ResponseEntity.ok("Emergency rollback completed - switched to Typesense only");
        } catch (Exception e) {
            logger.error("Error during emergency rollback", e);
            return ResponseEntity.internalServerError().body("Emergency rollback failed: " + e.getMessage());
        }
    }

    /**
     * Reindex all data in OpenSearch (equivalent to Typesense reindex)
     */
    @PostMapping("/opensearch/reindex")
    public ResponseEntity<Map<String, Object>> reindexOpenSearch() {
        try {
            logger.info("Starting OpenSearch full reindex");

            if (!migrationManager.canSwitchToOpenSearch()) {
                return ResponseEntity.badRequest().body(Map.of(
                        "success", false,
                        "error", "OpenSearch is not available or healthy"
                ));
            }

            // Get all data from services (similar to Typesense reindex)
            List<Story> allStories = storyService.findAllWithAssociations();
            List<Author> allAuthors = authorService.findAllWithStories();

            // Bulk index directly in OpenSearch
            if (openSearchService != null) {
                openSearchService.bulkIndexStories(allStories);
                openSearchService.bulkIndexAuthors(allAuthors);
            } else {
                return ResponseEntity.badRequest().body(Map.of(
                        "success", false,
                        "error", "OpenSearch service not available"
                ));
            }

            int totalIndexed = allStories.size() + allAuthors.size();

            return ResponseEntity.ok(Map.of(
                    "success", true,
                    "message", String.format("Reindexed %d stories and %d authors in OpenSearch",
                            allStories.size(), allAuthors.size()),
                    "storiesCount", allStories.size(),
                    "authorsCount", allAuthors.size(),
                    "totalCount", totalIndexed
            ));

        } catch (Exception e) {
            logger.error("Error during OpenSearch reindex", e);
            return ResponseEntity.internalServerError().body(Map.of(
                    "success", false,
                    "error", "OpenSearch reindex failed: " + e.getMessage()
            ));
        }
    }

    /**
     * Recreate OpenSearch indices (equivalent to Typesense collection recreation)
     */
    @PostMapping("/opensearch/recreate")
    public ResponseEntity<Map<String, Object>> recreateOpenSearchIndices() {
        try {
            logger.info("Starting OpenSearch indices recreation");

            if (!migrationManager.canSwitchToOpenSearch()) {
                return ResponseEntity.badRequest().body(Map.of(
                        "success", false,
                        "error", "OpenSearch is not available or healthy"
                ));
            }

            // Recreate OpenSearch indices directly
            if (openSearchService != null) {
                openSearchService.recreateIndices();
            } else {
                logger.error("OpenSearchService not available for index recreation");
                return ResponseEntity.badRequest().body(Map.of(
                        "success", false,
                        "error", "OpenSearchService not available"
                ));
            }

            // Now populate the freshly created indices directly in OpenSearch
            List<Story> allStories = storyService.findAllWithAssociations();
            List<Author> allAuthors = authorService.findAllWithStories();

            openSearchService.bulkIndexStories(allStories);
            openSearchService.bulkIndexAuthors(allAuthors);

            int totalIndexed = allStories.size() + allAuthors.size();

            return ResponseEntity.ok(Map.of(
                    "success", true,
                    "message", String.format("Recreated OpenSearch indices and indexed %d stories and %d authors",
                            allStories.size(), allAuthors.size()),
                    "storiesCount", allStories.size(),
                    "authorsCount", allAuthors.size(),
                    "totalCount", totalIndexed
            ));

        } catch (Exception e) {
            logger.error("Error during OpenSearch indices recreation", e);
            return ResponseEntity.internalServerError().body(Map.of(
                    "success", false,
                    "error", "OpenSearch indices recreation failed: " + e.getMessage()
            ));
        }
    }

    /**
     * DTO for search engine configuration requests
     */
    public static class SearchEngineConfigRequest {
        private String engine;
        private boolean dualWrite;

        public SearchEngineConfigRequest() {}

        public SearchEngineConfigRequest(String engine, boolean dualWrite) {
            this.engine = engine;
            this.dualWrite = dualWrite;
        }

        public String getEngine() { return engine; }
        public void setEngine(String engine) { this.engine = engine; }

        public boolean isDualWrite() { return dualWrite; }
        public void setDualWrite(boolean dualWrite) { this.dualWrite = dualWrite; }
    }
}
AuthorController.java
@@ -4,6 +4,7 @@ import com.storycove.dto.*;
import com.storycove.entity.Author;
import com.storycove.service.AuthorService;
import com.storycove.service.ImageService;
+import com.storycove.service.SearchServiceAdapter;
import com.storycove.service.TypesenseService;
import jakarta.servlet.http.HttpServletRequest;
import jakarta.validation.Valid;
@@ -33,11 +34,13 @@ public class AuthorController {
    private final AuthorService authorService;
    private final ImageService imageService;
    private final TypesenseService typesenseService;
+   private final SearchServiceAdapter searchServiceAdapter;

-   public AuthorController(AuthorService authorService, ImageService imageService, TypesenseService typesenseService) {
+   public AuthorController(AuthorService authorService, ImageService imageService, TypesenseService typesenseService, SearchServiceAdapter searchServiceAdapter) {
        this.authorService = authorService;
        this.imageService = imageService;
        this.typesenseService = typesenseService;
+       this.searchServiceAdapter = searchServiceAdapter;
    }

    @GetMapping
@@ -258,7 +261,17 @@ public class AuthorController {
            @RequestParam(defaultValue = "name") String sortBy,
            @RequestParam(defaultValue = "asc") String sortOrder) {

-       SearchResultDto<AuthorSearchDto> searchResults = typesenseService.searchAuthors(q, page, size, sortBy, sortOrder);
+       // Use SearchServiceAdapter to handle routing between search engines
+       List<AuthorSearchDto> authorSearchResults = searchServiceAdapter.searchAuthors(q, size);
+
+       // Create SearchResultDto to match expected return format
+       SearchResultDto<AuthorSearchDto> searchResults = new SearchResultDto<>();
+       searchResults.setResults(authorSearchResults);
+       searchResults.setQuery(q);
+       searchResults.setPage(page);
+       searchResults.setPerPage(size);
+       searchResults.setTotalHits(authorSearchResults.size());
+       searchResults.setSearchTimeMs(0); // SearchServiceAdapter doesn't provide timing

        // Convert AuthorSearchDto results to AuthorDto
        SearchResultDto<AuthorDto> results = new SearchResultDto<>();
StoryController.java
@@ -42,6 +42,7 @@ public class StoryController {
    private final HtmlSanitizationService sanitizationService;
    private final ImageService imageService;
    private final TypesenseService typesenseService;
+   private final SearchServiceAdapter searchServiceAdapter;
    private final CollectionService collectionService;
    private final ReadingTimeService readingTimeService;
    private final EPUBImportService epubImportService;
@@ -54,6 +55,7 @@ public class StoryController {
            ImageService imageService,
            CollectionService collectionService,
            @Autowired(required = false) TypesenseService typesenseService,
+           SearchServiceAdapter searchServiceAdapter,
            ReadingTimeService readingTimeService,
            EPUBImportService epubImportService,
            EPUBExportService epubExportService) {
@@ -64,6 +66,7 @@ public class StoryController {
        this.imageService = imageService;
        this.collectionService = collectionService;
        this.typesenseService = typesenseService;
+       this.searchServiceAdapter = searchServiceAdapter;
        this.readingTimeService = readingTimeService;
        this.epubImportService = epubImportService;
        this.epubExportService = epubExportService;
@@ -326,7 +329,7 @@ public class StoryController {
            @RequestParam(required = false) Integer maxRating,
            @RequestParam(required = false) String sortBy,
            @RequestParam(required = false) String sortDir,
-           @RequestParam(required = false) String facetBy,
+           @RequestParam(required = false) List<String> facetBy,
            // Advanced filters
            @RequestParam(required = false) Integer minWordCount,
            @RequestParam(required = false) Integer maxWordCount,
@@ -345,16 +348,35 @@ public class StoryController {
            @RequestParam(required = false) Boolean hiddenGemsOnly) {


-       if (typesenseService != null) {
-           SearchResultDto<StorySearchDto> results = typesenseService.searchStories(
-                   query, page, size, authors, tags, minRating, maxRating, sortBy, sortDir, facetBy,
-                   minWordCount, maxWordCount, createdAfter, createdBefore, lastReadAfter, lastReadBefore,
-                   unratedOnly, readingStatus, hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter,
-                   minTagCount, popularOnly, hiddenGemsOnly);
+       // Use SearchServiceAdapter to handle routing between search engines
+       try {
+           // Convert authors list to single author string (for now, use first author)
+           String authorFilter = (authors != null && !authors.isEmpty()) ? authors.get(0) : null;
+
+           // DEBUG: Log all received parameters
+           logger.info("CONTROLLER DEBUG - Received parameters:");
+           logger.info("  readingStatus: '{}'", readingStatus);
+           logger.info("  seriesFilter: '{}'", seriesFilter);
+           logger.info("  hasReadingProgress: {}", hasReadingProgress);
+           logger.info("  hasCoverImage: {}", hasCoverImage);
+           logger.info("  createdAfter: '{}'", createdAfter);
+           logger.info("  lastReadAfter: '{}'", lastReadAfter);
+           logger.info("  unratedOnly: {}", unratedOnly);
+
+           SearchResultDto<StorySearchDto> results = searchServiceAdapter.searchStories(
+                   query, tags, authorFilter, seriesFilter, minWordCount, maxWordCount,
+                   minRating != null ? minRating.floatValue() : null,
+                   null, // isRead - now handled by readingStatus advanced filter
+                   null, // isFavorite - now handled by readingStatus advanced filter
+                   sortBy, sortDir, page, size, facetBy,
+                   // Advanced filters
+                   createdAfter, createdBefore, lastReadAfter, lastReadBefore,
+                   unratedOnly, readingStatus, hasReadingProgress, hasCoverImage,
+                   sourceDomain, seriesFilter, minTagCount, popularOnly, hiddenGemsOnly);
            return ResponseEntity.ok(results);
-       } else {
-           // Fallback to basic search if Typesense is not available
-           return ResponseEntity.badRequest().body(null);
+       } catch (Exception e) {
+           logger.error("Search failed", e);
+           return ResponseEntity.internalServerError().body(null);
        }
    }

@@ -363,10 +385,12 @@ public class StoryController {
            @RequestParam String query,
            @RequestParam(defaultValue = "5") int limit) {

-       if (typesenseService != null) {
-           List<String> suggestions = typesenseService.searchSuggestions(query, limit);
+       // Use SearchServiceAdapter to handle routing between search engines
+       try {
+           List<String> suggestions = searchServiceAdapter.getTagSuggestions(query, limit);
            return ResponseEntity.ok(suggestions);
-       } else {
+       } catch (Exception e) {
+           logger.error("Failed to get search suggestions", e);
            return ResponseEntity.ok(new ArrayList<>());
        }
    }
StorySearchDto.java
@@ -17,6 +17,7 @@ public class StorySearchDto {

    // Reading status
    private Boolean isRead;
+   private Integer readingPosition;
    private LocalDateTime lastReadAt;

    // Author info
@@ -33,6 +34,9 @@ public class StorySearchDto {
    private LocalDateTime createdAt;
    private LocalDateTime updatedAt;

+   // Alias for createdAt to match frontend expectations
+   private LocalDateTime dateAdded;
+
    // Search-specific fields
    private double searchScore;
    private List<String> highlights;
@@ -121,6 +125,14 @@ public class StorySearchDto {
        this.lastReadAt = lastReadAt;
    }

+   public Integer getReadingPosition() {
+       return readingPosition;
+   }
+
+   public void setReadingPosition(Integer readingPosition) {
+       this.readingPosition = readingPosition;
+   }
+
    public UUID getAuthorId() {
        return authorId;
    }
@@ -177,6 +189,14 @@ public class StorySearchDto {
        this.updatedAt = updatedAt;
    }

+   public LocalDateTime getDateAdded() {
+       return dateAdded;
+   }
+
+   public void setDateAdded(LocalDateTime dateAdded) {
+       this.dateAdded = dateAdded;
+   }
+
    public double getSearchScore() {
        return searchScore;
    }
File diff suppressed because it is too large
SearchMigrationManager.java (new file, 473 lines)
@@ -0,0 +1,473 @@
package com.storycove.service;

import com.storycove.dto.AuthorSearchDto;
import com.storycove.dto.SearchResultDto;
import com.storycove.dto.StorySearchDto;
import com.storycove.entity.Author;
import com.storycove.entity.Story;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

import java.util.List;
import java.util.UUID;

/**
 * TEMPORARY MIGRATION MANAGER - DELETE THIS ENTIRE CLASS WHEN TYPESENSE IS REMOVED
 *
 * This class handles dual-write functionality and engine switching during the
 * migration from Typesense to OpenSearch. It's designed to be completely removed
 * once the migration is complete.
 *
 * CLEANUP INSTRUCTIONS:
 * 1. Delete this entire file: SearchMigrationManager.java
 * 2. Update SearchServiceAdapter to call OpenSearchService directly
 * 3. Remove migration-related configuration properties
 * 4. Remove migration-related admin endpoints and UI
 */
@Component
public class SearchMigrationManager {

    private static final Logger logger = LoggerFactory.getLogger(SearchMigrationManager.class);

    @Autowired(required = false)
    private TypesenseService typesenseService;

    @Autowired(required = false)
    private OpenSearchService openSearchService;

    @Value("${storycove.search.engine:typesense}")
    private String primaryEngine;

    @Value("${storycove.search.dual-write:false}")
    private boolean dualWrite;

    // ===============================
    // READ OPERATIONS (single engine)
    // ===============================

    public SearchResultDto<StorySearchDto> searchStories(String query, List<String> tags, String author,
                                                         String series, Integer minWordCount, Integer maxWordCount,
                                                         Float minRating, Boolean isRead, Boolean isFavorite,
                                                         String sortBy, String sortOrder, int page, int size,
                                                         List<String> facetBy,
                                                         // Advanced filters
                                                         String createdAfter, String createdBefore,
                                                         String lastReadAfter, String lastReadBefore,
                                                         Boolean unratedOnly, String readingStatus,
                                                         Boolean hasReadingProgress, Boolean hasCoverImage,
                                                         String sourceDomain, String seriesFilter,
                                                         Integer minTagCount, Boolean popularOnly,
                                                         Boolean hiddenGemsOnly) {
        boolean openSearchAvailable = openSearchService != null;
        boolean openSearchConnected = openSearchAvailable ? openSearchService.testConnection() : false;
        boolean routingCondition = "opensearch".equalsIgnoreCase(primaryEngine) && openSearchAvailable;

        logger.info("SEARCH ROUTING DEBUG:");
        logger.info("  Primary engine: '{}'", primaryEngine);
        logger.info("  OpenSearch available: {}", openSearchAvailable);
        logger.info("  OpenSearch connected: {}", openSearchConnected);
        logger.info("  Routing condition result: {}", routingCondition);
        logger.info("  Will route to: {}", routingCondition ? "OpenSearch" : "Typesense");

        if (routingCondition) {
            logger.info("ROUTING TO OPENSEARCH");
            return openSearchService.searchStories(query, tags, author, series, minWordCount, maxWordCount,
                    minRating, isRead, isFavorite, sortBy, sortOrder, page, size, facetBy,
                    createdAfter, createdBefore, lastReadAfter, lastReadBefore, unratedOnly, readingStatus,
                    hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter, minTagCount, popularOnly,
                    hiddenGemsOnly);
        } else if (typesenseService != null) {
            logger.info("ROUTING TO TYPESENSE");
            // Convert parameters to match TypesenseService signature
            return typesenseService.searchStories(
                    query, page, size, tags, null, minWordCount, maxWordCount,
                    null, null, null, null, minRating != null ? minRating.intValue() : null,
                    null, null, sortBy, sortOrder, null, null, isRead, isFavorite,
                    author, series, null, null, null);
        } else {
            logger.error("No search service available! Primary engine: {}, OpenSearch: {}, Typesense: {}",
                    primaryEngine, openSearchService != null, typesenseService != null);
            return new SearchResultDto<>(List.of(), 0, page, size, query != null ? query : "", 0);
        }
    }

    public List<StorySearchDto> getRandomStories(int count, List<String> tags, String author,
                                                 String series, Integer minWordCount, Integer maxWordCount,
                                                 Float minRating, Boolean isRead, Boolean isFavorite,
                                                 Long seed) {
        logger.debug("Getting random stories using primary engine: {}", primaryEngine);

        if ("opensearch".equalsIgnoreCase(primaryEngine) && openSearchService != null) {
            return openSearchService.getRandomStories(count, tags, author, series, minWordCount, maxWordCount,
                    minRating, isRead, isFavorite, seed);
        } else if (typesenseService != null) {
            // TypesenseService doesn't have getRandomStories, use random story ID approach
            List<StorySearchDto> results = new java.util.ArrayList<>();
            for (int i = 0; i < count; i++) {
                var randomId = typesenseService.getRandomStoryId(null, tags, seed != null ? seed + i : null);
                // Note: This is a simplified approach - full implementation would need story lookup
            }
            return results;
        } else {
            logger.error("No search service available for random stories");
            return List.of();
        }
    }

    public String getRandomStoryId(Long seed) {
        logger.debug("Getting random story ID using primary engine: {}", primaryEngine);

        if ("opensearch".equalsIgnoreCase(primaryEngine) && openSearchService != null) {
            return openSearchService.getRandomStoryId(seed);
        } else if (typesenseService != null) {
            var randomId = typesenseService.getRandomStoryId(null, null, seed);
            return randomId.map(UUID::toString).orElse(null);
        } else {
            logger.error("No search service available for random story ID");
            return null;
        }
    }

    public List<AuthorSearchDto> searchAuthors(String query, int limit) {
        logger.debug("Searching authors using primary engine: {}", primaryEngine);

        if ("opensearch".equalsIgnoreCase(primaryEngine) && openSearchService != null) {
            return openSearchService.searchAuthors(query, limit);
        } else if (typesenseService != null) {
            var result = typesenseService.searchAuthors(query, 0, limit, null, null);
            return result.getResults();
        } else {
            logger.error("No search service available for author search");
            return List.of();
        }
    }

    public List<String> getTagSuggestions(String query, int limit) {
        logger.debug("Getting tag suggestions using primary engine: {}", primaryEngine);

        if ("opensearch".equalsIgnoreCase(primaryEngine) && openSearchService != null) {
            return openSearchService.getTagSuggestions(query, limit);
        } else if (typesenseService != null) {
            // TypesenseService may not have getTagSuggestions - return empty for now
            logger.warn("Tag suggestions not implemented for Typesense");
            return List.of();
        } else {
            logger.error("No search service available for tag suggestions");
            return List.of();
        }
    }

    // ===============================
    // WRITE OPERATIONS (dual-write capable)
    // ===============================

    public void indexStory(Story story) {
        logger.debug("Indexing story with dual-write: {}, primary engine: {}", dualWrite, primaryEngine);

        // Write to OpenSearch
        if ("opensearch".equalsIgnoreCase(primaryEngine) || dualWrite) {
            if (openSearchService != null) {
                try {
                    openSearchService.indexStory(story);
                    logger.debug("Successfully indexed story {} in OpenSearch", story.getId());
                } catch (Exception e) {
                    logger.error("Failed to index story {} in OpenSearch", story.getId(), e);
                }
            } else {
                logger.warn("OpenSearch service not available for indexing story {}", story.getId());
            }
        }

        // Write to Typesense
        if ("typesense".equalsIgnoreCase(primaryEngine) || dualWrite) {
            if (typesenseService != null) {
                try {
                    typesenseService.indexStory(story);
                    logger.debug("Successfully indexed story {} in Typesense", story.getId());
                } catch (Exception e) {
                    logger.error("Failed to index story {} in Typesense", story.getId(), e);
                }
            } else {
                logger.warn("Typesense service not available for indexing story {}", story.getId());
            }
        }
    }

    public void updateStory(Story story) {
        logger.debug("Updating story with dual-write: {}, primary engine: {}", dualWrite, primaryEngine);

        // Update in OpenSearch
        if ("opensearch".equalsIgnoreCase(primaryEngine) || dualWrite) {
            if (openSearchService != null) {
                try {
                    openSearchService.updateStory(story);
                    logger.debug("Successfully updated story {} in OpenSearch", story.getId());
                } catch (Exception e) {
                    logger.error("Failed to update story {} in OpenSearch", story.getId(), e);
                }
            }
        }

        // Update in Typesense
        if ("typesense".equalsIgnoreCase(primaryEngine) || dualWrite) {
            if (typesenseService != null) {
                try {
                    typesenseService.updateStory(story);
                    logger.debug("Successfully updated story {} in Typesense", story.getId());
                } catch (Exception e) {
                    logger.error("Failed to update story {} in Typesense", story.getId(), e);
                }
            }
        }
    }

    public void deleteStory(UUID storyId) {
        logger.debug("Deleting story with dual-write: {}, primary engine: {}", dualWrite, primaryEngine);

        // Delete from OpenSearch
        if ("opensearch".equalsIgnoreCase(primaryEngine) || dualWrite) {
            if (openSearchService != null) {
                try {
                    openSearchService.deleteStory(storyId);
                    logger.debug("Successfully deleted story {} from OpenSearch", storyId);
                } catch (Exception e) {
                    logger.error("Failed to delete story {} from OpenSearch", storyId, e);
                }
            }
        }

        // Delete from Typesense
        if ("typesense".equalsIgnoreCase(primaryEngine) || dualWrite) {
            if (typesenseService != null) {
                try {
                    typesenseService.deleteStory(storyId.toString());
                    logger.debug("Successfully deleted story {} from Typesense", storyId);
                } catch (Exception e) {
                    logger.error("Failed to delete story {} from Typesense", storyId, e);
                }
            }
        }
    }

    public void indexAuthor(Author author) {
        logger.debug("Indexing author with dual-write: {}, primary engine: {}", dualWrite, primaryEngine);

        // Index in OpenSearch
        if ("opensearch".equalsIgnoreCase(primaryEngine) || dualWrite) {
            if (openSearchService != null) {
                try {
                    openSearchService.indexAuthor(author);
                    logger.debug("Successfully indexed author {} in OpenSearch", author.getId());
                } catch (Exception e) {
                    logger.error("Failed to index author {} in OpenSearch", author.getId(), e);
                }
            }
        }

        // Index in Typesense
        if ("typesense".equalsIgnoreCase(primaryEngine) || dualWrite) {
            if (typesenseService != null) {
                try {
                    typesenseService.indexAuthor(author);
                    logger.debug("Successfully indexed author {} in Typesense", author.getId());
                } catch (Exception e) {
                    logger.error("Failed to index author {} in Typesense", author.getId(), e);
                }
            }
        }
    }

    public void updateAuthor(Author author) {
        logger.debug("Updating author with dual-write: {}, primary engine: {}", dualWrite, primaryEngine);

        // Update in OpenSearch
        if ("opensearch".equalsIgnoreCase(primaryEngine) || dualWrite) {
            if (openSearchService != null) {
                try {
                    openSearchService.updateAuthor(author);
                    logger.debug("Successfully updated author {} in OpenSearch", author.getId());
                } catch (Exception e) {
                    logger.error("Failed to update author {} in OpenSearch", author.getId(), e);
                }
            }
        }

        // Update in Typesense
        if ("typesense".equalsIgnoreCase(primaryEngine) || dualWrite) {
            if (typesenseService != null) {
                try {
                    typesenseService.updateAuthor(author);
                    logger.debug("Successfully updated author {} in Typesense", author.getId());
                } catch (Exception e) {
                    logger.error("Failed to update author {} in Typesense", author.getId(), e);
                }
            }
        }
    }

    public void deleteAuthor(UUID authorId) {
        logger.debug("Deleting author with dual-write: {}, primary engine: {}", dualWrite, primaryEngine);

        // Delete from OpenSearch
        if ("opensearch".equalsIgnoreCase(primaryEngine) || dualWrite) {
            if (openSearchService != null) {
                try {
                    openSearchService.deleteAuthor(authorId);
                    logger.debug("Successfully deleted author {} from OpenSearch", authorId);
                } catch (Exception e) {
                    logger.error("Failed to delete author {} from OpenSearch", authorId, e);
                }
            }
        }

        // Delete from Typesense
        if ("typesense".equalsIgnoreCase(primaryEngine) || dualWrite) {
            if (typesenseService != null) {
                try {
                    typesenseService.deleteAuthor(authorId.toString());
                    logger.debug("Successfully deleted author {} from Typesense", authorId);
                } catch (Exception e) {
                    logger.error("Failed to delete author {} from Typesense", authorId, e);
                }
            }
        }
    }

    public void bulkIndexStories(List<Story> stories) {
        logger.debug("Bulk indexing {} stories with dual-write: {}, primary engine: {}",
                stories.size(), dualWrite, primaryEngine);

        // Bulk index in OpenSearch
        if ("opensearch".equalsIgnoreCase(primaryEngine) || dualWrite) {
            if (openSearchService != null) {
                try {
                    openSearchService.bulkIndexStories(stories);
                    logger.info("Successfully bulk indexed {} stories in OpenSearch", stories.size());
                } catch (Exception e) {
                    logger.error("Failed to bulk index {} stories in OpenSearch", stories.size(), e);
                }
            }
        }

        // Bulk index in Typesense
        if ("typesense".equalsIgnoreCase(primaryEngine) || dualWrite) {
            if (typesenseService != null) {
                try {
                    typesenseService.bulkIndexStories(stories);
                    logger.info("Successfully bulk indexed {} stories in Typesense", stories.size());
                } catch (Exception e) {
                    logger.error("Failed to bulk index {} stories in Typesense", stories.size(), e);
                }
            }
        }
    }

    public void bulkIndexAuthors(List<Author> authors) {
        logger.debug("Bulk indexing {} authors with dual-write: {}, primary engine: {}",
                authors.size(), dualWrite, primaryEngine);

        // Bulk index in OpenSearch
        if ("opensearch".equalsIgnoreCase(primaryEngine) || dualWrite) {
            if (openSearchService != null) {
                try {
                    openSearchService.bulkIndexAuthors(authors);
                    logger.info("Successfully bulk indexed {} authors in OpenSearch", authors.size());
                } catch (Exception e) {
                    logger.error("Failed to bulk index {} authors in OpenSearch", authors.size(), e);
                }
            }
        }

        // Bulk index in Typesense
        if ("typesense".equalsIgnoreCase(primaryEngine) || dualWrite) {
            if (typesenseService != null) {
                try {
                    typesenseService.bulkIndexAuthors(authors);
                    logger.info("Successfully bulk indexed {} authors in Typesense", authors.size());
                } catch (Exception e) {
                    logger.error("Failed to bulk index {} authors in Typesense", authors.size(), e);
                }
            }
        }
    }

    // ===============================
    // UTILITY METHODS
    // ===============================

    public boolean isSearchServiceAvailable() {
        if ("opensearch".equalsIgnoreCase(primaryEngine)) {
            return openSearchService != null && openSearchService.testConnection();
        } else {
            return typesenseService != null;
        }
    }

    public String getCurrentSearchEngine() {
        return primaryEngine;
    }

    public boolean isDualWriteEnabled() {
        return dualWrite;
    }

    public boolean canSwitchToOpenSearch() {
        return openSearchService != null && openSearchService.testConnection();
    }

    public boolean canSwitchToTypesense() {
        return typesenseService != null;
    }

    public OpenSearchService getOpenSearchService() {
        return openSearchService;
    }

    /**
     * Update configuration at runtime (for admin interface)
     * Note: This requires @RefreshScope to work properly
     */
    public void updateConfiguration(String engine, boolean enableDualWrite) {
        logger.info("Updating search configuration: engine={}, dualWrite={}", engine, enableDualWrite);
        this.primaryEngine = engine;
        this.dualWrite = enableDualWrite;
    }

    /**
     * Get current configuration status for admin interface
     */
    public SearchMigrationStatus getStatus() {
        return new SearchMigrationStatus(
                primaryEngine,
                dualWrite,
                typesenseService != null,
                openSearchService != null && openSearchService.testConnection()
        );
    }

    /**
     * DTO for search migration status
     */
    public static class SearchMigrationStatus {
        private final String primaryEngine;
        private final boolean dualWrite;
        private final boolean typesenseAvailable;
        private final boolean openSearchAvailable;

        public SearchMigrationStatus(String primaryEngine, boolean dualWrite,
                                     boolean typesenseAvailable, boolean openSearchAvailable) {
            this.primaryEngine = primaryEngine;
            this.dualWrite = dualWrite;
            this.typesenseAvailable = typesenseAvailable;
            this.openSearchAvailable = openSearchAvailable;
        }

        public String getPrimaryEngine() { return primaryEngine; }
        public boolean isDualWrite() { return dualWrite; }
        public boolean isTypesenseAvailable() { return typesenseAvailable; }
        public boolean isOpenSearchAvailable() { return openSearchAvailable; }
    }
}
SearchServiceAdapter.java (new file, 196 lines)
@@ -0,0 +1,196 @@
package com.storycove.service;

import com.storycove.dto.AuthorSearchDto;
import com.storycove.dto.SearchResultDto;
import com.storycove.dto.StorySearchDto;
import com.storycove.entity.Author;
import com.storycove.entity.Story;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import java.util.List;
import java.util.UUID;

/**
 * Service adapter that provides a unified interface for search operations.
 *
 * This adapter delegates to SearchMigrationManager during the migration period,
 * which will be removed once Typesense is completely eliminated.
 *
 * POST-MIGRATION: This class will be simplified to call OpenSearchService directly.
 */
@Service
public class SearchServiceAdapter {

    private static final Logger logger = LoggerFactory.getLogger(SearchServiceAdapter.class);

    @Autowired
    private SearchMigrationManager migrationManager;

    // ===============================
    // SEARCH OPERATIONS
    // ===============================

    /**
     * Search stories with unified interface
     */
    public SearchResultDto<StorySearchDto> searchStories(String query, List<String> tags, String author,
                                                         String series, Integer minWordCount, Integer maxWordCount,
                                                         Float minRating, Boolean isRead, Boolean isFavorite,
                                                         String sortBy, String sortOrder, int page, int size,
                                                         List<String> facetBy,
                                                         // Advanced filters
                                                         String createdAfter, String createdBefore,
                                                         String lastReadAfter, String lastReadBefore,
                                                         Boolean unratedOnly, String readingStatus,
                                                         Boolean hasReadingProgress, Boolean hasCoverImage,
                                                         String sourceDomain, String seriesFilter,
                                                         Integer minTagCount, Boolean popularOnly,
                                                         Boolean hiddenGemsOnly) {
        return migrationManager.searchStories(query, tags, author, series, minWordCount, maxWordCount,
                minRating, isRead, isFavorite, sortBy, sortOrder, page, size, facetBy,
                createdAfter, createdBefore, lastReadAfter, lastReadBefore, unratedOnly, readingStatus,
                hasReadingProgress, hasCoverImage, sourceDomain, seriesFilter, minTagCount, popularOnly,
                hiddenGemsOnly);
    }

    /**
     * Get random stories with unified interface
     */
    public List<StorySearchDto> getRandomStories(int count, List<String> tags, String author,
                                                 String series, Integer minWordCount, Integer maxWordCount,
                                                 Float minRating, Boolean isRead, Boolean isFavorite,
                                                 Long seed) {
        return migrationManager.getRandomStories(count, tags, author, series, minWordCount, maxWordCount,
                minRating, isRead, isFavorite, seed);
    }

    /**
     * Get random story ID with unified interface
     */
    public String getRandomStoryId(Long seed) {
        return migrationManager.getRandomStoryId(seed);
    }

    /**
     * Search authors with unified interface
     */
    public List<AuthorSearchDto> searchAuthors(String query, int limit) {
        return migrationManager.searchAuthors(query, limit);
    }

    /**
     * Get tag suggestions with unified interface
     */
    public List<String> getTagSuggestions(String query, int limit) {
        return migrationManager.getTagSuggestions(query, limit);
    }

    // ===============================
    // INDEX OPERATIONS
    // ===============================

    /**
     * Index a story with unified interface (supports dual-write)
     */
    public void indexStory(Story story) {
        migrationManager.indexStory(story);
    }

    /**
     * Update a story in the index with unified interface (supports dual-write)
     */
    public void updateStory(Story story) {
        migrationManager.updateStory(story);
    }

    /**
     * Delete a story from the index with unified interface (supports dual-write)
     */
    public void deleteStory(UUID storyId) {
        migrationManager.deleteStory(storyId);
    }

    /**
     * Index an author with unified interface (supports dual-write)
     */
    public void indexAuthor(Author author) {
        migrationManager.indexAuthor(author);
    }

    /**
     * Update an author in the index with unified interface (supports dual-write)
     */
    public void updateAuthor(Author author) {
        migrationManager.updateAuthor(author);
    }

    /**
     * Delete an author from the index with unified interface (supports dual-write)
|
||||
public void deleteAuthor(UUID authorId) {
|
||||
migrationManager.deleteAuthor(authorId);
|
||||
}
|
||||
|
||||
/**
|
||||
* Bulk index stories with unified interface (supports dual-write)
|
||||
*/
|
||||
public void bulkIndexStories(List<Story> stories) {
|
||||
migrationManager.bulkIndexStories(stories);
|
||||
}
|
||||
|
||||
/**
|
||||
* Bulk index authors with unified interface (supports dual-write)
|
||||
*/
|
||||
public void bulkIndexAuthors(List<Author> authors) {
|
||||
migrationManager.bulkIndexAuthors(authors);
|
||||
}
|
||||
|
||||
// ===============================
|
||||
// UTILITY METHODS
|
||||
// ===============================
|
||||
|
||||
/**
|
||||
* Check if search service is available and healthy
|
||||
*/
|
||||
public boolean isSearchServiceAvailable() {
|
||||
return migrationManager.isSearchServiceAvailable();
|
||||
}
|
||||
|
||||
/**
|
||||
* Get current search engine name
|
||||
*/
|
||||
public String getCurrentSearchEngine() {
|
||||
return migrationManager.getCurrentSearchEngine();
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if dual-write is enabled
|
||||
*/
|
||||
public boolean isDualWriteEnabled() {
|
||||
return migrationManager.isDualWriteEnabled();
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if we can switch to OpenSearch
|
||||
*/
|
||||
public boolean canSwitchToOpenSearch() {
|
||||
return migrationManager.canSwitchToOpenSearch();
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if we can switch to Typesense
|
||||
*/
|
||||
public boolean canSwitchToTypesense() {
|
||||
return migrationManager.canSwitchToTypesense();
|
||||
}
|
||||
|
||||
/**
|
||||
* Get current migration status for admin interface
|
||||
*/
|
||||
public SearchMigrationManager.SearchMigrationStatus getMigrationStatus() {
|
||||
return migrationManager.getStatus();
|
||||
}
|
||||
}
|
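Domain code is meant to go through this adapter rather than either engine directly, so switching engines stays invisible to callers. A minimal sketch of a hypothetical caller, assuming a Spring Data repository named StoryRepository (not shown in this commit excerpt):

```java
// Hypothetical caller: persist the entity, then keep the search index in step.
@Service
public class StoryService {

    @Autowired
    private StoryRepository storyRepository;        // assumed Spring Data repository

    @Autowired
    private SearchServiceAdapter searchServiceAdapter;

    public Story save(Story story) {
        Story saved = storyRepository.save(story);
        searchServiceAdapter.indexStory(saved);     // dual-writes to both engines when enabled
        return saved;
    }
}
```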
@@ -19,6 +19,12 @@ spring:
      max-file-size: 256MB # Increased for backup restore
      max-request-size: 260MB # Slightly higher to account for form data

  jackson:
    serialization:
      write-dates-as-timestamps: false
    deserialization:
      adjust-dates-to-context-time-zone: false

server:
  port: 8080

@@ -34,6 +40,7 @@ storycove:
  password: ${APP_PASSWORD} # REQUIRED: No default password for security
  search:
    engine: ${SEARCH_ENGINE:typesense} # typesense or opensearch
    dual-write: ${SEARCH_DUAL_WRITE:false} # enable dual-write during migration
  typesense:
    api-key: ${TYPESENSE_API_KEY:xyz}
    host: ${TYPESENSE_HOST:localhost}
@@ -44,9 +51,9 @@ storycove:
    # Connection settings
    host: ${OPENSEARCH_HOST:localhost}
    port: ${OPENSEARCH_PORT:9200}
    scheme: ${OPENSEARCH_SCHEME:https}
    username: ${OPENSEARCH_USERNAME:admin}
    password: ${OPENSEARCH_PASSWORD} # REQUIRED when using OpenSearch
    scheme: ${OPENSEARCH_SCHEME:http}
    username: ${OPENSEARCH_USERNAME:}
    password: ${OPENSEARCH_PASSWORD:} # Empty when security is disabled

  # Environment-specific configuration
  profile: ${SPRING_PROFILES_ACTIVE:development} # development, staging, production
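The storycove.search.* and storycove.opensearch.* keys above are presumably bound into the backend beans through ordinary property injection. A minimal sketch of what that binding could look like; the exact mechanism (@Value vs. @ConfigurationProperties) is an assumption, and the OpenSearch field names here are illustrative only:

```java
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;

// Illustrative sketch of binding the migration-related properties shown above.
@Service
public class SearchEngineSettingsSketch {

    @Value("${storycove.search.engine}")
    private String primaryEngine;      // "typesense" or "opensearch"

    @Value("${storycove.search.dual-write}")
    private boolean dualWrite;         // write to both engines while migrating

    @Value("${storycove.opensearch.host}")
    private String openSearchHost;

    @Value("${storycove.opensearch.port}")
    private int openSearchPort;

    @Value("${storycove.opensearch.scheme}")
    private String openSearchScheme;   // "http" once the security plugin is disabled
}
```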
@@ -2,3 +2,4 @@
# https://curl.se/docs/http-cookies.html
# This file was generated by libcurl! Edit at your own risk.

#HttpOnly_localhost FALSE / FALSE 1758433252 token eyJhbGciOiJIUzUxMiJ9.eyJzdWIiOiJ1c2VyIiwiaWF0IjoxNzU4MzQ2ODUyLCJleHAiOjE3NTg0MzMyNTIsImxpYnJhcnlJZCI6InNlY3JldCJ9.zEAQT5_11-pxPxmIhufSQqE26hvHldde4kFNE2HWWgBa5lT_Wt7jwpoPUMkQGQfShQwDZ9N-hFX3R2ew8jD7WQ

@@ -39,9 +39,8 @@ services:
      - TYPESENSE_PORT=8108
      - OPENSEARCH_HOST=opensearch
      - OPENSEARCH_PORT=9200
      - OPENSEARCH_USERNAME=${OPENSEARCH_USERNAME:-admin}
      - OPENSEARCH_PASSWORD=${OPENSEARCH_PASSWORD}
      - SEARCH_ENGINE=${SEARCH_ENGINE:-typesense}
      - OPENSEARCH_SCHEME=http
      - SEARCH_ENGINE=${SEARCH_ENGINE:-opensearch}
      - IMAGE_STORAGE_PATH=/app/images
      - APP_PASSWORD=${APP_PASSWORD}
      - STORYCOVE_CORS_ALLOWED_ORIGINS=${STORYCOVE_CORS_ALLOWED_ORIGINS:-http://localhost:3000,http://localhost:6925}
@@ -87,11 +86,10 @@ services:
      - cluster.name=storycove-opensearch
      - node.name=opensearch-node
      - discovery.type=single-node
      - bootstrap.memory_lock=true
      - "OPENSEARCH_JAVA_OPTS=-Xms256m -Xmx256m"
      - bootstrap.memory_lock=false
      - "OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx512m"
      - "DISABLE_INSTALL_DEMO_CONFIG=true"
      - "DISABLE_SECURITY_PLUGIN=false"
      - "OPENSEARCH_INITIAL_ADMIN_PASSWORD=${OPENSEARCH_PASSWORD}"
      - "DISABLE_SECURITY_PLUGIN=true"
    ulimits:
      memlock:
        soft: -1
@@ -103,15 +101,15 @@ services:
      - opensearch_data:/usr/share/opensearch/data
    networks:
      - storycove-network
    restart: unless-stopped

  opensearch-dashboards:
    image: opensearchproject/opensearch-dashboards:3.2.0
    # No port mapping - only accessible within the Docker network
    ports:
      - "5601:5601" # Expose OpenSearch Dashboard
    environment:
      - OPENSEARCH_HOSTS=https://opensearch:9200
      - "OPENSEARCH_USERNAME=${OPENSEARCH_USERNAME:-admin}"
      - "OPENSEARCH_PASSWORD=${OPENSEARCH_PASSWORD}"
      - "DISABLE_SECURITY_DASHBOARDS_PLUGIN=false"
      - OPENSEARCH_HOSTS=http://opensearch:9200
      - "DISABLE_SECURITY_DASHBOARDS_PLUGIN=true"
    depends_on:
      - opensearch
    networks:
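With DISABLE_SECURITY_PLUGIN=true the node answers plain HTTP without credentials, so a quick smoke test of the container needs no auth. A minimal sketch using only the JDK HTTP client, assuming the node's 9200 port is reachable from the host:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OpenSearchHealthCheck {
    public static void main(String[] args) throws Exception {
        // Plain HTTP, no credentials: matches the compose settings above (security plugin disabled).
        // Adjust host/port if your port mapping differs.
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(
                URI.create("http://localhost:9200/_cluster/health")).GET().build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```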
@@ -49,7 +49,7 @@ export default function StoryReadingPage() {
    ));

    // Convert to character position in the plain text content
    const textLength = story.contentPlain?.length || story.contentHtml.length;
    const textLength = story.contentPlain?.length || story.contentHtml?.length || 0;
    return Math.floor(scrollRatio * textLength);
  }, [story]);

@@ -57,7 +57,7 @@ export default function StoryReadingPage() {
  const calculateReadingPercentage = useCallback((currentPosition: number): number => {
    if (!story) return 0;

    const totalLength = story.contentPlain?.length || story.contentHtml.length;
    const totalLength = story.contentPlain?.length || story.contentHtml?.length || 0;
    if (totalLength === 0) return 0;

    return Math.round((currentPosition / totalLength) * 100);
@@ -67,7 +67,7 @@ export default function StoryReadingPage() {
  const scrollToCharacterPosition = useCallback((position: number) => {
    if (!contentRef.current || !story || hasScrolledToPosition) return;

    const textLength = story.contentPlain?.length || story.contentHtml.length;
    const textLength = story.contentPlain?.length || story.contentHtml?.length || 0;
    if (textLength === 0 || position === 0) return;

    const ratio = position / textLength;

@@ -40,7 +40,7 @@ export default function CollectionReadingView({
    ));

    // Convert to character position in the plain text content
    const textLength = story.contentPlain?.length || story.contentHtml.length;
    const textLength = story.contentPlain?.length || story.contentHtml?.length || 0;
    return Math.floor(scrollRatio * textLength);
  }, [story]);

@@ -48,7 +48,7 @@ export default function CollectionReadingView({
  const calculateReadingPercentage = useCallback((currentPosition: number): number => {
    if (!story) return 0;

    const totalLength = story.contentPlain?.length || story.contentHtml.length;
    const totalLength = story.contentPlain?.length || story.contentHtml?.length || 0;
    if (totalLength === 0) return 0;

    return Math.round((currentPosition / totalLength) * 100);
@@ -58,7 +58,7 @@ export default function CollectionReadingView({
  const scrollToCharacterPosition = useCallback((position: number) => {
    if (!contentRef.current || !story || hasScrolledToPosition) return;

    const textLength = story.contentPlain?.length || story.contentHtml.length;
    const textLength = story.contentPlain?.length || story.contentHtml?.length || 0;
    if (textLength === 0 || position === 0) return;

    const ratio = position / textLength;

@@ -127,29 +127,6 @@ const FILTER_PRESETS: FilterPreset[] = [
    description: 'Stories that are part of a series',
    filters: { seriesFilter: 'series' },
    category: 'content'
  },

  // Organization presets
  {
    id: 'well-tagged',
    label: '3+ tags',
    description: 'Well-tagged stories with 3 or more tags',
    filters: { minTagCount: 3 },
    category: 'organization'
  },
  {
    id: 'popular',
    label: 'Popular',
    description: 'Stories with above-average ratings',
    filters: { popularOnly: true },
    category: 'organization'
  },
  {
    id: 'hidden-gems',
    label: 'Hidden Gems',
    description: 'Underrated or unrated stories to discover',
    filters: { hiddenGemsOnly: true },
    category: 'organization'
  }
];

@@ -1,14 +1,39 @@
'use client';

import { useState } from 'react';
import React, { useState, useEffect } from 'react';
import Button from '../ui/Button';
import { storyApi, authorApi, databaseApi, configApi } from '../../lib/api';
import { storyApi, authorApi, databaseApi, configApi, searchAdminApi } from '../../lib/api';

interface SystemSettingsProps {
  // No props needed - this component manages its own state
}

export default function SystemSettings({}: SystemSettingsProps) {
  const [searchEngineStatus, setSearchEngineStatus] = useState<{
    currentEngine: string;
    dualWrite: boolean;
    typesenseAvailable: boolean;
    openSearchAvailable: boolean;
    loading: boolean;
    message: string;
    success?: boolean;
  }>({
    currentEngine: 'typesense',
    dualWrite: false,
    typesenseAvailable: false,
    openSearchAvailable: false,
    loading: false,
    message: ''
  });

  const [openSearchStatus, setOpenSearchStatus] = useState<{
    reindex: { loading: boolean; message: string; success?: boolean };
    recreate: { loading: boolean; message: string; success?: boolean };
  }>({
    reindex: { loading: false, message: '' },
    recreate: { loading: false, message: '' }
  });

  const [typesenseStatus, setTypesenseStatus] = useState<{
    reindex: { loading: boolean; message: string; success?: boolean };
    recreate: { loading: boolean; message: string; success?: boolean };
@@ -419,13 +444,323 @@ export default function SystemSettings({}: SystemSettingsProps) {
    }, 10000);
  };

  // Search Engine Management Functions
  const loadSearchEngineStatus = async () => {
    try {
      const status = await searchAdminApi.getStatus();
      setSearchEngineStatus(prev => ({
        ...prev,
        currentEngine: status.primaryEngine,
        dualWrite: status.dualWrite,
        typesenseAvailable: status.typesenseAvailable,
        openSearchAvailable: status.openSearchAvailable,
      }));
    } catch (error: any) {
      console.error('Failed to load search engine status:', error);
    }
  };

  const handleSwitchEngine = async (engine: string) => {
    setSearchEngineStatus(prev => ({ ...prev, loading: true, message: `Switching to ${engine}...` }));

    try {
      const result = engine === 'opensearch'
        ? await searchAdminApi.switchToOpenSearch()
        : await searchAdminApi.switchToTypesense();

      setSearchEngineStatus(prev => ({
        ...prev,
        loading: false,
        message: result.message,
        success: true,
        currentEngine: engine
      }));

      setTimeout(() => {
        setSearchEngineStatus(prev => ({ ...prev, message: '', success: undefined }));
      }, 5000);
    } catch (error: any) {
      setSearchEngineStatus(prev => ({
        ...prev,
        loading: false,
        message: error.message || 'Failed to switch engine',
        success: false
      }));

      setTimeout(() => {
        setSearchEngineStatus(prev => ({ ...prev, message: '', success: undefined }));
      }, 5000);
    }
  };

  const handleToggleDualWrite = async () => {
    const newDualWrite = !searchEngineStatus.dualWrite;
    setSearchEngineStatus(prev => ({
      ...prev,
      loading: true,
      message: `${newDualWrite ? 'Enabling' : 'Disabling'} dual-write...`
    }));

    try {
      const result = newDualWrite
        ? await searchAdminApi.enableDualWrite()
        : await searchAdminApi.disableDualWrite();

      setSearchEngineStatus(prev => ({
        ...prev,
        loading: false,
        message: result.message,
        success: true,
        dualWrite: newDualWrite
      }));

      setTimeout(() => {
        setSearchEngineStatus(prev => ({ ...prev, message: '', success: undefined }));
      }, 5000);
    } catch (error: any) {
      setSearchEngineStatus(prev => ({
        ...prev,
        loading: false,
        message: error.message || 'Failed to toggle dual-write',
        success: false
      }));

      setTimeout(() => {
        setSearchEngineStatus(prev => ({ ...prev, message: '', success: undefined }));
      }, 5000);
    }
  };

  const handleOpenSearchReindex = async () => {
    setOpenSearchStatus(prev => ({
      ...prev,
      reindex: { loading: true, message: 'Reindexing OpenSearch...', success: undefined }
    }));

    try {
      const result = await searchAdminApi.reindexOpenSearch();

      setOpenSearchStatus(prev => ({
        ...prev,
        reindex: {
          loading: false,
          message: result.success ? result.message : (result.error || 'Reindex failed'),
          success: result.success
        }
      }));

      setTimeout(() => {
        setOpenSearchStatus(prev => ({
          ...prev,
          reindex: { loading: false, message: '', success: undefined }
        }));
      }, 8000);
    } catch (error: any) {
      setOpenSearchStatus(prev => ({
        ...prev,
        reindex: {
          loading: false,
          message: error.message || 'Network error occurred',
          success: false
        }
      }));

      setTimeout(() => {
        setOpenSearchStatus(prev => ({
          ...prev,
          reindex: { loading: false, message: '', success: undefined }
        }));
      }, 8000);
    }
  };

  const handleOpenSearchRecreate = async () => {
    setOpenSearchStatus(prev => ({
      ...prev,
      recreate: { loading: true, message: 'Recreating OpenSearch indices...', success: undefined }
    }));

    try {
      const result = await searchAdminApi.recreateOpenSearchIndices();

      setOpenSearchStatus(prev => ({
        ...prev,
        recreate: {
          loading: false,
          message: result.success ? result.message : (result.error || 'Recreation failed'),
          success: result.success
        }
      }));

      setTimeout(() => {
        setOpenSearchStatus(prev => ({
          ...prev,
          recreate: { loading: false, message: '', success: undefined }
        }));
      }, 8000);
    } catch (error: any) {
      setOpenSearchStatus(prev => ({
        ...prev,
        recreate: {
          loading: false,
          message: error.message || 'Network error occurred',
          success: false
        }
      }));

      setTimeout(() => {
        setOpenSearchStatus(prev => ({
          ...prev,
          recreate: { loading: false, message: '', success: undefined }
        }));
      }, 8000);
    }
  };

  // Load status on component mount
  useEffect(() => {
    loadSearchEngineStatus();
  }, []);

  return (
    <div className="space-y-6">
      {/* Typesense Search Management */}
      {/* Search Engine Management */}
      <div className="theme-card theme-shadow rounded-lg p-6">
        <h2 className="text-xl font-semibold theme-header mb-4">Search Index Management</h2>
        <h2 className="text-xl font-semibold theme-header mb-4">Search Engine Migration</h2>
        <p className="theme-text mb-6">
          Manage all Typesense search indexes (stories, authors, collections, etc.). Use these tools if search functionality isn't working properly.
          Manage the transition from Typesense to OpenSearch. Switch between engines, enable dual-write mode, and perform maintenance operations.
        </p>

        <div className="space-y-6">
          {/* Current Status */}
          <div className="border theme-border rounded-lg p-4">
            <h3 className="text-lg font-semibold theme-header mb-3">Current Configuration</h3>
            <div className="grid grid-cols-1 sm:grid-cols-2 gap-3 text-sm">
              <div className="flex justify-between">
                <span>Primary Engine:</span>
                <span className={`font-medium ${searchEngineStatus.currentEngine === 'opensearch' ? 'text-blue-600 dark:text-blue-400' : 'text-green-600 dark:text-green-400'}`}>
                  {searchEngineStatus.currentEngine.charAt(0).toUpperCase() + searchEngineStatus.currentEngine.slice(1)}
                </span>
              </div>
              <div className="flex justify-between">
                <span>Dual-Write:</span>
                <span className={`font-medium ${searchEngineStatus.dualWrite ? 'text-orange-600 dark:text-orange-400' : 'text-gray-600 dark:text-gray-400'}`}>
                  {searchEngineStatus.dualWrite ? 'Enabled' : 'Disabled'}
                </span>
              </div>
              <div className="flex justify-between">
                <span>Typesense:</span>
                <span className={`font-medium ${searchEngineStatus.typesenseAvailable ? 'text-green-600 dark:text-green-400' : 'text-red-600 dark:text-red-400'}`}>
                  {searchEngineStatus.typesenseAvailable ? 'Available' : 'Unavailable'}
                </span>
              </div>
              <div className="flex justify-between">
                <span>OpenSearch:</span>
                <span className={`font-medium ${searchEngineStatus.openSearchAvailable ? 'text-green-600 dark:text-green-400' : 'text-red-600 dark:text-red-400'}`}>
                  {searchEngineStatus.openSearchAvailable ? 'Available' : 'Unavailable'}
                </span>
              </div>
            </div>
          </div>

          {/* Engine Switching */}
          <div className="border theme-border rounded-lg p-4">
            <h3 className="text-lg font-semibold theme-header mb-3">Engine Controls</h3>
            <div className="flex flex-col sm:flex-row gap-3 mb-4">
              <Button
                onClick={() => handleSwitchEngine('typesense')}
                disabled={searchEngineStatus.loading || !searchEngineStatus.typesenseAvailable || searchEngineStatus.currentEngine === 'typesense'}
                variant={searchEngineStatus.currentEngine === 'typesense' ? 'primary' : 'ghost'}
                className="flex-1"
              >
                {searchEngineStatus.currentEngine === 'typesense' ? '✓ Typesense (Active)' : 'Switch to Typesense'}
              </Button>
              <Button
                onClick={() => handleSwitchEngine('opensearch')}
                disabled={searchEngineStatus.loading || !searchEngineStatus.openSearchAvailable || searchEngineStatus.currentEngine === 'opensearch'}
                variant={searchEngineStatus.currentEngine === 'opensearch' ? 'primary' : 'ghost'}
                className="flex-1"
              >
                {searchEngineStatus.currentEngine === 'opensearch' ? '✓ OpenSearch (Active)' : 'Switch to OpenSearch'}
              </Button>
              <Button
                onClick={handleToggleDualWrite}
                disabled={searchEngineStatus.loading}
                variant={searchEngineStatus.dualWrite ? 'secondary' : 'ghost'}
                className="flex-1"
              >
                {searchEngineStatus.dualWrite ? 'Disable Dual-Write' : 'Enable Dual-Write'}
              </Button>
            </div>

            {searchEngineStatus.message && (
              <div className={`text-sm p-3 rounded mb-3 ${
                searchEngineStatus.success
                  ? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
                  : 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
              }`}>
                {searchEngineStatus.message}
              </div>
            )}
          </div>

          {/* OpenSearch Operations */}
          <div className="border theme-border rounded-lg p-4">
            <h3 className="text-lg font-semibold theme-header mb-3">OpenSearch Operations</h3>
            <p className="text-sm theme-text mb-4">
              Perform maintenance operations on OpenSearch indices. Use these if OpenSearch isn't returning expected results.
            </p>

            <div className="flex flex-col sm:flex-row gap-3 mb-4">
              <Button
                onClick={handleOpenSearchReindex}
                disabled={openSearchStatus.reindex.loading || openSearchStatus.recreate.loading || !searchEngineStatus.openSearchAvailable}
                loading={openSearchStatus.reindex.loading}
                variant="ghost"
                className="flex-1"
              >
                {openSearchStatus.reindex.loading ? 'Reindexing...' : '🔄 Reindex OpenSearch'}
              </Button>
              <Button
                onClick={handleOpenSearchRecreate}
                disabled={openSearchStatus.reindex.loading || openSearchStatus.recreate.loading || !searchEngineStatus.openSearchAvailable}
                loading={openSearchStatus.recreate.loading}
                variant="secondary"
                className="flex-1"
              >
                {openSearchStatus.recreate.loading ? 'Recreating...' : '🏗️ Recreate Indices'}
              </Button>
            </div>

            {/* OpenSearch Status Messages */}
            {openSearchStatus.reindex.message && (
              <div className={`text-sm p-3 rounded mb-3 ${
                openSearchStatus.reindex.success
                  ? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
                  : 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
              }`}>
                {openSearchStatus.reindex.message}
              </div>
            )}

            {openSearchStatus.recreate.message && (
              <div className={`text-sm p-3 rounded mb-3 ${
                openSearchStatus.recreate.success
                  ? 'bg-green-50 dark:bg-green-900/20 text-green-800 dark:text-green-200'
                  : 'bg-red-50 dark:bg-red-900/20 text-red-800 dark:text-red-200'
              }`}>
                {openSearchStatus.recreate.message}
              </div>
            )}
          </div>
        </div>
      </div>

      {/* Legacy Typesense Management */}
      <div className="theme-card theme-shadow rounded-lg p-6">
        <h2 className="text-xl font-semibold theme-header mb-4">Legacy Typesense Management</h2>
        <p className="theme-text mb-6">
          Manage Typesense search indexes (for backwards compatibility and during migration). These tools will be removed once migration is complete.
        </p>

        <div className="space-y-6">

@@ -28,6 +28,23 @@ export default function StoryCard({
  const [rating, setRating] = useState(story.rating || 0);
  const [updating, setUpdating] = useState(false);

  // Helper function to get tags from either tags array or tagNames array
  const getTags = () => {
    if (Array.isArray(story.tags) && story.tags.length > 0) {
      return story.tags;
    }
    if (Array.isArray(story.tagNames) && story.tagNames.length > 0) {
      // Convert tagNames to Tag objects for display compatibility
      return story.tagNames.map((name, index) => ({
        id: `tag-${index}`, // Temporary ID for display
        name: name
      }));
    }
    return [];
  };

  const displayTags = getTags();

  const handleRatingClick = async (e: React.MouseEvent, newRating: number) => {
    // Prevent default and stop propagation to avoid triggering navigation
    e.preventDefault();
@@ -58,7 +75,7 @@ export default function StoryCard({
  const calculateReadingPercentage = (story: Story): number => {
    if (!story.readingPosition) return 0;

    const totalLength = story.contentPlain?.length || story.contentHtml.length;
    const totalLength = story.contentPlain?.length || story.contentHtml?.length || 0;
    if (totalLength === 0) return 0;

    return Math.round((story.readingPosition / totalLength) * 100);
@@ -124,9 +141,9 @@ export default function StoryCard({
      </div>

      {/* Tags */}
      {Array.isArray(story.tags) && story.tags.length > 0 && (
      {displayTags.length > 0 && (
        <div className="flex flex-wrap gap-1 mt-2">
          {story.tags.slice(0, 3).map((tag) => (
          {displayTags.slice(0, 3).map((tag) => (
            <TagDisplay
              key={tag.id}
              tag={tag}
@@ -134,9 +151,9 @@ export default function StoryCard({
              clickable={false}
            />
          ))}
          {story.tags.length > 3 && (
          {displayTags.length > 3 && (
            <span className="px-2 py-1 text-xs theme-text">
              +{story.tags.length - 3} more
              +{displayTags.length - 3} more
            </span>
          )}
        </div>
@@ -260,9 +277,9 @@ export default function StoryCard({
      </div>

      {/* Tags */}
      {Array.isArray(story.tags) && story.tags.length > 0 && (
      {displayTags.length > 0 && (
        <div className="flex flex-wrap gap-1 mt-2">
          {story.tags.slice(0, 2).map((tag) => (
          {displayTags.slice(0, 2).map((tag) => (
            <TagDisplay
              key={tag.id}
              tag={tag}
@@ -270,9 +287,9 @@ export default function StoryCard({
              clickable={false}
            />
          ))}
          {story.tags.length > 2 && (
          {displayTags.length > 2 && (
            <span className="px-2 py-1 text-xs theme-text">
              +{story.tags.length - 2}
              +{displayTags.length - 2}
            </span>
          )}
        </div>

@@ -611,6 +611,79 @@ export const configApi = {
  },
};

// Search Engine Management API
export const searchAdminApi = {
  // Get migration status
  getStatus: async (): Promise<{
    primaryEngine: string;
    dualWrite: boolean;
    typesenseAvailable: boolean;
    openSearchAvailable: boolean;
  }> => {
    const response = await api.get('/admin/search/status');
    return response.data;
  },

  // Configure search engine
  configure: async (config: { engine: string; dualWrite: boolean }): Promise<{ message: string }> => {
    const response = await api.post('/admin/search/configure', config);
    return response.data;
  },

  // Enable/disable dual-write
  enableDualWrite: async (): Promise<{ message: string }> => {
    const response = await api.post('/admin/search/dual-write/enable');
    return response.data;
  },

  disableDualWrite: async (): Promise<{ message: string }> => {
    const response = await api.post('/admin/search/dual-write/disable');
    return response.data;
  },

  // Switch engines
  switchToOpenSearch: async (): Promise<{ message: string }> => {
    const response = await api.post('/admin/search/switch/opensearch');
    return response.data;
  },

  switchToTypesense: async (): Promise<{ message: string }> => {
    const response = await api.post('/admin/search/switch/typesense');
    return response.data;
  },

  // Emergency rollback
  emergencyRollback: async (): Promise<{ message: string }> => {
    const response = await api.post('/admin/search/emergency-rollback');
    return response.data;
  },

  // OpenSearch operations
  reindexOpenSearch: async (): Promise<{
    success: boolean;
    message: string;
    storiesCount?: number;
    authorsCount?: number;
    totalCount?: number;
    error?: string;
  }> => {
    const response = await api.post('/admin/search/opensearch/reindex');
    return response.data;
  },

  recreateOpenSearchIndices: async (): Promise<{
    success: boolean;
    message: string;
    storiesCount?: number;
    authorsCount?: number;
    totalCount?: number;
    error?: string;
  }> => {
    const response = await api.post('/admin/search/opensearch/recreate');
    return response.data;
  },
};

// Collection endpoints
export const collectionApi = {
  getCollections: async (params?: {
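The searchAdminApi client above implies a matching set of /admin/search/* endpoints on the backend. A hypothetical sketch of that controller shape, wired to the SearchMigrationManager from the first file in this diff; the paths come from the client, while the method bodies and class name are illustrative only:

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.Map;

// Hypothetical controller shape backing the /admin/search/* calls made by searchAdminApi.
@RestController
@RequestMapping("/admin/search")
public class AdminSearchControllerSketch {

    @Autowired
    private SearchMigrationManager migrationManager;

    @GetMapping("/status")
    public ResponseEntity<SearchMigrationManager.SearchMigrationStatus> status() {
        // Serializes to { primaryEngine, dualWrite, typesenseAvailable, openSearchAvailable },
        // matching the fields the frontend getStatus() expects.
        return ResponseEntity.ok(migrationManager.getStatus());
    }

    @PostMapping("/switch/opensearch")
    public ResponseEntity<Map<String, String>> switchToOpenSearch() {
        migrationManager.updateConfiguration("opensearch", migrationManager.isDualWriteEnabled());
        return ResponseEntity.ok(Map.of("message", "Switched primary engine to OpenSearch"));
    }

    @PostMapping("/dual-write/enable")
    public ResponseEntity<Map<String, String>> enableDualWrite() {
        migrationManager.updateConfiguration(migrationManager.getCurrentSearchEngine(), true);
        return ResponseEntity.ok(Map.of("message", "Dual-write enabled"));
    }
}
```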