
Lucene/Hibernate Search lock exceptions

I am using Hibernate Search to index and run full-text searches over items in my web application, with no problems so far.

From my pom.xml:

<hibernate.search.version>3.4.2.Final</hibernate.search.version> 
<apache.lucene.version>3.6.2</apache.lucene.version> 
<apache.solr.version>3.6.2</apache.solr.version> 
<hibernate.version>3.6.9.Final</hibernate.version> 

Now, before going to production, I want to stress test my web application's search functionality with Apache JMeter. When testing with more than one thread, I get tons of the following exceptions:

17:11:57,670 ERROR LogErrorHandler:82 - Exception occurred org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out: [email protected]/opt/myapp/item_index/myapp.item.domain.Item/write.lock 
Primary Failure: 
    Entity myapp.item.domain.Item Id 4 Work Type org.hibernate.search.backend.DeleteLuceneWork 
org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out: [email protected]/opt/myapp/item_index/myapp.item.domain.Item/write.lock 
    at org.apache.lucene.store.Lock.obtain(Lock.java:84) 
    at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:1098) 
    at org.hibernate.search.backend.Workspace.createNewIndexWriter(Workspace.java:202) 
    at org.hibernate.search.backend.Workspace.getIndexWriter(Workspace.java:180) 
    at org.hibernate.search.backend.impl.lucene.PerDPQueueProcessor.run(PerDPQueueProcessor.java:103) 
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441) 
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:138) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908) 
    at java.lang.Thread.run(Thread.java:662) 
17:11:57,670 ERROR PerDPQueueProcessor:118 - Unexpected error in Lucene Backend: 
org.hibernate.search.SearchException: Unable to remove class myapp.item.domain.Item#4 from index. 
    at org.hibernate.search.backend.impl.lucene.works.DeleteExtWorkDelegate.performWork(DeleteExtWorkDelegate.java:77) 
    at org.hibernate.search.backend.impl.lucene.PerDPQueueProcessor.run(PerDPQueueProcessor.java:106) 
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441) 
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:138) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908) 
    at java.lang.Thread.run(Thread.java:662) 
Caused by: java.lang.NullPointerException 
    at org.hibernate.search.backend.impl.lucene.works.DeleteExtWorkDelegate.performWork(DeleteExtWorkDelegate.java:72) 
    ... 7 more 

The index is created initially using the following method (called from a @Transactional method):

@Override 
public void createInitialIndexFromDB() { 
    // get the session and set the flush mode to manual so that we control when changes are flushed 
    FullTextSession fullTextSession = getFullTextSession(); 
    fullTextSession.setFlushMode(FlushMode.MANUAL); 
    // do not add any data to the object context 
    fullTextSession.setCacheMode(CacheMode.IGNORE); 

    addResultsToIndex(fullTextSession, FETCH_ITEMS_TO_INDEX); 
    addResultsToIndex(fullTextSession, FETCH_DRAFTS_TO_INDEX); 
    addResultsToIndex(fullTextSession, FETCH_RESERVATIONS_TO_INDEX); 
    addResultsToIndex(fullTextSession, FETCH_SALES_TO_INDEX); 

    fullTextSession.flushToIndexes(); 
    fullTextSession.clear(); 
} 

private void addResultsToIndex(FullTextSession fullTextSession, String query) { 
    ScrollableResults results = fullTextSession.createQuery(query).scroll(
      ScrollMode.FORWARD_ONLY); 
    for (int index = 1; results.next(); index++) { 
     fullTextSession.index(results.get(0)); 
     if (index % BATCH_SIZE == 0 || results.isLast()) { 
      fullTextSession.flushToIndexes(); 
      fullTextSession.clear(); 
     } 
    } 
} 

private FullTextSession getFullTextSession() { 
    Session session = this.sessionFactory.getCurrentSession(); 
    return Search.getFullTextSession(session); 
} 
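
As an aside, Hibernate Search 3.x also ships a MassIndexer that can perform this kind of initial rebuild in batches on dedicated worker threads. A minimal sketch, assuming the same getFullTextSession() helper and the indexed entity classes used elsewhere in this question (the method name is illustrative):

public void rebuildIndexWithMassIndexer() throws InterruptedException { 
    FullTextSession fullTextSession = getFullTextSession(); 
    // index the four entity types in batches on the MassIndexer's own threads 
    fullTextSession.createIndexer(Item.class, Draft.class, ItemReservation.class, ItemSale.class) 
      .batchSizeToLoadObjects(BATCH_SIZE) // reuse the batch size from above 
      .cacheMode(CacheMode.IGNORE)        // skip the second-level cache, as above 
      .threadsToLoadObjects(2)            // modest parallelism for entity loading 
      .startAndWait();                    // block until the rebuild is finished 
} 

Because the MassIndexer drives the indexing backend directly instead of going through entity events, it bypasses the custom listener shown below during the initial rebuild.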

After the index has been created, every index change goes through a custom FullTextIndexEventListener:

public final class HibernateItemEventListener extends 
     FullTextIndexEventListener { 

    private static final Logger log = LoggerFactory 
      .getLogger(HibernateItemEventListener.class); 

    public HibernateItemEventListener() { 
     super(Installation.SINGLE_INSTANCE); 
    } 

    @Override 
    public void onPostInsert(PostInsertEvent event) { 

     log.debug("onPostInsert"); 
     if (!isIndexed(event.getEntity())) 
      return; 

     // Without these checks the elements are added twice to the index! 
     if (event.getEntity() instanceof ItemReservation) 
      return; 

     if (event.getEntity() instanceof ItemSale) 
      return; 

     super.onPostInsert(event); 
    } 

    @Override 
    public void onPostUpdate(PostUpdateEvent event) { 
     log.debug("onPostUpdate - Start"); 
     if (!isIndexed(event.getEntity())) 
      return; 

     Serializable id = event.getId(); 
     log.debug("onPostUpdate - Need update for id " + id); 

     if (used) { 
      boolean identifierRollbackEnabled = event.getSession().getFactory() 
        .getSettings().isIdentifierRollbackEnabled(); 
      final Object entity = event.getEntity(); 
      if (searchFactoryImplementor.getDocumentBuilderIndexedEntity(entity 
        .getClass()) != null 
        || searchFactoryImplementor 
          .getDocumentBuilderContainedEntity(entity 
            .getClass()) != null) { 

       // Remove item 
       if (entity instanceof Item) { 
        Item item = (Item) entity; 
        if (item.getQuantity() < 1) { 
         processWork(entity, id, WorkType.PURGE, event, 
           identifierRollbackEnabled); 
         return; 
        } 
       } 

       // Remove reservation 
       if (entity instanceof ItemReservation) { 
        ItemReservation ir = (ItemReservation) entity; 
        if (ir.getActive() < 1) { 
         processWork(entity, id, WorkType.PURGE, event, 
           identifierRollbackEnabled); 
         return; 
        } 
       } 

       // Update entity 
       processWork(entity, id, WorkType.UPDATE, event, 
         identifierRollbackEnabled); 
      } else { 
       // Add entity 
       processWork(entity, id, WorkType.ADD, event, 
         identifierRollbackEnabled); 
      } 
     } 
    } 

    @Override 
    public void onPostDelete(PostDeleteEvent event) { 
     log.debug("onPostDelete - Start"); 
     if (!isIndexed(event.getEntity())) 
      return; 
     log.debug("onPostDelete - Need delete for id " + event.getId()); 
     super.onPostDelete(event); 
    } 

    private boolean isIndexed(Object entity) { 
     return entity instanceof Item || entity instanceof Draft 
       || entity instanceof ItemReservation 
       || entity instanceof ItemSale; 
    } 
} 
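
For completeness, a custom listener like this has to be registered in place of the default FullTextIndexEventListener. A sketch of one way to do that, assuming classic hibernate.cfg.xml bootstrapping and a hypothetical package name (only the three entity-level events used here are shown):

<!-- inside <session-factory>; the package name is an assumption --> 
<event type="post-insert"> 
    <listener class="myapp.search.HibernateItemEventListener"/> 
</event> 
<event type="post-update"> 
    <listener class="myapp.search.HibernateItemEventListener"/> 
</event> 
<event type="post-delete"> 
    <listener class="myapp.search.HibernateItemEventListener"/> 
</event> 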

The exception above does not affect the application itself (searches do work), but it sometimes leads to another, more serious exception (which I believe is related to the locking problem):

17:11:58,866 ERROR LogErrorHandler:82 - Exception occurred java.io.FileNotFoundException: _iz.fdx 
java.io.FileNotFoundException: _iz.fdx 
    at org.apache.lucene.store.FSDirectory.fileLength(FSDirectory.java:284) 
    at org.apache.lucene.index.SegmentInfo.sizeInBytes(SegmentInfo.java:303) 
    at org.apache.lucene.index.LogMergePolicy.sizeBytes(LogMergePolicy.java:193) 
    at org.apache.lucene.index.LogByteSizeMergePolicy.size(LogByteSizeMergePolicy.java:45) 
    at org.apache.lucene.index.LogMergePolicy.useCompoundFile(LogMergePolicy.java:147) 
    at org.apache.lucene.index.DocumentsWriter.flush(DocumentsWriter.java:593) 
    at org.apache.lucene.index.IndexWriter.doFlush(IndexWriter.java:3587) 
    at org.apache.lucene.index.IndexWriter.prepareCommit(IndexWriter.java:3376) 
    at org.apache.lucene.index.IndexWriter.commitInternal(IndexWriter.java:3485) 
    at org.apache.lucene.index.IndexWriter.commit(IndexWriter.java:3467) 
    at org.apache.lucene.index.IndexWriter.commit(IndexWriter.java:3451) 
    at org.hibernate.search.backend.Workspace.commitIndexWriter(Workspace.java:220) 
    at org.hibernate.search.backend.impl.lucene.PerDPQueueProcessor.run(PerDPQueueProcessor.java:109) 
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441) 
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:138) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908) 
    at java.lang.Thread.run(Thread.java:662) 
17:11:59,991 ERROR LogErrorHandler:82 - Exception occurred java.io.FileNotFoundException: /opt/myapp/item_index/myapp.item.domain.Item/_iz.cfs (No such file or directory) 

Any hints?

EDIT: After adjusting the libraries and making JMeter more aggressive (more threads), the exceptions and the index corruption still occur:

12:08:11,163 ERROR LogErrorHandler:82 - Exception occurred java.io.FileNotFoundException: /opt/myapp/item_index/myapp.item.domain.Item/_8gy.cfs (No such file or directory) 
Primary Failure: 
    Entity myapp.item.domain.Item Id 596 Work Type org.hibernate.search.backend.DeleteLuceneWork 
Subsequent failures: 
    Entity myapp.item.domain.Item Id 596 Work Type org.hibernate.search.backend.AddLuceneWork 
    Entity myapp.item.domain.Item Id 734 Work Type org.hibernate.search.backend.DeleteLuceneWork 
    Entity myapp.item.domain.Item Id 734 Work Type org.hibernate.search.backend.AddLuceneWork 
    Entity myapp.item.domain.Item Id 599 Work Type org.hibernate.search.backend.DeleteLuceneWork 
    Entity myapp.item.domain.Item Id 599 Work Type org.hibernate.search.backend.AddLuceneWork 
    Entity myapp.item.domain.Item Id 735 Work Type org.hibernate.search.backend.DeleteLuceneWork 
    Entity myapp.item.domain.Item Id 735 Work Type org.hibernate.search.backend.AddLuceneWork 
    Entity myapp.item.domain.Item Id 598 Work Type org.hibernate.search.backend.DeleteLuceneWork 
    Entity myapp.item.domain.Item Id 598 Work Type org.hibernate.search.backend.AddLuceneWork 
    Entity myapp.item.domain.Item Id 720 Work Type org.hibernate.search.backend.DeleteLuceneWork 
    Entity myapp.item.domain.Item Id 720 Work Type org.hibernate.search.backend.AddLuceneWork 

java.io.FileNotFoundException: /opt/myapp/item_index/myapp.item.domain.Item/_8gy.cfs (No such file or directory) 
    at java.io.RandomAccessFile.open(Native Method) 
    at java.io.RandomAccessFile.<init>(RandomAccessFile.java:216) 
    at org.apache.lucene.store.SimpleFSDirectory$SimpleFSIndexInput$Descriptor.<init>(SimpleFSDirectory.java:69) 
    at org.apache.lucene.store.SimpleFSDirectory$SimpleFSIndexInput.<init>(SimpleFSDirectory.java:90) 
    at org.apache.lucene.store.NIOFSDirectory$NIOFSIndexInput.<init>(NIOFSDirectory.java:91) 
    at org.apache.lucene.store.NIOFSDirectory.openInput(NIOFSDirectory.java:78) 
    at org.apache.lucene.index.CompoundFileReader.<init>(CompoundFileReader.java:66) 
    at org.apache.lucene.index.CompoundFileReader.<init>(CompoundFileReader.java:55) 
    at org.apache.lucene.index.IndexWriter.getFieldInfos(IndexWriter.java:1193) 
    at org.apache.lucene.index.IndexWriter.getCurrentFieldInfos(IndexWriter.java:1213) 
    at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:1149) 
    at org.hibernate.search.backend.Workspace.createNewIndexWriter(Workspace.java:202) 
    at org.hibernate.search.backend.Workspace.getIndexWriter(Workspace.java:180) 
    at org.hibernate.search.backend.impl.lucene.PerDPQueueProcessor.run(PerDPQueueProcessor.java:103) 
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441) 
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:138) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908) 
    at java.lang.Thread.run(Thread.java:662) 
12:08:11,163 ERROR PerDPQueueProcessor:118 - Unexpected error in Lucene Backend: 
org.hibernate.search.SearchException: Unable to remove class myapp.item.domain.Item#596 from index. 
    at org.hibernate.search.backend.impl.lucene.works.DeleteExtWorkDelegate.performWork(DeleteExtWorkDelegate.java:77) 
    at org.hibernate.search.backend.impl.lucene.PerDPQueueProcessor.run(PerDPQueueProcessor.java:106) 
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441) 
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303) 
    at java.util.concurrent.FutureTask.run(FutureTask.java:138) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908) 
    at java.lang.Thread.run(Thread.java:662) 
Caused by: java.lang.NullPointerException 

EDIT after debugging: the FileNotFoundException (and hence the IndexWriter NPE) is thrown at

IndexWriter writer = new IndexWriter(directoryProvider.getDirectory(), writerConfig); 

in the Workspace.createNewIndexWriter() method.

writerConfig: 
matchVersion=LUCENE_31 
analyzer=org.apache.lucene.analysis.SimpleAnalyzer 
delPolicy=org.apache.lucene.index.KeepOnlyLastCommitDeletionPolicy 
commit=null 
openMode=CREATE_OR_APPEND 
similarity=org.apache.lucene.search.DefaultSimilarity 
termIndexInterval=128 
mergeScheduler=org.hibernate.search.backend.impl.lucene.overrides.ConcurrentMergeScheduler 
default WRITE_LOCK_TIMEOUT=1000 
writeLockTimeout=1000 
maxBufferedDeleteTerms=-1 
ramBufferSizeMB=16.0 
maxBufferedDocs=-1 
mergedSegmentWarmer=null 
mergePolicy=[LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=10, maxMergeSize=2147483648, maxMergeSizeForOptimize=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, useCompoundFile=true] 
maxThreadStates=8 
readerPooling=false 
readerTermsIndexDivisor=1 

Which kind of file system is your index stored on? Is it reliable? Also, you have a NullPointerException, which makes me suspect a bug in your custom FullTextIndexEventListener. Could you summarize why you need this custom listener? – Sanne


I am using the ext4 file system on Xubuntu and CentOS (I have tried two different systems). The index files live on the same server as Tomcat, and the file permissions are set correctly. The custom listener is used to remove unavailable items (quantity < 1) from the index (and therefore from the search results) while keeping them in the database. Do you see anything wrong in the listener code? – Wizche


I don't spot anything in the listener, but I think you should debug the NullPointerException first: I suspect it kills the indexing queue on a critical path, which then leads to the file locking problems you are seeing. The NPE might well be caused by something in the listener. – Sanne

Answers

1

hibernate search forum thread

I think I solved the problem.

The problem was not in Hibernate Search itself: Atomikos was killing threads once the concurrent-transaction threshold was exceeded (probably leaving file descriptors open). I increased the concurrent-transaction limit (it was 50) and the problem went away.

Still, I don't understand why Hibernate Search opens an index writer every time a Lucene query is executed.
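
For reference, the limit that was raised here is normally configured in Atomikos' jta.properties; a minimal sketch (the value 200 is an arbitrary example):

# jta.properties 
# maximum number of concurrently active transactions (Atomikos default: 50) 
com.atomikos.icatch.max_actives=200 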

1

Hibernate Search is tightly coupled to Lucene and its API changes; the Lucene version you are using is quite different from the one used during the development of Hibernate Search 3.4.2.

These are the suggested versions; I am looking at them in the pom.xml from the 3.4.2.Final tag:

<hibernate.search.version>3.4.2.Final</hibernate.search.version> 
<apache.lucene.version>3.1.0</apache.lucene.version> 
<apache.solr.version>3.1.0</apache.solr.version> 
<hibernate.version>3.6.10.Final</hibernate.version> 
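
For example, if the project declares Lucene directly instead of relying on the version pulled in transitively by Hibernate Search, the dependency should be pinned to the same property:

<dependency> 
    <groupId>org.apache.lucene</groupId> 
    <artifactId>lucene-core</artifactId> 
    <version>${apache.lucene.version}</version> 
</dependency> 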

Sanne, I just tried increasing the JMeter thread/request parameters and I still get exceptions after about 1000 requests. See my edits above. I fixed the libraries exactly as indicated in the Hibernate Search 3.4.2 pom file. – Wizche