edu.berkeley.nlp.lm.cache
Interface ContextEncodedLmCache

All Superinterfaces:
Serializable
All Known Implementing Classes:
ContextEncodedDirectMappedLmCache

public interface ContextEncodedLmCache
extends Serializable


Method Summary
 int capacity()
          Returns the capacity of the cache.
 float getCached(long contextOffset, int contextOrder, int word, int hash, ContextEncodedNgramLanguageModel.LmContextInfo outputPrefix)
          Should return Float.NaN if the requested n-gram is not in the cache.
 void putCached(long contextOffset, int contextOrder, int word, float prob, int hash, ContextEncodedNgramLanguageModel.LmContextInfo outputPrefix)
          Caches the probability prob for the given n-gram.


Method Detail

getCached

float getCached(long contextOffset,
                int contextOrder,
                int word,
                int hash,
                ContextEncodedNgramLanguageModel.LmContextInfo outputPrefix)
Should return Float.NaN if the requested n-gram is not in the cache.

Parameters:
contextOffset - offset encoding the context of the n-gram
contextOrder - order (length) of the context
word - index of the predicted word
hash - precomputed hash of the n-gram, used to index into the cache
outputPrefix - output parameter for the context of the cached n-gram
Returns:
the cached probability, or Float.NaN if the n-gram is not in the cache
putCached

void putCached(long contextOffset,
               int contextOrder,
               int word,
               float prob,
               int hash,
               ContextEncodedNgramLanguageModel.LmContextInfo outputPrefix)
Caches the probability prob for the given n-gram.

capacity

int capacity()
Returns the capacity of the cache.
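The three methods can be tied together in a direct-mapped sketch, in the spirit of the listed implementing class ContextEncodedDirectMappedLmCache: each hash maps to exactly one slot, and a put overwrites that slot. The stub LmContextInfo class, the key-packing scheme, and the slot arithmetic are assumptions for illustration, not the library's actual implementation.

```java
// Hypothetical stand-in for ContextEncodedNgramLanguageModel.LmContextInfo,
// included only so this sketch compiles on its own.
class LmContextInfo {
    long contextOffset;
    int contextOrder;
}

// Direct-mapped sketch of the ContextEncodedLmCache contract: one slot per
// hash value, overwrite on collision. Key packing here is an assumption.
public class DirectMappedSketch {
    private final long[] keys;
    private final float[] probs;
    private final int capacity;

    public DirectMappedSketch(int capacity) {
        this.capacity = capacity;
        keys = new long[capacity];
        probs = new float[capacity];
        java.util.Arrays.fill(keys, -1L);       // -1 = empty slot
        java.util.Arrays.fill(probs, Float.NaN);
    }

    // Pack (contextOffset, word) into one comparable key; assumes word ids
    // fit in the low bits without clashing, which suffices for this demo.
    private static long key(long contextOffset, int word) {
        return (contextOffset << 24) ^ word;
    }

    public float getCached(long contextOffset, int contextOrder, int word,
                           int hash, LmContextInfo outputPrefix) {
        int slot = Math.floorMod(hash, capacity);
        if (keys[slot] == key(contextOffset, word)) return probs[slot];
        return Float.NaN; // miss, per the interface contract
    }

    public void putCached(long contextOffset, int contextOrder, int word,
                          float prob, int hash, LmContextInfo outputPrefix) {
        int slot = Math.floorMod(hash, capacity);
        keys[slot] = key(contextOffset, word); // overwrite whatever was here
        probs[slot] = prob;
    }

    public int capacity() {
        return capacity;
    }

    public static void main(String[] args) {
        DirectMappedSketch cache = new DirectMappedSketch(1 << 10);
        int hash = 12345;
        System.out.println(Float.isNaN(cache.getCached(7L, 2, 42, hash, null))); // true
        cache.putCached(7L, 2, 42, -1.5f, hash, null);
        System.out.println(cache.getCached(7L, 2, 42, hash, null)); // -1.5
        System.out.println(cache.capacity()); // 1024
    }
}
```

A direct-mapped design trades hit rate for speed: lookups and insertions are O(1) with no probing or eviction bookkeeping, which fits the hot inner loop of n-gram scoring.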