edu.berkeley.nlp.lm.cache
Class ContextEncodedDirectMappedLmCache
java.lang.Object
edu.berkeley.nlp.lm.cache.ContextEncodedDirectMappedLmCache
- All Implemented Interfaces:
- ContextEncodedLmCache, Serializable
public final class ContextEncodedDirectMappedLmCache
- extends Object
- implements ContextEncodedLmCache
- See Also:
- Serialized Form
Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
ContextEncodedDirectMappedLmCache
public ContextEncodedDirectMappedLmCache(int cacheBits,
boolean threadSafe)
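The `cacheBits` parameter presumably sets the log2 of the number of cache slots, so the cache holds 2^cacheBits entries and a slot is selected by masking the precomputed hash. A minimal sketch of that indexing arithmetic (the capacity formula and masking scheme are assumptions about a typical direct-mapped cache, not confirmed by this page):

```java
public class CacheIndexing {
    public static void main(String[] args) {
        int cacheBits = 16;               // constructor argument
        int capacity = 1 << cacheBits;    // 2^cacheBits slots (assumed)
        int hash = 0xDEADBEEF;            // precomputed n-gram hash
        int slot = hash & (capacity - 1); // direct-mapped: one slot per hash, no probing
        System.out.println(capacity + " " + slot); // prints "65536 48879"
    }
}
```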
getCached
public float getCached(long contextOffset,
int contextOrder,
int word,
int hash,
ContextEncodedNgramLanguageModel.LmContextInfo outputPrefix)
- Description copied from interface:
ContextEncodedLmCache
- Should return Float.NaN if the requested n-gram is not in the cache.
- Specified by:
getCached
in interface ContextEncodedLmCache
- Returns: the cached score, or Float.NaN if the requested n-gram is not in the cache.
putCached
public void putCached(long contextOffset,
int contextOrder,
int word,
float score,
int hash,
ContextEncodedNgramLanguageModel.LmContextInfo outputPrefix)
- Specified by:
putCached
in interface ContextEncodedLmCache
capacity
public int capacity()
- Specified by:
capacity
in interface ContextEncodedLmCache
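Taken together, `getCached`, `putCached`, and `capacity` describe a direct-mapped cache: each precomputed hash maps to exactly one slot, and a colliding entry is simply overwritten. The following self-contained sketch mirrors those semantics without the Berkeley LM types; the field layout and overwrite-on-collision behavior are assumptions, and the real class additionally caches the `LmContextInfo` back-pointer, which is omitted here:

```java
import java.util.Arrays;

// Illustrative direct-mapped score cache; not the Berkeley LM implementation.
final class DirectMappedScoreCache {
    private final long[] keys;    // context offset of the cached entry, -1 = empty
    private final int[] words;    // last word of the cached n-gram
    private final float[] scores; // cached scores

    DirectMappedScoreCache(int cacheBits) {
        int capacity = 1 << cacheBits; // 2^cacheBits slots (assumed)
        keys = new long[capacity];
        words = new int[capacity];
        scores = new float[capacity];
        Arrays.fill(keys, -1L);
    }

    int capacity() { return keys.length; }

    // Mirrors getCached: Float.NaN signals a miss.
    float getCached(long contextOffset, int word, int hash) {
        int slot = hash & (keys.length - 1);
        if (keys[slot] == contextOffset && words[slot] == word) return scores[slot];
        return Float.NaN;
    }

    // Mirrors putCached: a colliding entry is overwritten, not chained.
    void putCached(long contextOffset, int word, float score, int hash) {
        int slot = hash & (keys.length - 1);
        keys[slot] = contextOffset;
        words[slot] = word;
        scores[slot] = score;
    }

    public static void main(String[] args) {
        DirectMappedScoreCache cache = new DirectMappedScoreCache(10);
        int hash = 12345; // stands in for the caller-supplied n-gram hash
        System.out.println(Float.isNaN(cache.getCached(42L, 7, hash))); // true: miss
        cache.putCached(42L, 7, -2.5f, hash);
        System.out.println(cache.getCached(42L, 7, hash));              // -2.5
        System.out.println(cache.capacity());                           // 1024
    }
}
```

Note that callers are expected to check for `Float.NaN` rather than a sentinel like -1, since any real log-probability could be negative.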