gotosocial/vendor/codeberg.org/gruf/go-cache/v3/cache.go

package cache

import (
	"time"

	"codeberg.org/gruf/go-cache/v3/simple"
	"codeberg.org/gruf/go-cache/v3/ttl"
)
[feature] simpler cache size configuration (#2051) * add automatic cache max size generation based on ratios of a singular fixed memory target Signed-off-by: kim <grufwub@gmail.com> * remove now-unused cache max-size config variables Signed-off-by: kim <grufwub@gmail.com> * slight ratio tweak Signed-off-by: kim <grufwub@gmail.com> * remove unused visibility config var Signed-off-by: kim <grufwub@gmail.com> * add secret little ratio config trick Signed-off-by: kim <grufwub@gmail.com> * fixed a word Signed-off-by: kim <grufwub@gmail.com> * update cache library to remove use of TTL in result caches + slice cache Signed-off-by: kim <grufwub@gmail.com> * update other cache usages to use correct interface Signed-off-by: kim <grufwub@gmail.com> * update example config to explain the cache memory target Signed-off-by: kim <grufwub@gmail.com> * update env parsing test with new config values Signed-off-by: kim <grufwub@gmail.com> * do some ratio twiddling Signed-off-by: kim <grufwub@gmail.com> * add missing header * update envparsing with latest defaults Signed-off-by: kim <grufwub@gmail.com> * update size calculations to take into account result cache, simple cache and extra map overheads Signed-off-by: kim <grufwub@gmail.com> * tweak the ratios some more Signed-off-by: kim <grufwub@gmail.com> * more nan rampaging Signed-off-by: kim <grufwub@gmail.com> * fix envparsing script Signed-off-by: kim <grufwub@gmail.com> * update cache library, add sweep function to keep caches trim Signed-off-by: kim <grufwub@gmail.com> * sweep caches once a minute Signed-off-by: kim <grufwub@gmail.com> * add a regular job to sweep caches and keep under 80% utilisation Signed-off-by: kim <grufwub@gmail.com> * remove dead code Signed-off-by: kim <grufwub@gmail.com> * add new size library used to libraries section of readme Signed-off-by: kim <grufwub@gmail.com> * add better explanations for the mem-ratio numbers Signed-off-by: kim <grufwub@gmail.com> * update go-cache Signed-off-by: kim <grufwub@gmail.com> * library version bump Signed-off-by: kim <grufwub@gmail.com> * update cache.result{} size model estimation Signed-off-by: kim <grufwub@gmail.com> --------- Signed-off-by: kim <grufwub@gmail.com>
2023-08-03 09:34:35 +00:00
// TTLCache represents a TTL cache with customizable callbacks; it exists here to
// abstract away the "unsafe" methods in the case that you do not want your own
// implementation atop ttl.Cache{}.
type TTLCache[Key comparable, Value any] interface {
	// Start will start the cache background eviction routine with given sweep
	// frequency. If already running or a freq <= 0 is provided, this is a no-op.
	// This will block until the eviction routine has started.
	Start(freq time.Duration) bool

	// Stop will stop the cache background eviction routine. If not running this
	// is a no-op. This will block until the eviction routine has stopped.
	Stop() bool
	// SetTTL sets the cache item TTL. Update can be specified to force updates of
	// existing items in the cache; this will simply add the change in TTL to their
	// current expiry time.
	SetTTL(ttl time.Duration, update bool)

	// Implements the base Cache interface.
	Cache[Key, Value]
}
// Cache represents a cache with customizable callbacks; it exists here to abstract
// away the "unsafe" methods in the case that you do not want your own implementation
// atop simple.Cache{}.
type Cache[Key comparable, Value any] interface {
	// SetEvictionCallback sets the eviction callback to the provided hook.
	SetEvictionCallback(hook func(Key, Value))

	// SetInvalidateCallback sets the invalidate callback to the provided hook.
	SetInvalidateCallback(hook func(Key, Value))

	// Get fetches the value with key from the cache, extending its TTL.
	Get(key Key) (value Value, ok bool)

	// Add attempts to place the value at key in the cache, doing nothing if a value
	// with this key already exists. The returned bool indicates success. Calls the
	// invalidate callback on success.
	Add(key Key, value Value) bool

	// Set places the value at key in the cache. This will overwrite any existing
	// value; existing values will have their TTL extended upon update. Always calls
	// the invalidate callback.
	Set(key Key, value Value)

	// CAS will attempt to perform a CAS operation on 'key', using the provided old
	// and new values and comparator function. The returned bool indicates success.
	CAS(key Key, old, new Value, cmp func(Value, Value) bool) bool

	// Swap will attempt to perform a swap on 'key', replacing the value there and
	// returning the existing value. If no value exists for key, this will set the
	// value and return the zero value for Value.
	Swap(key Key, swp Value) Value

	// Has checks the cache for a value with key; this will not update the TTL.
	Has(key Key) bool

	// Invalidate deletes a value from the cache, calling the invalidate callback.
	Invalidate(key Key) bool

	// InvalidateAll is equivalent to multiple Invalidate calls.
	InvalidateAll(keys ...Key) bool

	// Clear empties the cache, calling the invalidate callback on each entry.
	Clear()

	// Len returns the current length of the cache.
	Len() int

	// Cap returns the maximum capacity of the cache.
	Cap() int
}
// New returns a new initialized Cache with the given initial length and maximum capacity.
func New[K comparable, V any](len, cap int) Cache[K, V] {
	return simple.New[K, V](len, cap)
}
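// The snippet below is an illustrative sketch (not part of the upstream file) of how
// the base Cache might be used; the key name, value type, sizes and callback body are
// assumptions chosen purely for demonstration.
//
//	c := cache.New[string, int](100, 1000)
//	c.SetInvalidateCallback(func(k string, v int) {
//		// called when an entry is invalidated, overwritten, or added
//	})
//	_ = c.Add("hits", 1) // no-op if "hits" already exists; returns success state
//	c.Set("hits", 2)     // unconditional overwrite
//	ok := c.CAS("hits", 2, 3, func(a, b int) bool { return a == b })
//	_ = ok               // true if the stored value matched the old value and was replaced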
// NewTTL returns a new initialized TTLCache with the given initial length, maximum
// capacity and TTL duration.
func NewTTL[K comparable, V any](len, cap int, _ttl time.Duration) TTLCache[K, V] {
	return ttl.New[K, V](len, cap, _ttl)
}
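// As an illustrative sketch (not part of the upstream file), a TTLCache might be driven
// as follows; the key, value type, sizes and durations below are assumptions chosen for
// the demo.
//
//	tc := cache.NewTTL[string, []byte](100, 1000, 5*time.Minute)
//	tc.Start(30 * time.Second) // begin background eviction sweeps every 30s
//	tc.Set("page:home", []byte("hello"))
//	if v, ok := tc.Get("page:home"); ok {
//		_ = v // hit before expiry; Get also extends the entry's TTL
//	}
//	tc.SetTTL(10*time.Minute, true) // lengthen the TTL, updating existing entries' expiries
//	tc.Stop()                       // stop the background eviction routine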