mirror of https://github.com/prometheus/statsd_exporter.git (synced 2025-10-04 06:29:46 +00:00)

Merge pull request #157 from Kong/feat/fsm-matcher

Faster glob matching using a finite state machine

Commit 00e2c3ff26: 12 changed files with 1619 additions and 16 deletions

Makefile (7 changes)

@@ -17,3 +17,10 @@ STATICCHECK_IGNORE = \
 	github.com/prometheus/statsd_exporter/main.go:SA1019 \
 
 DOCKER_IMAGE_NAME ?= statsd-exporter
+
+.PHONY: bench
+bench:
+	@echo ">> running all benchmarks"
+	$(GO) test -bench . -race $(pkgs)
+
+all: bench
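
The new target can be exercised directly. A minimal usage sketch, assuming the `GO` and `pkgs` variables are defined elsewhere in this Makefile (as in the surrounding build setup):

```shell
# runs every Go benchmark in the repository with the race detector,
# i.e. $(GO) test -bench . -race $(pkgs)
make bench
```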

README.md (41 changes)

@@ -72,6 +72,8 @@ NOTE: Version 0.7.0 switched to the [kingpin](https://github.com/alecthomas/king
       read buffer associated with the UDP connection. Please
       make sure the kernel parameters net.core.rmem_max is
       set to a value greater than the value specified.
+      --debug.dump-fsm=""       The path to dump internal FSM generated for glob
+                                matching as Dot file.
       --log.level="info"        Only log messages with the given severity or above.
                                 Valid levels: [debug, info, warn, error, fatal]
       --log.format="logger:stderr"

@@ -181,6 +183,8 @@ mappings:
     code: "$1"
 ```
 
+### StatsD timers
+
 By default, statsd timers are represented as a Prometheus summary with
 quantiles. You may optionally configure the [quantiles and acceptable
 error](https://prometheus.io/docs/practices/histograms/#quantiles):

@@ -223,6 +227,8 @@ mappings:
     job: "${1}_server"
 ```
 
+### Regular expression matching
+
 Another capability when using YAML configuration is the ability to define matches
 using raw regular expressions as opposed to the default globbing style of match.
 This may allow for pulling structured data from otherwise poorly named statsd

@@ -249,14 +255,19 @@ automatically.
 only used when the statsd metric type is a timer and the `timer_type` is set to
 "histogram."
 
+### Global defaults
+
 One may also set defaults for the timer type, buckets or quantiles, and match_type. These will be used
 by all mappings that do not define these.
 
+An option that can only be configured in `defaults` is `glob_disable_ordering`, which is `false` if omitted. By setting this to `true`, the `glob` match type will not honor the order in which rules occur in the mapping rules file and will always treat `*` as lower priority than a general string.
+
 ```yaml
 defaults:
   timer_type: histogram
   buckets: [.005, .01, .025, .05, .1, .25, .5, 1, 2.5 ]
   match_type: glob
+  glob_disable_ordering: false
 mappings:
 # This will be a histogram using the buckets set in `defaults`.
 - match: test.timing.*.*.*

@@ -275,7 +286,33 @@ mappings:
     job: "${1}_server_other"
 ```
 
-You may also drop metrics by specifying a "drop" action on a match. For example:
+### Choosing between glob or regex match type
+
+Despite being less flexible than regular expressions for matching and
+formatting labels, `glob` matching is optimized to have better performance than
+`regex` in certain use cases. In short, glob performs best when the number of
+rules is not too small and a single rule does not contain too many captures
+(uses of `*`). Whether ordering is disabled for glob matching or not won't have
+a noticeable effect on performance in general use cases. In edge cases like the
+one below, however, disabling ordering will be beneficial:
+
+    a.*.*.*.*
+    a.b.*.*.*
+    a.b.c.*.*
+    a.b.c.d.*
+
+The reason is that the list assignment of captures (uses of `*`) is the most
+expensive operation in glob matching. Honoring ordering will result in up to 10
+list assignments, while without ordering it will need only 4 at most.
+
+For details, see [pkg/mapper/fsm/README.md](pkg/mapper/fsm/README.md).
+Running `go test -bench .` in the **pkg/mapper** directory will produce
+a detailed comparison between the two match types.
+
+### `drop` action
+
+You may also drop metrics by specifying a "drop" action on a match. For
+example:
+
 ```yaml
 mappings:

@@ -296,6 +333,8 @@ mappings:
 You can drop any metric using the normal match syntax.
 The default action is "map" which does the normal metrics mapping.
 
+### Explicit metric type mapping
+
 StatsD allows emitting of different metric types under the same metric name,
 but the Prometheus client library can't merge those. For this use-case the
 mapping definition allows you to specify which metric type to match:
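
To make the choice described in the added README section concrete, here is a hedged sketch of two mappings for the same metric, one written as a glob and one as a regular expression (the rule and label names are arbitrary examples borrowed from the benchmark configs further below):

```yaml
mappings:
# glob match type (the default): captures via "*", generally faster
- match: test.dispatcher.*.*.succeeded
  name: "dispatch_events"
  labels:
    processor: "$1"
    action: "$2"
# equivalent regex match type: more flexible, usually slower
- match: test\.dispatcher\.([^.]*)\.([^.]*)\.succeeded
  match_type: regex
  name: "dispatch_events"
  labels:
    processor: "$1"
    action: "$2"
```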

main.go (23 changes)

@@ -14,8 +14,10 @@
 package main
 
 import (
+	"bufio"
 	"net"
 	"net/http"
+	"os"
 	"strconv"
 
 	"github.com/howeyc/fsnotify"

@@ -118,6 +120,20 @@ func watchConfig(fileName string, mapper *mapper.MetricMapper) {
 	}
 }
 
+func dumpFSM(mapper *mapper.MetricMapper, dumpFilename string) error {
+	f, err := os.Create(dumpFilename)
+	if err != nil {
+		return err
+	}
+	log.Infoln("Start dumping FSM to", dumpFilename)
+	w := bufio.NewWriter(f)
+	mapper.FSM.DumpFSM(w)
+	w.Flush()
+	f.Close()
+	log.Infoln("Finish dumping FSM")
+	return nil
+}
+
 func main() {
 	var (
 		listenAddress = kingpin.Flag("web.listen-address", "The address on which to expose the web interface and generated Prometheus metrics.").Default(":9102").String()

@@ -126,6 +142,7 @@ func main() {
 		statsdListenTCP = kingpin.Flag("statsd.listen-tcp", "The TCP address on which to receive statsd metric lines. \"\" disables it.").Default(":9125").String()
 		mappingConfig   = kingpin.Flag("statsd.mapping-config", "Metric mapping configuration file name.").String()
 		readBuffer      = kingpin.Flag("statsd.read-buffer", "Size (in bytes) of the operating system's transmit read buffer associated with the UDP connection. Please make sure the kernel parameters net.core.rmem_max is set to a value greater than the value specified.").Int()
+		dumpFSMPath     = kingpin.Flag("debug.dump-fsm", "The path to dump internal FSM generated for glob matching as Dot file.").Default("").String()
 	)
 
 	log.AddFlags(kingpin.CommandLine)

@@ -183,6 +200,12 @@ func main() {
 		if err != nil {
 			log.Fatal("Error loading config:", err)
 		}
+		if *dumpFSMPath != "" {
+			err := dumpFSM(mapper, *dumpFSMPath)
+			if err != nil {
+				log.Fatal("Error dumping FSM:", err)
+			}
+		}
 		go watchConfig(*mappingConfig, mapper)
 	}
 	exporter := NewExporter(mapper)

pkg/mapper/fsm/README.md (new file, 132 lines)

# FSM Mapping

## Overview

This package implements a fast and efficient algorithm for generic glob style
string matching using a finite state machine (FSM).

### Source Hierarchy

```
'-- fsm
    '-- dump.go       // functionality to dump the FSM to a Dot file
    '-- formatter.go  // format glob templates using captured * groups
    '-- fsm.go        // manipulating and searching of the FSM
    '-- minmax.go     // min() and max() functions for integers
```

## FSM Explained

Per [Wikipedia](https://en.wikipedia.org/wiki/Finite-state_machine):

> A finite-state machine (FSM) or finite-state automaton (FSA, plural: automata),
> finite automaton, or simply a state machine, is a mathematical model of
> computation. It is an abstract machine that can be in exactly one of a finite
> number of states at any given time. The FSM can change from one state to
> another in response to some external inputs; the change from one state to
> another is called a transition. An FSM is defined by a list of its states, its
> initial state, and the conditions for each transition.

In our use case, each *state* is a substring after the input StatsD metric name is split by `.`.

### Add state to FSM

`func (f *FSM) AddState(match string, matchMetricType string,
maxPossibleTransitions int, result interface{}) int`

At first, the FSM only contains three states, representing three possible metric types:

        ____ [gauge]
       /
    (start)---- [counter]
       \
        '--- [ timer ]

Adding a rule `client.*.request.count` with type `counter` turns the FSM into:

        ____ [gauge]
       /
    (start)---- [counter] -- [client] -- [*] -- [request] -- [count] -- {R1}
       \
        '--- [timer]

`{R1}` is short for result 1, which is the match result for `client.*.request.count`.

Adding a rule `client.*.*.size` with type `counter` turns the FSM into:

        ____ [gauge]                       __ [request] -- [count] -- {R1}
       /                                  /
    (start)---- [counter] -- [client] -- [*]
       \                                  \__ [*] -- [size] -- {R2}
        '--- [timer]

### Finding a result state in FSM
|
||||||
|
|
||||||
|
`func (f *FSM) GetMapping(statsdMetric string, statsdMetricType string)
|
||||||
|
(*mappingState, []string)`
|
||||||
|
|
||||||
|
For example, when mapping `client.aaa.request.count` with `counter` type in the
|
||||||
|
FSM, the `^1` to `^7` symbols indicate how FSM will traversal in its tree:
|
||||||
|
|
||||||
|
|
||||||
|
____ [gauge] __ [request] -- [count] -- {R1}
|
||||||
|
/ / ^5 ^6 ^7
|
||||||
|
(start)---- [counter] -- [client] -- [*]
|
||||||
|
^1 \ ^2 ^3 \__ [*] -- [size] -- {R2}
|
||||||
|
'--- [timer] ^4
|
||||||
|
|
||||||
|
|
||||||
|
To map `client.bbb.request.size`, FSM will do a backtracking:
|
||||||
|
|
||||||
|
|
||||||
|
____ [gauge] __ [request] -- [count] -- {R1}
|
||||||
|
/ / ^5 ^6
|
||||||
|
(start)---- [counter] -- [client] -- [*]
|
||||||
|
^1 \ ^2 ^3 \__ [*] -- [size] -- {R2}
|
||||||
|
'--- [timer] ^4
|
||||||
|
^7 ^8 ^9
|
||||||
|
|
||||||
|
|
||||||
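Continuing the sketch above (again an editorial illustration, not part of the README file), a lookup that exercises this backtracking path; `TestIfNeedBacktracking` is wired up the same way `pkg/mapper` does it:

```go
package main

import (
	"fmt"

	"github.com/prometheus/statsd_exporter/pkg/mapper/fsm"
)

func main() {
	f := fsm.NewFSM([]string{"counter", "gauge", "timer"}, 2, false)
	f.AddState("client.*.request.count", "counter", 2, "R1")
	f.AddState("client.*.*.size", "counter", 2, "R2")
	// Overlapping rules: mapping client.bbb.request.size needs backtracking.
	f.BacktrackingNeeded = fsm.TestIfNeedBacktracking(
		[]string{"client.*.request.count", "client.*.*.size"}, f.OrderingDisabled)

	state, captures := f.GetMapping("client.bbb.request.size", "counter")
	if state != nil && state.Result != nil {
		// Result is "R2"; the first two captures are "bbb" and "request".
		fmt.Println(state.Result, captures[0], captures[1])
	}
}
```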
## Debugging

To see all the states of the current FSM, use `func (f *FSM) DumpFSM(w io.Writer)`
to dump it into a Dot file. The Dot file can then be rendered into an image using:

```shell
$ dot -Tpng dump.dot > dump.png
```

In the StatsD exporter, one could use the following:

```shell
$ statsd_exporter --statsd.mapping-config=statsd.rules --debug.dump-fsm=dump.dot
$ dot -Tpng dump.dot > dump.png
```

For example, the following rules:

```yaml
mappings:
- match: client.*.request.count
  name: request_count
  match_metric_type: counter
  labels:
    client: $1

- match: client.*.*.size
  name: sizes
  match_metric_type: counter
  labels:
    client: $1
    direction: $2
```

will be rendered as:

![FSM](fsm.png)

The `dot` program is part of [Graphviz](https://www.graphviz.org/) and is
available in most popular operating systems.

pkg/mapper/fsm/dump.go (new file, 48 lines)

// Copyright 2018 The Prometheus Authors
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package fsm

import (
	"fmt"
	"io"
)

// DumpFSM accepts an io.Writer and writes the current FSM in Dot file format.
func (f *FSM) DumpFSM(w io.Writer) {
	idx := 0
	states := make(map[int]*mappingState)
	states[idx] = f.root

	w.Write([]byte("digraph g {\n"))
	w.Write([]byte("rankdir=LR\n"))                                                    // make it vertical
	w.Write([]byte("node [ label=\"\",style=filled,fillcolor=white,shape=circle ]\n")) // remove label of node

	for idx < len(states) {
		for field, transition := range states[idx].transitions {
			states[len(states)] = transition
			w.Write([]byte(fmt.Sprintf("%d -> %d [label = \"%s\"];\n", idx, len(states)-1, field)))
			if idx == 0 {
				// color for metric types
				w.Write([]byte(fmt.Sprintf("%d [color=\"#D6B656\",fillcolor=\"#FFF2CC\"];\n", len(states)-1)))
			} else if transition.transitions == nil || len(transition.transitions) == 0 {
				// color for end state
				w.Write([]byte(fmt.Sprintf("%d [color=\"#82B366\",fillcolor=\"#D5E8D4\"];\n", len(states)-1)))
			}
		}
		idx++
	}
	// color for start state
	w.Write([]byte(fmt.Sprintf("0 [color=\"#a94442\",fillcolor=\"#f2dede\"];\n")))
	w.Write([]byte("}"))
}

pkg/mapper/fsm/formatter.go (new file, 76 lines)

// Copyright 2018 The Prometheus Authors
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package fsm

import (
	"fmt"
	"regexp"
	"strconv"
	"strings"
)

var (
	templateReplaceCaptureRE = regexp.MustCompile(`\$\{?([a-zA-Z0-9_\$]+)\}?`)
)

type TemplateFormatter struct {
	captureIndexes []int
	captureCount   int
	fmtString      string
}

// NewTemplateFormatter instantiates a TemplateFormatter
// from the given template string and the maximum number of captures.
func NewTemplateFormatter(template string, captureCount int) *TemplateFormatter {
	matches := templateReplaceCaptureRE.FindAllStringSubmatch(template, -1)
	if len(matches) == 0 {
		// if no regex reference found, keep it as it is
		return &TemplateFormatter{captureCount: 0, fmtString: template}
	}

	var indexes []int
	valueFormatter := template
	for _, match := range matches {
		idx, err := strconv.Atoi(match[len(match)-1])
		if err != nil || idx > captureCount || idx < 1 {
			// if index larger than captured count or using unsupported named capture group,
			// replace with empty string
			valueFormatter = strings.Replace(valueFormatter, match[0], "", -1)
		} else {
			valueFormatter = strings.Replace(valueFormatter, match[0], "%s", -1)
			// note: the regex reference variable $? starts from 1
			indexes = append(indexes, idx-1)
		}
	}
	return &TemplateFormatter{
		captureIndexes: indexes,
		captureCount:   len(indexes),
		fmtString:      valueFormatter,
	}
}

// Format accepts a list containing captured strings and returns the formatted
// string using the template stored in the current TemplateFormatter.
func (formatter *TemplateFormatter) Format(captures []string) string {
	if formatter.captureCount == 0 {
		// no label substitution, keep as it is
		return formatter.fmtString
	}
	indexes := formatter.captureIndexes
	vargs := make([]interface{}, formatter.captureCount)
	for i, idx := range indexes {
		vargs[i] = captures[idx]
	}
	return fmt.Sprintf(formatter.fmtString, vargs...)
}
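
A small usage sketch for the formatter above; the template string and capture values are arbitrary examples:

```go
package main

import (
	"fmt"

	"github.com/prometheus/statsd_exporter/pkg/mapper/fsm"
)

func main() {
	// Two capture references; captureCount bounds which $N are accepted.
	tf := fsm.NewTemplateFormatter("${1}_requests_${2}", 2)
	fmt.Println(tf.Format([]string{"api", "get"})) // api_requests_get
}
```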

pkg/mapper/fsm/fsm.go (new file, 324 lines)

// Copyright 2018 The Prometheus Authors
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package fsm

import (
	"regexp"
	"strings"

	"github.com/prometheus/common/log"
)

type mappingState struct {
	transitions        map[string]*mappingState
	minRemainingLength int
	maxRemainingLength int
	// result* members are nil unless there's a metric that ends with this state
	Result         interface{}
	ResultPriority int
}

type fsmBacktrackStackCursor struct {
	fieldIndex     int
	captureIndex   int
	currentCapture string
	state          *mappingState
	prev           *fsmBacktrackStackCursor
	next           *fsmBacktrackStackCursor
}

type FSM struct {
	root               *mappingState
	metricTypes        []string
	statesCount        int
	BacktrackingNeeded bool
	OrderingDisabled   bool
}

// NewFSM creates a new FSM instance
func NewFSM(metricTypes []string, maxPossibleTransitions int, orderingDisabled bool) *FSM {
	fsm := FSM{}
	root := &mappingState{}
	root.transitions = make(map[string]*mappingState, len(metricTypes))

	for _, field := range metricTypes {
		state := &mappingState{}
		(*state).transitions = make(map[string]*mappingState, maxPossibleTransitions)
		root.transitions[string(field)] = state
	}
	fsm.OrderingDisabled = orderingDisabled
	fsm.metricTypes = metricTypes
	fsm.statesCount = 0
	fsm.root = root
	return &fsm
}

// AddState adds a mapping rule into the existing FSM.
// The maxPossibleTransitions parameter sets the expected count of transitions left.
// The result parameter sets the generic value to be returned when the FSM finds a match in GetMapping.
func (f *FSM) AddState(match string, matchMetricType string, maxPossibleTransitions int, result interface{}) int {
	// first split by "."
	matchFields := strings.Split(match, ".")
	// fill into our FSM
	roots := []*mappingState{}
	// first state is the metric type
	if matchMetricType == "" {
		// if metricType not specified, connect the start state from all three types
		for _, metricType := range f.metricTypes {
			roots = append(roots, f.root.transitions[string(metricType)])
		}
	} else {
		roots = append(roots, f.root.transitions[matchMetricType])
	}
	var captureCount int
	var finalStates []*mappingState
	// iterating over different start states (different metric types)
	for _, root := range roots {
		captureCount = 0
		// for each start state, connect from start state to end state
		for i, field := range matchFields {
			state, prs := root.transitions[field]
			if !prs {
				// create a state if it does not exist in the fsm
				state = &mappingState{}
				(*state).transitions = make(map[string]*mappingState, maxPossibleTransitions)
				(*state).maxRemainingLength = len(matchFields) - i - 1
				(*state).minRemainingLength = len(matchFields) - i - 1
				root.transitions[field] = state
				// if this is the last field, set result to the currentMapping instance
				if i == len(matchFields)-1 {
					root.transitions[field].Result = result
				}
			} else {
				(*state).maxRemainingLength = max(len(matchFields)-i-1, (*state).maxRemainingLength)
				(*state).minRemainingLength = min(len(matchFields)-i-1, (*state).minRemainingLength)
			}
			if field == "*" {
				captureCount++
			}

			// goto next state
			root = state
		}
		finalStates = append(finalStates, root)
	}

	for _, state := range finalStates {
		state.ResultPriority = f.statesCount
	}

	f.statesCount++

	return captureCount
}

// GetMapping uses the fsm to find matching rules according to the given statsdMetric and statsdMetricType.
// If it finds a match, the final state and the captured strings are returned;
// if there's no match found, nil and an empty list will be returned.
func (f *FSM) GetMapping(statsdMetric string, statsdMetricType string) (*mappingState, []string) {
	matchFields := strings.Split(statsdMetric, ".")
	currentState := f.root.transitions[statsdMetricType]

	// the cursor/pointer in the backtrack stack implemented as a doubly-linked list
	var backtrackCursor *fsmBacktrackStackCursor
	resumeFromBacktrack := false

	// the return variable
	var finalState *mappingState

	captures := make([]string, len(matchFields))
	// keep track of captured groups so we don't need to do append() on captures
	captureIdx := 0
	filedsCount := len(matchFields)
	i := 0
	var state *mappingState
	for { // the loop for backtracking
		for { // the loop for a single "depth only" search
			var present bool
			// if we resume from a backtrack we should skip this branch, since
			// the restored state was saved at the end of this branch
			if !resumeFromBacktrack {
				if len(currentState.transitions) > 0 {
					field := matchFields[i]
					state, present = currentState.transitions[field]
					fieldsLeft := filedsCount - i - 1
					// also compare length upfront to avoid unnecessary loop or backtrack
					if !present || fieldsLeft > state.maxRemainingLength || fieldsLeft < state.minRemainingLength {
						state, present = currentState.transitions["*"]
						if !present || fieldsLeft > state.maxRemainingLength || fieldsLeft < state.minRemainingLength {
							break
						} else {
							captures[captureIdx] = field
							captureIdx++
						}
					} else if f.BacktrackingNeeded {
						// if backtracking is needed, also check for the alternative transition, i.e. *
						altState, present := currentState.transitions["*"]
						if !present || fieldsLeft > altState.maxRemainingLength || fieldsLeft < altState.minRemainingLength {
						} else {
							// push to backtracking stack
							newCursor := fsmBacktrackStackCursor{prev: backtrackCursor, state: altState,
								fieldIndex:   i,
								captureIndex: captureIdx, currentCapture: field,
							}
							// if this is not the first time, connect to the previous cursor
							if backtrackCursor != nil {
								backtrackCursor.next = &newCursor
							}
							backtrackCursor = &newCursor
						}
					}
				} else {
					// no more transitions for this state
					break
				}
			} // backtrack will resume from here

			// have we reached a final state?
			if state.Result != nil && i == filedsCount-1 {
				if f.OrderingDisabled {
					finalState = state
					return finalState, captures
				} else if finalState == nil || finalState.ResultPriority > state.ResultPriority {
					// if we care about ordering, try to find a result with the highest priority
					finalState = state
				}
				break
			}

			i++
			if i >= filedsCount {
				break
			}

			resumeFromBacktrack = false
			currentState = state
		}
		if backtrackCursor == nil {
			// if we are not doing backtracking or all paths have been traversed
			break
		} else {
			// pop one from stack
			state = backtrackCursor.state
			currentState = state
			i = backtrackCursor.fieldIndex
			captureIdx = backtrackCursor.captureIndex + 1
			// put the * capture back
			captures[captureIdx-1] = backtrackCursor.currentCapture
			backtrackCursor = backtrackCursor.prev
			if backtrackCursor != nil {
				// deref for GC
				backtrackCursor.next = nil
			}
			resumeFromBacktrack = true
		}
	}

	return finalState, captures
}

// TestIfNeedBacktracking tests if backtracking is needed for the given list of mappings
// and whether ordering is disabled.
func TestIfNeedBacktracking(mappings []string, orderingDisabled bool) bool {
	backtrackingNeeded := false
	// A rule that has * in it, while other transitions exist at the same state,
	// makes that rule the cause of backtracking
	ruleByLength := make(map[int][]string)
	ruleREByLength := make(map[int][]*regexp.Regexp)

	// first group rules by length
	for _, mapping := range mappings {
		l := len(strings.Split(mapping, "."))
		ruleByLength[l] = append(ruleByLength[l], mapping)

		metricRe := strings.Replace(mapping, ".", "\\.", -1)
		metricRe = strings.Replace(metricRe, "*", "([^.]*)", -1)
		regex, err := regexp.Compile("^" + metricRe + "$")
		if err != nil {
			log.Warnf("invalid match %s. cannot compile regex in mapping: %v", mapping, err)
		}
		// put into the array whether there's an error or not, we will skip later if regex is nil
		ruleREByLength[l] = append(ruleREByLength[l], regex)
	}

	for l, rules := range ruleByLength {
		if len(rules) == 1 {
			continue
		}
		rulesRE := ruleREByLength[l]
		for i1, r1 := range rules {
			currentRuleNeedBacktrack := false
			re1 := rulesRE[i1]
			if re1 == nil || strings.Index(r1, "*") == -1 {
				continue
			}
			// if rule r1 is A.B.C.*.E.*, is there a rule r2 like A.B.C.D.x.x or A.B.C.*.E.F ? (x is any string or *)
			// if such an r2 exists, then to match r1 we will need backtracking
			for index := 0; index < len(r1); index++ {
				if r1[index] != '*' {
					continue
				}
				// translate the substring of r1 from 0 to the index of the current * into a regex
				// A.B.C.*.E.* will become ^A\.B\.C\. and ^A\.B\.C\.\*\.E\.
				reStr := strings.Replace(r1[:index], ".", "\\.", -1)
				reStr = strings.Replace(reStr, "*", "\\*", -1)
				re := regexp.MustCompile("^" + reStr)
				for i2, r2 := range rules {
					if i2 == i1 {
						continue
					}
					if len(re.FindStringSubmatchIndex(r2)) > 0 {
						currentRuleNeedBacktrack = true
						break
					}
				}
			}

			for i2, r2 := range rules {
				if i2 != i1 && len(re1.FindStringSubmatchIndex(r2)) > 0 {
					// log if we care about ordering and the superset occurs before
					if !orderingDisabled && i1 < i2 {
						log.Warnf("match \"%s\" is a super set of match \"%s\" but in a lower order, "+
							"the first will never be matched", r1, r2)
					}
					currentRuleNeedBacktrack = false
				}
			}
			for i2, re2 := range rulesRE {
				if i2 == i1 || re2 == nil {
					continue
				}
				// if r1 is a subset of another rule, we don't need backtracking
				// because either we turned on ordering
				// or we disabled ordering and can't match it even with backtracking
				if len(re2.FindStringSubmatchIndex(r1)) > 0 {
					currentRuleNeedBacktrack = false
				}
			}

			if currentRuleNeedBacktrack {
				log.Warnf("backtracking required because of match \"%s\", "+
					"matching performance may be degraded", r1)
				backtrackingNeeded = true
			}
		}
	}

	// backtracking will always be needed if ordering of rules is not disabled
	// since transitions are stored in an (unordered) map
	// note: don't move this branch to the beginning of this function
	// since we need logs for superset rules

	return !orderingDisabled || backtrackingNeeded
}
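
A short sketch tying `TestIfNeedBacktracking` back to the edge-case rule set quoted in the README change above; the expected outputs follow from the overlap analysis the function performs:

```go
package main

import (
	"fmt"

	"github.com/prometheus/statsd_exporter/pkg/mapper/fsm"
)

func main() {
	rules := []string{"a.*.*.*.*", "a.b.*.*.*", "a.b.c.*.*", "a.b.c.d.*"}
	// Ordering honored: backtracking is always reported as needed.
	fmt.Println(fsm.TestIfNeedBacktracking(rules, false)) // true
	// Ordering disabled: these rules are supersets/subsets of each other,
	// so no backtracking is required.
	fmt.Println(fsm.TestIfNeedBacktracking(rules, true)) // false
}
```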

pkg/mapper/fsm/fsm.png (new binary file, 33 KiB; not shown)

pkg/mapper/fsm/minmax.go (new file, 30 lines)

// Copyright 2018 The Prometheus Authors
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package fsm

// min and max implementations for integers

func min(x, y int) int {
	if x < y {
		return x
	}
	return y
}

func max(x, y int) int {
	if x > y {
		return x
	}
	return y
}

pkg/mapper/mapper.go

@@ -17,10 +17,10 @@ import (
 	"fmt"
 	"io/ioutil"
 	"regexp"
-	"strings"
 	"sync"
 
 	"github.com/prometheus/client_golang/prometheus"
+	"github.com/prometheus/statsd_exporter/pkg/mapper/fsm"
 	yaml "gopkg.in/yaml.v2"
 )
 

@@ -34,15 +34,19 @@ var (
 )
 
 type mapperConfigDefaults struct {
 	TimerType           TimerType         `yaml:"timer_type"`
 	Buckets             []float64         `yaml:"buckets"`
 	Quantiles           []metricObjective `yaml:"quantiles"`
 	MatchType           MatchType         `yaml:"match_type"`
+	GlobDisableOrdering bool              `yaml:"glob_disable_ordering"`
 }
 
 type MetricMapper struct {
 	Defaults mapperConfigDefaults `yaml:"defaults"`
 	Mappings []MetricMapping      `yaml:"mappings"`
+	FSM      *fsm.FSM
+	doFSM    bool
+	doRegex  bool
 	mutex    sync.Mutex
 
 	MappingsCount prometheus.Gauge

@@ -53,8 +57,11 @@ type matchMetricType string
 type MetricMapping struct {
 	Match           string `yaml:"match"`
 	Name            string `yaml:"name"`
+	nameFormatter   *fsm.TemplateFormatter
 	regex           *regexp.Regexp
 	Labels          prometheus.Labels `yaml:"labels"`
+	labelKeys       []string
+	labelFormatters []*fsm.TemplateFormatter
 	TimerType       TimerType         `yaml:"timer_type"`
 	Buckets         []float64         `yaml:"buckets"`
 	Quantiles       []metricObjective `yaml:"quantiles"`

@@ -94,7 +101,14 @@ func (m *MetricMapper) InitFromYAMLString(fileContents string) error {
 		n.Defaults.MatchType = MatchTypeGlob
 	}
 
+	remainingMappingsCount := len(n.Mappings)
+
+	n.FSM = fsm.NewFSM([]string{string(MetricTypeCounter), string(MetricTypeGauge), string(MetricTypeTimer)},
+		remainingMappingsCount, n.Defaults.GlobDisableOrdering)
+
 	for i := range n.Mappings {
+		remainingMappingsCount--
+
 		currentMapping := &n.Mappings[i]
 
 		// check that label is correct

@@ -121,24 +135,34 @@ func (m *MetricMapper) InitFromYAMLString(fileContents string) error {
 		}
 
 		if currentMapping.MatchType == MatchTypeGlob {
+			n.doFSM = true
 			if !metricLineRE.MatchString(currentMapping.Match) {
 				return fmt.Errorf("invalid match: %s", currentMapping.Match)
 			}
-			// Translate the glob-style metric match line into a proper regex that we
-			// can use to match metrics later on.
-			metricRe := strings.Replace(currentMapping.Match, ".", "\\.", -1)
-			metricRe = strings.Replace(metricRe, "*", "([^.]*)", -1)
-			if regex, err := regexp.Compile("^" + metricRe + "$"); err != nil {
-				return fmt.Errorf("invalid match %s. cannot compile regex in mapping: %v", currentMapping.Match, err)
-			} else {
-				currentMapping.regex = regex
-			}
+
+			captureCount := n.FSM.AddState(currentMapping.Match, string(currentMapping.MatchMetricType),
+				remainingMappingsCount, currentMapping)
+
+			currentMapping.nameFormatter = fsm.NewTemplateFormatter(currentMapping.Name, captureCount)
+
+			labelKeys := make([]string, len(currentMapping.Labels))
+			labelFormatters := make([]*fsm.TemplateFormatter, len(currentMapping.Labels))
+			labelIndex := 0
+			for label, valueExpr := range currentMapping.Labels {
+				labelKeys[labelIndex] = label
+				labelFormatters[labelIndex] = fsm.NewTemplateFormatter(valueExpr, captureCount)
+				labelIndex++
+			}
+			currentMapping.labelFormatters = labelFormatters
+			currentMapping.labelKeys = labelKeys
+
 		} else {
 			if regex, err := regexp.Compile(currentMapping.Match); err != nil {
 				return fmt.Errorf("invalid regex %s in mapping: %v", currentMapping.Match, err)
 			} else {
 				currentMapping.regex = regex
 			}
+			n.doRegex = true
 		}
 
 		if currentMapping.TimerType == "" {

@@ -160,6 +184,19 @@ func (m *MetricMapper) InitFromYAMLString(fileContents string) error {
 
 	m.Defaults = n.Defaults
 	m.Mappings = n.Mappings
+	if n.doFSM {
+		var mappings []string
+		for _, mapping := range n.Mappings {
+			if mapping.MatchType == MatchTypeGlob {
+				mappings = append(mappings, mapping.Match)
+			}
+		}
+		n.FSM.BacktrackingNeeded = fsm.TestIfNeedBacktracking(mappings, n.FSM.OrderingDisabled)
+
+		m.FSM = n.FSM
+		m.doRegex = n.doRegex
+	}
+	m.doFSM = n.doFSM
 
 	if m.MappingsCount != nil {
 		m.MappingsCount.Set(float64(len(n.Mappings)))

@@ -177,10 +214,33 @@ func (m *MetricMapper) InitFromFile(fileName string) error {
 }
 
 func (m *MetricMapper) GetMapping(statsdMetric string, statsdMetricType MetricType) (*MetricMapping, prometheus.Labels, bool) {
+	// glob matching
+	if m.doFSM {
+		finalState, captures := m.FSM.GetMapping(statsdMetric, string(statsdMetricType))
+		if finalState != nil && finalState.Result != nil {
+			result := finalState.Result.(*MetricMapping)
+			result.Name = result.nameFormatter.Format(captures)
+
+			labels := prometheus.Labels{}
+			for index, formatter := range result.labelFormatters {
+				labels[result.labelKeys[index]] = formatter.Format(captures)
+			}
+			return result, labels, true
+		} else if !m.doRegex {
+			// if there's no regex match type, return immediately
+			return nil, nil, false
+		}
+	}
+
+	// regex matching
 	m.mutex.Lock()
 	defer m.mutex.Unlock()
 
 	for _, mapping := range m.Mappings {
+		// if a rule doesn't have the regex match type, the regex field is unset
+		if mapping.regex == nil {
+			continue
+		}
 		matches := mapping.regex.FindStringSubmatchIndex(statsdMetric)
 		if len(matches) == 0 {
 			continue
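
A minimal end-to-end sketch of the new glob path through `MetricMapper`, using a rule borrowed from the benchmarks below:

```go
package main

import (
	"fmt"

	"github.com/prometheus/statsd_exporter/pkg/mapper"
)

func main() {
	m := mapper.MetricMapper{}
	// One glob rule; InitFromYAMLString builds the FSM behind the scenes.
	err := m.InitFromYAMLString(`---
mappings:
- match: test.dispatcher.*.*.succeeded
  name: "dispatch_events"
  labels:
    processor: "$1"
    action: "$2"
`)
	if err != nil {
		panic(err)
	}
	mapping, labels, ok := m.GetMapping("test.dispatcher.FooProcessor.send.succeeded", mapper.MetricTypeCounter)
	// true dispatch_events map[action:send processor:FooProcessor]
	fmt.Println(ok, mapping.Name, labels)
}
```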
775
pkg/mapper/mapper_benchmark_test.go
Normal file
775
pkg/mapper/mapper_benchmark_test.go
Normal file
|
@ -0,0 +1,775 @@
|
||||||
|
// Copyright 2013 The Prometheus Authors
|
||||||
|
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||||
|
// you may not use this file except in compliance with the License.
|
||||||
|
// You may obtain a copy of the License at
|
||||||
|
//
|
||||||
|
// http://www.apache.org/licenses/LICENSE-2.0
|
||||||
|
//
|
||||||
|
// Unless required by applicable law or agreed to in writing, software
|
||||||
|
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||||
|
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||||
|
// See the License for the specific language governing permissions and
|
||||||
|
// limitations under the License.
|
||||||
|
|
||||||
|
package mapper
|
||||||
|
|
||||||
|
import (
|
||||||
|
"fmt"
|
||||||
|
"testing"
|
||||||
|
)
|
||||||
|
|
||||||
|
var (
|
||||||
|
ruleTemplateSingleMatchGlob = `
|
||||||
|
- match: metric%d.*
|
||||||
|
name: "metric_single"
|
||||||
|
labels:
|
||||||
|
name: "$1"
|
||||||
|
`
|
||||||
|
ruleTemplateSingleMatchRegex = `
|
||||||
|
- match: metric%d\.([^.]*)
|
||||||
|
name: "metric_single"
|
||||||
|
labels:
|
||||||
|
name: "$1"
|
||||||
|
`
|
||||||
|
|
||||||
|
ruleTemplateMultipleMatchGlob = `
|
||||||
|
- match: metric%d.*.*.*.*.*.*.*.*.*.*.*.*
|
||||||
|
name: "metric_multi"
|
||||||
|
labels:
|
||||||
|
name: "$1-$2-$3.$4-$5-$6.$7-$8-$9.$10-$11-$12"
|
||||||
|
`
|
||||||
|
|
||||||
|
ruleTemplateMultipleMatchRegex = `
|
||||||
|
- match: metric%d\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)
|
||||||
|
name: "metric_multi"
|
||||||
|
labels:
|
||||||
|
name: "$1-$2-$3.$4-$5-$6.$7-$8-$9.$10-$11-$12"
|
||||||
|
`
|
||||||
|
)
|
||||||
|
|
||||||
|
func duplicateRules(count int, template string) string {
|
||||||
|
rules := ""
|
||||||
|
for i := 0; i < count; i++ {
|
||||||
|
rules += fmt.Sprintf(template, i)
|
||||||
|
}
|
||||||
|
return rules
|
||||||
|
}
|
||||||
|
|
||||||
|
func BenchmarkGlob(b *testing.B) {
|
||||||
|
config := `---
|
||||||
|
mappings:
|
||||||
|
- match: test.dispatcher.*.*.succeeded
|
||||||
|
name: "dispatch_events"
|
||||||
|
labels:
|
||||||
|
processor: "$1"
|
||||||
|
action: "$2"
|
||||||
|
result: "succeeded"
|
||||||
|
job: "test_dispatcher"
|
||||||
|
- match: test.my-dispatch-host01.name.dispatcher.*.*.*
|
||||||
|
name: "host_dispatch_events"
|
||||||
|
labels:
|
||||||
|
processor: "$1"
|
||||||
|
action: "$2"
|
||||||
|
result: "$3"
|
||||||
|
job: "test_dispatcher"
|
||||||
|
- match: request_time.*.*.*.*.*.*.*.*.*.*.*.*
|
||||||
|
name: "tyk_http_request"
|
||||||
|
labels:
|
||||||
|
method_and_path: "${1}"
|
||||||
|
response_code: "${2}"
|
||||||
|
apikey: "${3}"
|
||||||
|
apiversion: "${4}"
|
||||||
|
apiname: "${5}"
|
||||||
|
apiid: "${6}"
|
||||||
|
ipv4_t1: "${7}"
|
||||||
|
ipv4_t2: "${8}"
|
||||||
|
ipv4_t3: "${9}"
|
||||||
|
ipv4_t4: "${10}"
|
||||||
|
orgid: "${11}"
|
||||||
|
oauthid: "${12}"
|
||||||
|
- match: "*.*"
|
||||||
|
name: "catchall"
|
||||||
|
labels:
|
||||||
|
first: "$1"
|
||||||
|
second: "$2"
|
||||||
|
third: "$3"
|
||||||
|
job: "-"
|
||||||
|
`
|
||||||
|
mappings := []string{
|
||||||
|
"test.dispatcher.FooProcessor.send.succeeded",
|
||||||
|
"test.my-dispatch-host01.name.dispatcher.FooProcessor.send.succeeded",
|
||||||
|
"request_time.get/threads/1/posts.200.00000000.nonversioned.discussions.a11bbcdf0ac64ec243658dc64b7100fb.172.20.0.1.12ba97b7eaa1a50001000001.",
|
||||||
|
"foo.bar",
|
||||||
|
"foo.bar.baz",
|
||||||
|
}
|
||||||
|
|
||||||
|
mapper := MetricMapper{}
|
||||||
|
err := mapper.InitFromYAMLString(config)
|
||||||
|
if err != nil {
|
||||||
|
b.Fatalf("Config load error: %s %s", config, err)
|
||||||
|
}
|
||||||
|
|
||||||
|
b.ResetTimer()
|
||||||
|
for j := 0; j < b.N; j++ {
|
||||||
|
for _, metric := range mappings {
|
||||||
|
mapper.GetMapping(metric, MetricTypeCounter)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func BenchmarkGlobNoOrdering(b *testing.B) {
|
||||||
|
config := `---
|
||||||
|
defaults:
|
||||||
|
glob_disable_ordering: true
|
||||||
|
mappings:
|
||||||
|
- match: test.dispatcher.*.*.succeeded
|
||||||
|
name: "dispatch_events"
|
||||||
|
labels:
|
||||||
|
processor: "$1"
|
||||||
|
action: "$2"
|
||||||
|
result: "succeeded"
|
||||||
|
job: "test_dispatcher"
|
||||||
|
- match: test.my-dispatch-host01.name.dispatcher.*.*.*
|
||||||
|
name: "host_dispatch_events"
|
||||||
|
labels:
|
||||||
|
processor: "$1"
|
||||||
|
action: "$2"
|
||||||
|
result: "$3"
|
||||||
|
job: "test_dispatcher"
|
||||||
|
- match: request_time.*.*.*.*.*.*.*.*.*.*.*.*
|
||||||
|
name: "tyk_http_request"
|
||||||
|
labels:
|
||||||
|
method_and_path: "${1}"
|
||||||
|
response_code: "${2}"
|
||||||
|
apikey: "${3}"
|
||||||
|
apiversion: "${4}"
|
||||||
|
apiname: "${5}"
|
||||||
|
apiid: "${6}"
|
||||||
|
ipv4_t1: "${7}"
|
||||||
|
ipv4_t2: "${8}"
|
||||||
|
ipv4_t3: "${9}"
|
||||||
|
ipv4_t4: "${10}"
|
||||||
|
orgid: "${11}"
|
||||||
|
oauthid: "${12}"
|
||||||
|
- match: "*.*"
|
||||||
|
name: "catchall"
|
||||||
|
labels:
|
||||||
|
first: "$1"
|
||||||
|
second: "$2"
|
||||||
|
third: "$3"
|
||||||
|
job: "-"
|
||||||
|
`
|
||||||
|
mappings := []string{
|
||||||
|
"test.dispatcher.FooProcessor.send.succeeded",
|
||||||
|
"test.my-dispatch-host01.name.dispatcher.FooProcessor.send.succeeded",
|
||||||
|
"request_time.get/threads/1/posts.200.00000000.nonversioned.discussions.a11bbcdf0ac64ec243658dc64b7100fb.172.20.0.1.12ba97b7eaa1a50001000001.",
|
||||||
|
"foo.bar",
|
||||||
|
"foo.bar.baz",
|
||||||
|
}
|
||||||
|
|
||||||
|
mapper := MetricMapper{}
|
||||||
|
err := mapper.InitFromYAMLString(config)
|
||||||
|
if err != nil {
|
||||||
|
b.Fatalf("Config load error: %s %s", config, err)
|
||||||
|
}
|
||||||
|
|
||||||
|
b.ResetTimer()
|
||||||
|
for j := 0; j < b.N; j++ {
|
||||||
|
for _, metric := range mappings {
|
||||||
|
mapper.GetMapping(metric, MetricTypeCounter)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func BenchmarkGlobNoOrderingWithBacktracking(b *testing.B) {
|
||||||
|
config := `---
|
||||||
|
defaults:
|
||||||
|
glob_disable_ordering: true
|
||||||
|
mappings:
|
||||||
|
- match: test.dispatcher.*.*.succeeded
|
||||||
|
name: "dispatch_events"
|
||||||
|
labels:
|
||||||
|
processor: "$1"
|
||||||
|
action: "$2"
|
||||||
|
result: "succeeded"
|
||||||
|
job: "test_dispatcher"
|
||||||
|
- match: test.dispatcher.*.received.*
|
||||||
|
name: "dispatch_events_wont_match"
|
||||||
|
labels:
|
||||||
|
processor: "$1"
|
||||||
|
action: "received"
|
||||||
|
result: "$2"
|
||||||
|
job: "test_dispatcher"
|
||||||
|
- match: test.my-dispatch-host01.name.dispatcher.*.*.*
|
||||||
|
name: "host_dispatch_events"
|
||||||
|
labels:
|
||||||
|
processor: "$1"
|
||||||
|
action: "$2"
|
||||||
|
result: "$3"
|
||||||
|
job: "test_dispatcher"
|
||||||
|
- match: request_time.*.*.*.*.*.*.*.*.*.*.*.*
|
||||||
|
name: "tyk_http_request"
|
||||||
|
labels:
|
||||||
|
method_and_path: "${1}"
|
||||||
|
response_code: "${2}"
|
||||||
|
apikey: "${3}"
|
||||||
|
apiversion: "${4}"
|
||||||
|
apiname: "${5}"
|
||||||
|
apiid: "${6}"
|
||||||
|
ipv4_t1: "${7}"
|
||||||
|
ipv4_t2: "${8}"
|
||||||
|
ipv4_t3: "${9}"
|
||||||
|
ipv4_t4: "${10}"
|
||||||
|
orgid: "${11}"
|
||||||
|
oauthid: "${12}"
|
||||||
|
- match: "*.*"
|
||||||
|
name: "catchall"
|
||||||
|
labels:
|
||||||
|
first: "$1"
|
||||||
|
second: "$2"
|
||||||
|
third: "$3"
|
||||||
|
job: "-"
|
||||||
|
`
|
||||||
|
mappings := []string{
|
||||||
|
"test.dispatcher.FooProcessor.send.succeeded",
|
||||||
|
"test.my-dispatch-host01.name.dispatcher.FooProcessor.send.succeeded",
|
||||||
|
"request_time.get/threads/1/posts.200.00000000.nonversioned.discussions.a11bbcdf0ac64ec243658dc64b7100fb.172.20.0.1.12ba97b7eaa1a50001000001.",
|
||||||
|
"foo.bar",
|
||||||
|
"foo.bar.baz",
|
||||||
|
}
|
||||||
|
|
||||||
|
mapper := MetricMapper{}
|
||||||
|
err := mapper.InitFromYAMLString(config)
|
||||||
|
if err != nil {
|
||||||
|
b.Fatalf("Config load error: %s %s", config, err)
|
||||||
|
}
|
||||||
|
|
||||||
|
b.ResetTimer()
|
||||||
|
for j := 0; j < b.N; j++ {
|
||||||
|
for _, metric := range mappings {
|
||||||
|
mapper.GetMapping(metric, MetricTypeCounter)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func BenchmarkRegex(b *testing.B) {
|
||||||
|
config := `---
|
||||||
|
defaults:
|
||||||
|
match_type: regex
|
||||||
|
mappings:
|
||||||
|
- match: test\.dispatcher\.([^.]*)\.([^.]*)\.([^.]*)
|
||||||
|
name: "dispatch_events"
|
||||||
|
labels:
|
||||||
|
processor: "$1"
|
||||||
|
action: "$2"
|
||||||
|
result: "$3"
|
||||||
|
job: "test_dispatcher"
|
||||||
|
- match: test.my-dispatch-host01.name.dispatcher\.([^.]*)\.([^.]*)\.([^.]*)
|
||||||
|
name: "host_dispatch_events"
|
||||||
|
labels:
|
||||||
|
processor: "$1"
|
||||||
|
action: "$2"
|
||||||
|
result: "$3"
|
||||||
|
job: "test_dispatcher"
|
||||||
|
- match: request_time\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)
|
||||||
|
name: "tyk_http_request"
|
||||||
|
labels:
|
||||||
|
method_and_path: "${1}"
|
||||||
|
response_code: "${2}"
|
||||||
|
apikey: "${3}"
|
||||||
|
apiversion: "${4}"
|
||||||
|
apiname: "${5}"
|
||||||
|
apiid: "${6}"
|
||||||
|
ipv4_t1: "${7}"
|
||||||
|
ipv4_t2: "${8}"
|
||||||
|
ipv4_t3: "${9}"
|
||||||
|
ipv4_t4: "${10}"
|
||||||
|
orgid: "${11}"
|
||||||
|
oauthid: "${12}"
|
||||||
|
- match: \.([^.]*)\.([^.]*)
|
||||||
|
name: "catchall"
|
||||||
|
labels:
|
||||||
|
first: "$1"
|
||||||
|
second: "$2"
|
||||||
|
third: "$3"
|
||||||
|
job: "-"
|
||||||
|
`
|
||||||
|
mappings := []string{
|
||||||
|
"test.dispatcher.FooProcessor.send.succeeded",
|
||||||
|
"test.my-dispatch-host01.name.dispatcher.FooProcessor.send.succeeded",
|
||||||
|
"request_time.get/threads/1/posts.200.00000000.nonversioned.discussions.a11bbcdf0ac64ec243658dc64b7100fb.172.20.0.1.12ba97b7eaa1a50001000001.",
|
||||||
|
"foo.bar",
|
||||||
|
"foo.bar.baz",
|
||||||
|
}
|
||||||
|
|
||||||
|
mapper := MetricMapper{}
|
||||||
|
err := mapper.InitFromYAMLString(config)
|
||||||
|
if err != nil {
|
||||||
|
b.Fatalf("Config load error: %s %s", config, err)
|
||||||
|
}
|
||||||
|
|
||||||
|
b.ResetTimer()
|
||||||
|
for j := 0; j < b.N; j++ {
|
||||||
|
for _, metric := range mappings {
|
||||||
|
mapper.GetMapping(metric, MetricTypeCounter)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func BenchmarkGlobSingleMatch(b *testing.B) {
|
||||||
|
config := `---
|
||||||
|
mappings:
|
||||||
|
- match: metric.*
|
||||||
|
name: "metric_one"
|
||||||
|
labels:
|
||||||
|
name: "$1"
|
||||||
|
`
|
||||||
|
mappings := []string{
|
||||||
|
"metric.aaa",
|
||||||
|
"metric.bbb",
|
||||||
|
}
|
||||||
|
|
||||||
|
mapper := MetricMapper{}
|
||||||
|
err := mapper.InitFromYAMLString(config)
|
||||||
|
if err != nil {
|
||||||
|
b.Fatalf("Config load error: %s %s", config, err)
|
||||||
|
}
|
||||||
|
|
||||||
|
b.ResetTimer()
|
||||||
|
for j := 0; j < b.N; j++ {
|
||||||
|
for _, metric := range mappings {
|
||||||
|
mapper.GetMapping(metric, MetricTypeCounter)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func BenchmarkRegexSingleMatch(b *testing.B) {
|
||||||
|
config := `---
|
||||||
|
mappings:
|
||||||
|
- match: metric\.([^.]*)
|
||||||
|
name: "metric_one"
|
||||||
|
match_type: regex
|
||||||
|
labels:
|
||||||
|
name: "$1"
|
||||||
|
`
|
||||||
|
mappings := []string{
|
||||||
|
"metric.aaa",
|
||||||
|
"metric.bbb",
|
||||||
|
}
|
||||||
|
|
||||||
|
mapper := MetricMapper{}
|
||||||
|
err := mapper.InitFromYAMLString(config)
|
||||||
|
if err != nil {
|
||||||
|
b.Fatalf("Config load error: %s %s", config, err)
|
||||||
|
}
|
||||||
|
|
||||||
|
b.ResetTimer()
|
||||||
|
for j := 0; j < b.N; j++ {
|
||||||
|
for _, metric := range mappings {
|
||||||
|
mapper.GetMapping(metric, MetricTypeCounter)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func BenchmarkGlobMultipleCaptures(b *testing.B) {
	config := `---
mappings:
- match: metric.*.*.*.*.*.*.*.*.*.*.*.*
  name: "metric_multi"
  labels:
    name: "$1-$2-$3.$4-$5-$6.$7-$8-$9.$10-$11-$12"
`
	mappings := []string{
		"metric.a.b.c.d.e.f.g.h.i.j.k.l",
	}

	mapper := MetricMapper{}
	err := mapper.InitFromYAMLString(config)
	if err != nil {
		b.Fatalf("Config load error: %s %s", config, err)
	}

	b.ResetTimer()
	for j := 0; j < b.N; j++ {
		for _, metric := range mappings {
			mapper.GetMapping(metric, MetricTypeCounter)
		}
	}
}

func BenchmarkRegexMultipleCaptures(b *testing.B) {
	config := `---
mappings:
- match: metric\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)
  name: "metric_multi"
  match_type: regex
  labels:
    name: "$1-$2-$3.$4-$5-$6.$7-$8-$9.$10-$11-$12"
`
	mappings := []string{
		"metric.a.b.c.d.e.f.g.h.i.j.k.l",
	}

	mapper := MetricMapper{}
	err := mapper.InitFromYAMLString(config)
	if err != nil {
		b.Fatalf("Config load error: %s %s", config, err)
	}

	b.ResetTimer()
	for j := 0; j < b.N; j++ {
		for _, metric := range mappings {
			mapper.GetMapping(metric, MetricTypeCounter)
		}
	}
}

func BenchmarkGlobMultipleCapturesNoFormat(b *testing.B) {
	config := `---
mappings:
- match: metric.*.*.*.*.*.*.*.*.*.*.*.*
  name: "metric_multi"
  labels:
    name: "not_relevant"
`
	mappings := []string{
		"metric.a.b.c.d.e.f.g.h.i.j.k.l",
	}

	mapper := MetricMapper{}
	err := mapper.InitFromYAMLString(config)
	if err != nil {
		b.Fatalf("Config load error: %s %s", config, err)
	}

	b.ResetTimer()
	for j := 0; j < b.N; j++ {
		for _, metric := range mappings {
			mapper.GetMapping(metric, MetricTypeCounter)
		}
	}
}

func BenchmarkRegexMultipleCapturesNoFormat(b *testing.B) {
	config := `---
mappings:
- match: metric\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)
  name: "metric_multi"
  match_type: regex
  labels:
    name: "not_relevant"
`
	mappings := []string{
		"metric.a.b.c.d.e.f.g.h.i.j.k.l",
	}

	mapper := MetricMapper{}
	err := mapper.InitFromYAMLString(config)
	if err != nil {
		b.Fatalf("Config load error: %s %s", config, err)
	}

	b.ResetTimer()
	for j := 0; j < b.N; j++ {
		for _, metric := range mappings {
			mapper.GetMapping(metric, MetricTypeCounter)
		}
	}
}

func BenchmarkGlobMultipleCapturesDifferentLabels(b *testing.B) {
	config := `---
mappings:
- match: metric.*.*.*.*.*.*.*.*.*.*.*.*
  name: "metric_multi"
  labels:
    label1: "$1"
    label2: "$2"
    label3: "$3"
    label4: "$4"
    label5: "$5"
    label6: "$6"
    label7: "$7"
    label8: "$8"
    label9: "$9"
    label10: "$10"
    label11: "$11"
    label12: "$12"
`
	mappings := []string{
		"metric.a.b.c.d.e.f.g.h.i.j.k.l",
	}

	mapper := MetricMapper{}
	err := mapper.InitFromYAMLString(config)
	if err != nil {
		b.Fatalf("Config load error: %s %s", config, err)
	}

	b.ResetTimer()
	for j := 0; j < b.N; j++ {
		for _, metric := range mappings {
			mapper.GetMapping(metric, MetricTypeCounter)
		}
	}
}

func BenchmarkRegexMultipleCapturesDifferentLabels(b *testing.B) {
	config := `---
mappings:
- match: metric\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)\.([^.]*)
  name: "metric_multi"
  match_type: regex
  labels:
    label1: "$1"
    label2: "$2"
    label3: "$3"
    label4: "$4"
    label5: "$5"
    label6: "$6"
    label7: "$7"
    label8: "$8"
    label9: "$9"
    label10: "$10"
    label11: "$11"
    label12: "$12"
`
	mappings := []string{
		"metric.a.b.c.d.e.f.g.h.i.j.k.l",
	}

	mapper := MetricMapper{}
	err := mapper.InitFromYAMLString(config)
	if err != nil {
		b.Fatalf("Config load error: %s %s", config, err)
	}

	b.ResetTimer()
	for j := 0; j < b.N; j++ {
		for _, metric := range mappings {
			mapper.GetMapping(metric, MetricTypeCounter)
		}
	}
}

func BenchmarkGlob10Rules(b *testing.B) {
	config := `---
mappings:` + duplicateRules(100, ruleTemplateSingleMatchGlob)
	mappings := []string{
		"metric100.a",
	}

	mapper := MetricMapper{}
	err := mapper.InitFromYAMLString(config)
	if err != nil {
		b.Fatalf("Config load error: %s %s", config, err)
	}

	b.ResetTimer()
	for j := 0; j < b.N; j++ {
		for _, metric := range mappings {
			mapper.GetMapping(metric, MetricTypeCounter)
		}
	}
}

func BenchmarkRegex10RulesAverage(b *testing.B) {
	config := `---
defaults:
  match_type: regex
mappings:` + duplicateRules(10, ruleTemplateSingleMatchRegex)
	mappings := []string{
		"metric5.a",
	}

	mapper := MetricMapper{}
	err := mapper.InitFromYAMLString(config)
	if err != nil {
		b.Fatalf("Config load error: %s %s", config, err)
	}

	b.ResetTimer()
	for j := 0; j < b.N; j++ {
		for _, metric := range mappings {
			mapper.GetMapping(metric, MetricTypeCounter)
		}
	}
}

func BenchmarkGlob100Rules(b *testing.B) {
	config := `---
mappings:` + duplicateRules(100, ruleTemplateSingleMatchGlob)
	mappings := []string{
		"metric100.a",
	}

	mapper := MetricMapper{}
	err := mapper.InitFromYAMLString(config)
	if err != nil {
		b.Fatalf("Config load error: %s %s", config, err)
	}

	b.ResetTimer()
	for j := 0; j < b.N; j++ {
		for _, metric := range mappings {
			mapper.GetMapping(metric, MetricTypeCounter)
		}
	}
}

func BenchmarkGlob100RulesNoMatch(b *testing.B) {
	config := `---
mappings:` + duplicateRules(100, ruleTemplateSingleMatchGlob)
	mappings := []string{
		"metricnomatchy.a",
	}

	mapper := MetricMapper{}
	err := mapper.InitFromYAMLString(config)
	if err != nil {
		b.Fatalf("Config load error: %s %s", config, err)
	}

	b.ResetTimer()
	for j := 0; j < b.N; j++ {
		for _, metric := range mappings {
			mapper.GetMapping(metric, MetricTypeCounter)
		}
	}
}

func BenchmarkGlob100RulesNoOrderingNoMatch(b *testing.B) {
	config := `---
defaults:
  glob_disable_ordering: true
mappings:` + duplicateRules(100, ruleTemplateSingleMatchGlob)
	mappings := []string{
		"metricnomatchy.a",
	}

	mapper := MetricMapper{}
	err := mapper.InitFromYAMLString(config)
	if err != nil {
		b.Fatalf("Config load error: %s %s", config, err)
	}

	b.ResetTimer()
	for j := 0; j < b.N; j++ {
		for _, metric := range mappings {
			mapper.GetMapping(metric, MetricTypeCounter)
		}
	}
}

func BenchmarkRegex100RulesAverage(b *testing.B) {
	config := `---
defaults:
  match_type: regex
mappings:` + duplicateRules(100, ruleTemplateSingleMatchRegex)
	mappings := []string{
		"metric50.a",
	}

	mapper := MetricMapper{}
	err := mapper.InitFromYAMLString(config)
	if err != nil {
		b.Fatalf("Config load error: %s %s", config, err)
	}

	b.ResetTimer()
	for j := 0; j < b.N; j++ {
		for _, metric := range mappings {
			mapper.GetMapping(metric, MetricTypeCounter)
		}
	}
}

func BenchmarkRegex100RulesWorst(b *testing.B) {
	config := `---
defaults:
  match_type: regex
mappings:` + duplicateRules(100, ruleTemplateSingleMatchRegex)
	mappings := []string{
		"metric100.a",
	}

	mapper := MetricMapper{}
	err := mapper.InitFromYAMLString(config)
	if err != nil {
		b.Fatalf("Config load error: %s %s", config, err)
	}

	b.ResetTimer()
	for j := 0; j < b.N; j++ {
		for _, metric := range mappings {
			mapper.GetMapping(metric, MetricTypeCounter)
		}
	}
}

func BenchmarkGlob100RulesMultipleCaptures(b *testing.B) {
	config := `---
mappings:` + duplicateRules(100, ruleTemplateMultipleMatchGlob)
	mappings := []string{
		"metric50.a.b.c.d.e.f.g.h.i.j.k.l",
	}

	mapper := MetricMapper{}
	err := mapper.InitFromYAMLString(config)
	if err != nil {
		b.Fatalf("Config load error: %s %s", config, err)
	}

	b.ResetTimer()
	for j := 0; j < b.N; j++ {
		for _, metric := range mappings {
			mapper.GetMapping(metric, MetricTypeCounter)
		}
	}
}

func BenchmarkRegex100RulesMultipleCapturesAverage(b *testing.B) {
	config := `---
defaults:
  match_type: regex
mappings:` + duplicateRules(100, ruleTemplateMultipleMatchRegex)
	mappings := []string{
		"metric50.a.b.c.d.e.f.g.h.i.j.k.l",
	}

	mapper := MetricMapper{}
	err := mapper.InitFromYAMLString(config)
	if err != nil {
		b.Fatalf("Config load error: %s %s", config, err)
	}

	b.ResetTimer()
	for j := 0; j < b.N; j++ {
		for _, metric := range mappings {
			mapper.GetMapping(metric, MetricTypeCounter)
		}
	}
}

func BenchmarkRegex100RulesMultipleCapturesWorst(b *testing.B) {
	config := `---
defaults:
  match_type: regex
mappings:` + duplicateRules(100, ruleTemplateMultipleMatchRegex)
	mappings := []string{
		"metric100.a.b.c.d.e.f.g.h.i.j.k.l",
	}

	mapper := MetricMapper{}
	err := mapper.InitFromYAMLString(config)
	if err != nil {
		b.Fatalf("Config load error: %s %s", config, err)
	}

	b.ResetTimer()
	for j := 0; j < b.N; j++ {
		for _, metric := range mappings {
			mapper.GetMapping(metric, MetricTypeCounter)
		}
	}
}

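The rule-count benchmarks above build their configs with a `duplicateRules` helper and `ruleTemplate…` constants that are defined earlier in this benchmark file and are not visible in this excerpt. As a rough, hypothetical sketch of the shape such a helper takes (the template body, constant name, and package name here are illustrative, not copied from the diff), it simply expands a numbered rule template N times:

package mapper // illustrative; the sketch assumes it sits next to the mapper code

import "fmt"

// Illustrative only: a numbered glob rule template; the real constants live
// earlier in the benchmark file and may differ.
const ruleTemplateSingleMatchGlob = `
- match: metric%d.*
  name: "metric_%d"
  labels:
    name: "$1"
`

// duplicateRules expands the template count times, one rule per index, so
// duplicateRules(100, ruleTemplateSingleMatchGlob) would yield rules for
// metric0 through metric99.
func duplicateRules(count int, template string) string {
	rules := ""
	for i := 0; i < count; i++ {
		rules += fmt.Sprintf(template, i, i)
	}
	return rules
}

With a template of this shape, the NoMatch benchmarks' `metricnomatchy.a` input never matches any generated rule, which is the case they measure. The benchmarks run with the usual Go tooling, e.g. `go test -bench . -run '^$'` in the package containing these files (the `-run '^$'` part skips the unit tests).
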
@@ -139,6 +139,94 @@ mappings:
 				},
 			},
 		},
+		// Config with backtracking
+		{
+			config: `
+defaults:
+  glob_disable_ordering: true
+mappings:
+- match: backtrack.*.bbb
+  name: "testb"
+  labels:
+    label: "${1}_foo"
+- match: backtrack.justatest.aaa
+  name: "testa"
+  labels:
+    label: "${1}_foo"
+`,
+			mappings: mappings{
+				"backtrack.good.bbb": {
+					name: "testb",
+					labels: map[string]string{
+						"label": "good_foo",
+					},
+				},
+				"backtrack.justatest.bbb": {
+					name: "testb",
+					labels: map[string]string{
+						"label": "justatest_foo",
+					},
+				},
+			},
+		},
+		// Config with super sets, disables ordering
+		{
+			config: `
+defaults:
+  glob_disable_ordering: true
+mappings:
+- match: noorder.*.*
+  name: "testa"
+  labels:
+    label: "${1}_foo"
+- match: noorder.*.bbb
+  name: "testb"
+  labels:
+    label: "${1}_foo"
+- match: noorder.ccc.bbb
+  name: "testc"
+  labels:
+    label: "ccc_foo"
+`,
+			mappings: mappings{
+				"noorder.good.bbb": {
+					name: "testb",
+					labels: map[string]string{
+						"label": "good_foo",
+					},
+				},
+				"noorder.ccc.bbb": {
+					name: "testc",
+					labels: map[string]string{
+						"label": "ccc_foo",
+					},
+				},
+			},
+		},
+		// Config with super sets, keeps ordering
+		{
+			config: `
+defaults:
+  glob_disable_ordering: false
+mappings:
+- match: order.*.*
+  name: "testa"
+  labels:
+    label: "${1}_foo"
+- match: order.*.bbb
+  name: "testb"
+  labels:
+    label: "${1}_foo"
+`,
+			mappings: mappings{
+				"order.good.bbb": {
+					name: "testa",
+					labels: map[string]string{
+						"label": "good_foo",
+					},
+				},
+			},
+		},
 		// Config with bad regex reference.
 		{
 			config: `---

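Of the scenarios added above, the backtracking one is the interesting case for an FSM-based glob matcher: with ordering disabled, the input `backtrack.justatest.bbb` can first follow the literal `justatest` branch, which only leads to a rule ending in `.aaa`, so the matcher has to fall back to the `*` branch to reach `.bbb`. A minimal, hypothetical standalone test for just that expectation (assuming it lives in the same test file as the scenarios, and using only the `MetricMapper` API already exercised in this diff) might look like:

// Hypothetical sketch, not part of this diff: asserts the backtracking case
// in isolation; config and expected mapping are copied from the scenario
// table above.
func TestGlobBacktrackingSketch(t *testing.T) {
	config := `
defaults:
  glob_disable_ordering: true
mappings:
- match: backtrack.*.bbb
  name: "testb"
  labels:
    label: "${1}_foo"
- match: backtrack.justatest.aaa
  name: "testa"
  labels:
    label: "${1}_foo"
`
	mapper := MetricMapper{}
	if err := mapper.InitFromYAMLString(config); err != nil {
		t.Fatalf("config load error: %s", err)
	}
	// "justatest" matches the literal branch of the second rule, but only the
	// wildcard branch of the first rule can consume the trailing ".bbb".
	m, labels, present := mapper.GetMapping("backtrack.justatest.bbb", MetricTypeCounter)
	if !present || m.Name != "testb" || labels["label"] != "justatest_foo" {
		t.Fatalf("expected testb with label justatest_foo, got %v %v (present=%v)", m, labels, present)
	}
}
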
@@ -483,9 +571,10 @@ mappings:
 			t.Fatalf("%d. Expected bad config, but loaded ok: %s", i, scenario.config)
 		}
 
-		var dummyMetricType MetricType = ""
 		for metric, mapping := range scenario.mappings {
-			m, labels, present := mapper.GetMapping(metric, dummyMetricType)
+			// exporter will call mapper.GetMapping with valid MetricType
+			// so we also pass a sane MetricType in testing
+			m, labels, present := mapper.GetMapping(metric, MetricTypeCounter)
 			if present && mapping.name != "" && m.Name != mapping.name {
 				t.Fatalf("%d.%q: Expected name %v, got %v", i, metric, m.Name, mapping.name)
 			}