What happens if VictoriaMetrics' scrape_timeout is greater than scrape_interval?
Background

When configuring scraping, it is easy to accidentally set scrape_timeout to a value larger than scrape_interval. What happens then? When vmagent scrapes a target, it never starts a new scrape while the previous one is still running, because that would only make unfinished scrapes pile up; instead, if the previous scrape has not finished by the time the next one is due, it is terminated before the next scrape starts. A scrape that does not finish in time, or that exceeds scrape_timeout, is marked as failed. So what actually happens when a scrape_config's scrape_timeout is greater than its scrape_interval?
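To make this concrete, here is a minimal Go sketch of such a sequential scrape loop. It is not vmagent's actual implementation; scrapeOnce, the target URL and the hard-coded durations are made up for illustration. Each scrape runs under a context bounded by the effective scrape_timeout, and because the loop is strictly sequential, a slow target can never cause overlapping scrapes; it can only cause the current scrape to be cancelled and marked as failed.

package main

import (
	"context"
	"fmt"
	"time"
)

// scrapeOnce is a stand-in for an HTTP scrape of a single target.
func scrapeOnce(ctx context.Context, target string) error {
	select {
	case <-time.After(12 * time.Second): // pretend the target responds slowly
		return nil
	case <-ctx.Done():
		return ctx.Err() // cancelled: scrape_timeout exceeded
	}
}

func main() {
	scrapeInterval := 10 * time.Second
	scrapeTimeout := 30 * time.Second // misconfigured: larger than the interval

	// Cap the effective timeout at the interval so one scrape
	// can never run past the point where the next one is due.
	if scrapeTimeout > scrapeInterval {
		scrapeTimeout = scrapeInterval
	}

	ticker := time.NewTicker(scrapeInterval)
	defer ticker.Stop()
	for range ticker.C {
		ctx, cancel := context.WithTimeout(context.Background(), scrapeTimeout)
		err := scrapeOnce(ctx, "http://localhost:8080/metrics")
		cancel()
		if err != nil {
			fmt.Println("scrape failed (would be reported as up=0):", err)
		} else {
			fmt.Println("scrape succeeded")
		}
	}
}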
What the official documentation says

https://docs.victoriametrics.com/sd_configs.html#scrape_configs

# scrape_timeout is an optional timeout when scraping the targets.
# By default, the scrape_timeout specified in global section is used.
# See https://prometheus.io/docs/prometheus/latest/configuration/configuration/#configuration-file
# If global section doesn't contain the scrape_timeout option,
# then 10 seconds interval is used.
# Example values:
# - 30s - 30 seconds
# - 2m - 2 minutes
# The scrape_timeout cannot exceed the scrape_interval.
# The scrape_timeout can be set on a per-target basis by specifying __scrape_timeout__
# label during target relabeling phase.
# See https://docs.victoriametrics.com/vmagent.html#relabeling
# scrape_timeout: <duration>

How the source code handles it
// GetScrapeConfigs returns the scrape configurations.
func (c *Config) GetScrapeConfigs() ([]*ScrapeConfig, error) {
	scfgs := make([]*ScrapeConfig, len(c.ScrapeConfigs))

	jobNames := map[string]string{}
	for i, scfg := range c.ScrapeConfigs {
		// We do these checks for library users that would not call Validate in
		// Unmarshal.
		if err := scfg.Validate(c.GlobalConfig.ScrapeInterval, c.GlobalConfig.ScrapeTimeout); err != nil {
			return nil, err
		}
		......
	}
}

func (c *ScrapeConfig) Validate(defaultInterval, defaultTimeout model.Duration) error {
	if c == nil {
		return errors.New("empty or null scrape config section")
	}
	// First set the correct scrape interval, then check that the timeout
	// (inferred or explicit) is not greater than that.
	if c.ScrapeInterval == 0 {
		c.ScrapeInterval = defaultInterval
	}
	if c.ScrapeTimeout > c.ScrapeInterval {
		return fmt.Errorf("scrape timeout greater than scrape interval for scrape config with job name %q", c.JobName)
	}
	if c.ScrapeTimeout == 0 {
		if defaultTimeout > c.ScrapeInterval {
			c.ScrapeTimeout = c.ScrapeInterval
		} else {
			c.ScrapeTimeout = defaultTimeout
		}
	}
	return nil
}
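To see these rules in action, here is a small self-contained sketch. The scrapeConfig type and its validate method are simplified stand-ins written for this post, not the upstream types, but validate mirrors the decision logic of ScrapeConfig.Validate quoted above.

package main

import (
	"errors"
	"fmt"
	"time"
)

// scrapeConfig is a stripped-down stand-in for the real ScrapeConfig struct.
type scrapeConfig struct {
	JobName        string
	ScrapeInterval time.Duration // 0 means "not set on the job"
	ScrapeTimeout  time.Duration // 0 means "not set on the job"
}

// validate mirrors the decision logic of ScrapeConfig.Validate shown above.
func (c *scrapeConfig) validate(defaultInterval, defaultTimeout time.Duration) error {
	if c.ScrapeInterval == 0 {
		c.ScrapeInterval = defaultInterval
	}
	if c.ScrapeTimeout > c.ScrapeInterval {
		return errors.New("scrape timeout greater than scrape interval for job " + c.JobName)
	}
	if c.ScrapeTimeout == 0 {
		if defaultTimeout > c.ScrapeInterval {
			c.ScrapeTimeout = c.ScrapeInterval // cap the inherited global timeout
		} else {
			c.ScrapeTimeout = defaultTimeout // inherit the global timeout as-is
		}
	}
	return nil
}

func main() {
	globalInterval, globalTimeout := 15*time.Second, 30*time.Second

	// Case 1: the job sets scrape_timeout explicitly and it exceeds its scrape_interval -> error.
	a := &scrapeConfig{JobName: "a", ScrapeInterval: 10 * time.Second, ScrapeTimeout: 20 * time.Second}
	fmt.Println(a.validate(globalInterval, globalTimeout))

	// Case 2: the job leaves scrape_timeout unset and the global timeout exceeds its interval -> capped.
	b := &scrapeConfig{JobName: "b", ScrapeInterval: 10 * time.Second}
	fmt.Println(b.validate(globalInterval, globalTimeout), b.ScrapeTimeout) // <nil> 10s

	// Case 3: the job sets neither value -> it inherits the global interval, and the timeout is capped at it.
	c := &scrapeConfig{JobName: "c"}
	fmt.Println(c.validate(globalInterval, globalTimeout), c.ScrapeInterval, c.ScrapeTimeout) // <nil> 15s 15s
}

Running this prints an error for job a, while jobs b and c end up with an effective scrape_timeout no larger than their scrape_interval.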
Conclusion

If a job explicitly sets scrape_timeout to a value greater than its scrape_interval, config validation fails with an error. If the job does not set scrape_timeout, it inherits the global one: when the global scrape_timeout is greater than the job's scrape_interval, scrape_timeout is capped at the scrape_interval; otherwise the job's scrape_timeout is set to the global scrape_timeout. Likewise, a job that does not set scrape_interval inherits the global scrape_interval.