This commit adds a new job filter to the gearman client so that jobs can be
filtered based on the build queue. This is used for the subunit jobs, which
we don't want to run for check queue builds.
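A minimal sketch of the idea in Python; the 'build-queue-filter' key and the
event field name are illustrative assumptions, not the exact implementation:

    # Only submit a processing job when the build queue matches the configured
    # filter (e.g. only "gate" builds, never "check" builds).
    def queue_matches(fileopts, event):
        queue_filter = fileopts.get('build-queue-filter')  # assumed config key
        if not queue_filter:
            return True                                    # no filter: always submit
        return event.get('build_queue') == queue_filter    # assumed event field

    print(queue_matches({'build-queue-filter': 'gate'}, {'build_queue': 'check'}))
    # False, so no subunit job is submitted for the check build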
Change-Id: If81fe98d8d67bb718c53a963695a7d06f5f6625d
This adds a new gearman worker to process the subunit files from
the gate job runs. It uses subunit2sql to connect to a SQL server
and process the data from the subunit file. The log-gearman-client
is modified to allow pushing subunit jobs to gearman, and the
worker model for processing logs is borrowed to process the
subunit files.
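Roughly, the per-job handling boils down to the following sketch; the
CLI-style invocation and paths are illustrative assumptions (the worker may
call the subunit2sql library directly instead):

    import subprocess

    def process_subunit_file(subunit_path, db_connection):
        """Feed a subunit stream to subunit2sql, which stores results in SQL."""
        subprocess.check_call(
            ['subunit2sql', '--database-connection', db_connection, subunit_path])

    process_subunit_file('/tmp/testrepository.subunit',
                         'mysql+pymysql://subunit:secret@localhost/subunit2sql')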
Change-Id: I83103eb6afc22d91f916583c36c0e956c23a64b3
We are leaking file descriptors in our log worker processes because we
are not catching all possible errors, leaving some cleanup actions undone.
Catch errors more aggressively so that all cleanup happens.
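The shape of the fix is roughly the following (names are illustrative):

    import gzip
    import logging

    def process_log(path):
        raw = gzip.open(path, 'rb')
        try:
            for line in raw:
                handle_line(line)      # hypothetical per-line processing
        except Exception:
            logging.exception("Failed to process %s", path)
        finally:
            raw.close()                # cleanup always happens, no fd leak

    def handle_line(line):
        pass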
Change-Id: I7a73a36c6fc42d4eba636cf36c8cfffcea48a318
According to https://docs.python.org/3/howto/pyporting.html the
except syntax changed in Python 3.x. The new "except ... as ..." syntax
is usable with Python >= 2.6 and should be preferred for Python 3
compatibility.
Enabled hacking check H231.
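For reference, a minimal before/after of the except clause that H231 flags:

    # Old, Python 2-only spelling (flagged by H231, a syntax error on Python 3):
    #     except ValueError, exc:
    # New spelling, valid on Python >= 2.6 and Python 3:
    try:
        int('not a number')
    except ValueError as exc:
        print('caught: %s' % exc)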
Change-Id: I4c20a04bc7732efc2d4bbcbc3d285107b244e5fa
The OpenStack logs are full of various IDs and UUIDs, but none of them is
uniquely special when it comes to filtering. Instead, replace each ID with
a token, making CRM114's life much easier.
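A hypothetical sketch of the tokenization; the regex and token string are
illustrative only:

    import re

    UUID_RE = re.compile(
        r'[0-9a-f]{8}-?[0-9a-f]{4}-?[0-9a-f]{4}-?[0-9a-f]{4}-?[0-9a-f]{12}',
        re.IGNORECASE)

    def tokenize_ids(line):
        # Collapse UUID-like strings into one token so CRM114 sees a single
        # feature instead of millions of unique IDs.
        return UUID_RE.sub('_UUID_TOKEN_', line)

    print(tokenize_ids('Deleting instance 2f1ab9cc-5f54-4a9e-9a62-3f9d0d9fcb11'))
    # Deleting instance _UUID_TOKEN_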
Change-Id: Id9b430c0d31889b89e4e0c1790a2405d73f501b5
We are currently using a lot of wildcard searches in Elasticsearch, which
are slow. Provide better field data so that we can replace those
wildcard searches with filters. In particular, add a short uuid field and
make the filename tag field the basename of the file path so that grenade
and non-grenade files all end up with the same tags.
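A sketch of the kind of field derivation described above; the field names
here are illustrative, not the exact mapping:

    import os
    import re

    UUID_RE = re.compile(
        r'[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}')

    def extra_fields(file_path, message):
        # e.g. 'logs/new/screen-n-cpu.txt' and 'logs/screen-n-cpu.txt' both
        # get the tag 'screen-n-cpu.txt'.
        fields = {'filename_tag': os.path.basename(file_path)}
        match = UUID_RE.search(message)
        if match:
            fields['short_uuid'] = match.group(0)[:8]   # cheap exact-match filter
        return fields

    print(extra_fields('logs/new/screen-n-cpu.txt',
                       'req 2f1ab9cc-5f54-4a9e-9a62-3f9d0d9fcb11 accepted'))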
Change-Id: If558017fceae96bcf197e611ab5cac1cfe7ae9bf
Have the log-gearman-client (aka jenkins-log-client) initialize
the statsd parameters when starting the geard server. Also, make
sure that the Python statsd package is installed on the host.
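A hedged sketch of the client-side change; the statsd keyword arguments
passed to gear.Server are an assumption about the gear API and should be
checked against the installed version:

    import os
    import gear

    server = gear.Server(
        4730,
        statsd_host=os.environ.get('STATSD_HOST'),              # assumed kwargs
        statsd_port=int(os.environ.get('STATSD_PORT', 8125)),
        statsd_prefix='logstash.geard')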
Change-Id: I04fe1a7609f08bc710891b6a3b92d0f4d156d86c
If there is an exception while filtering a log event, handle it by removing
the filter and continuing to process the remaining log events for the
associated file. This prevents non-filter data from being lost when a
filter raises an exception.
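The control flow looks roughly like this sketch (filters are shown as plain
callables for illustration):

    import logging

    def filter_events(events, filters):
        active = list(filters)
        for event in events:
            out = event
            for f in list(active):
                try:
                    out = f(out)
                except Exception:
                    logging.exception("Removing broken filter %r", f)
                    active.remove(f)        # keep the rest of the file flowing
            yield out

    def broken(event):
        raise ValueError("boom")

    def upper(event):
        return event.upper()

    print(list(filter_events(['line one', 'line two'], [broken, upper])))
    # ['LINE ONE', 'LINE TWO']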
Change-Id: I65141daf21a873096829c41fdc2c77cbeecde2e3
CRM 114 is forked from the gearman worker processes and as a result
inherits open fds for log files and TCP connections. CRM 114 should be
isolated from these fds so that it doesn't crash when they change
unexpectedly. Close the fds using the subprocess.Popen close_fds flag.
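In code the change is essentially the close_fds flag; the crm invocation
shown here is illustrative only:

    import subprocess

    proc = subprocess.Popen(
        ['crm', '-u', '/var/lib/crm114', 'classify.crm'],   # hypothetical args
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        close_fds=True)   # don't inherit the worker's log-file/TCP descriptors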
Change-Id: I4fbdf3564771be7d7a7e4c518e571634de576253
Add grenade new/ and old/ logs to logstash. To do this without tripling
the HTTP GETs for every finished job, add a job filter to the log gearman
client that, when present, only attempts to grab files if the job name
matches the filter.
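A minimal sketch of the job filter check; the 'job-filter' key is an
assumption for illustration:

    import re

    def want_file(fileopts, job_name):
        job_filter = fileopts.get('job-filter')     # assumed config key
        if not job_filter:
            return True                             # no filter: always fetch
        return re.search(job_filter, job_name) is not None

    print(want_file({'job-filter': 'grenade'}, 'gate-grenade-dsvm'))  # True
    print(want_file({'job-filter': 'grenade'}, 'gate-tempest-dsvm'))  # False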
Change-Id: Ia33722bf71d482f2fd6b655b28090a10bf46af54
Please see the corresponding review for zmq-event-publisher:
https://review.openstack.org/#/c/67495/
This will help track down Jenkins host/slave issues.
Change-Id: I660252dc79f074b52587298120b4d6ceeedaf9a3
Since some bugs are branch-specific, we want to write logstash queries
that use the branch, so log ZUUL_BRANCH as build_branch in logstash.
From zuul's launchers doc:
ZUUL_BRANCH: The target branch for the change that triggered this build
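The addition amounts to copying the parameter into the output fields,
roughly:

    def build_fields(parameters):
        # 'parameters' is the dict of build parameters from the finished-job
        # event; only the branch handling is shown here.
        fields = {}
        if 'ZUUL_BRANCH' in parameters:
            fields['build_branch'] = parameters['ZUUL_BRANCH']
        return fields

    print(build_fields({'ZUUL_BRANCH': 'stable/icehouse'}))
    # {'build_branch': 'stable/icehouse'}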
Change-Id: Ic408afb235be5716231c663616c17a98ef6f8870
The zmq publisher already has the computer/node name information as of
commit openstack-infra/zmq-event-publisher 36ca349; we just need to pull
it out of the event and pass it along as a new field.
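Roughly, the client-side change looks like this; the payload layout is an
assumption based on the description above:

    import json

    def node_from_event(raw):
        event = json.loads(raw)
        build = event.get('build', {})
        return build.get('node_name')     # assumed key set by zmq-event-publisher

    raw = '{"name": "gate-tempest-dsvm", "build": {"node_name": "bare-precise-123"}}'
    print(node_from_event(raw))           # bare-precise-123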
Change-Id: Iddefdf74ddf170eaafcd82c5e1f5b0389651cf89
Separate the Jenkins log client and worker bits into a new module
called log_processor with ::client and ::worker classes.
Instantiate two workers on each logstash worker node.
Change-Id: I7cfec410983c25633e6b555f22a85e9435884cfb