Hello Everyone
Welcome back to Old Logs New Tricks!
In today's post we're going to talk about an issue I ran into recently that I thought would be worth posting about.
Here's the message I got from an indexer after we had some architecture issues with the underlying hardware Splunk was running on (all images and messages are sanitized for security reasons... of course):
Failed to register with cluster master reason: failed method=POST path=/services/cluster/master/peers/?output_mode=json master=xxx.xxx.xxx:8089 rv=0 gotConnectionError=0 gotUnexpectedStatusCode=1 actual_response_code=500 expected_response_code=2xx status_line="Internal Server Error" socket_error="No error" remote_error=Cannot add peer=xxx.xxx.xxx.xxx mgmtport=8089 (reason: bucket already added as clustered, peer attempted to add again as standalone. guid=12345A67-BC8D-9E0F-G1H2-34I56J7K8L9M bid=windows_perfmon~571~12345A67-BC8D-9E0F-G1H2-34I56J7K8L9M).
So this bucket was preventing the indexer from re-joining the indexer cluster.
So what I had to do was go into the data path (in my case, a custom data path) and find the bucket in question.
Based on the warning message I knew the index to look in and the bucket's local ID (the bid field reads index~localBucketId~guid, so the index here is windows_perfmon and the local ID is 571), which meant I could grep that index to find the bucket:
[splunk@hostname splunkdata]$ grep -r '_571 ' windows_perfmon/db/
This found the bucket I was looking for:
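A plain directory listing works just as well for this; here's a rough sketch using the sanitized bucket name from later in this post (standalone buckets follow the db_<newestTime>_<oldestTime>_<localId> naming convention):
[splunk@hostname splunkdata]$ ls windows_perfmon/db/ | grep '_571$'
db_1627425087_1627424688_571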
To fix this I tacked the GUID of the cluster master onto the end of the bucket name to force it to be sent to the cluster master to replicate/fix. To do this I had to find the GUID of the cluster master, which you can do by checking the file located at $SPLUNK_HOME/etc/instance.cfg on the cluster master:
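It looks something like this (values sanitized to match the rest of this post; the GUID lives under the [general] stanza):
[splunk@clustermaster ~]$ cat $SPLUNK_HOME/etc/instance.cfg
[general]
guid = 12345D67-CE8C-9F0A-A1D2-34E56B7A8E9F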
To add the GUID I simply used 'mv' to move the bucket to the new bucket name:
[splunk@hostname db]$ mv db_1627425087_1627424688_571 db_1627425087_1627424688_571_12345D67-CE8C-9F0A-A1D2-34E56B7A8E9F
And this is the result:
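Roughly, the listing afterwards looks like this (same sanitized names as above):
[splunk@hostname db]$ ls -d db_1627425087_1627424688_571*
db_1627425087_1627424688_571_12345D67-CE8C-9F0A-A1D2-34E56B7A8E9F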
After this was completed, I simply restarted Splunk on the indexer; it sent the bucket to the cluster master and re-joined the cluster.
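For anyone following along, that restart is just the standard one:
[splunk@hostname ~]$ $SPLUNK_HOME/bin/splunk restart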
I hope this helps in the event some of you run into this issue. It was definitely tricky to figure out, but once I did, fixing it was easy.
Have a great day everyone and talk to you soon!
Todd