I understand that Nomad needs to be started/restarted as the 'root' user because many of the operations it runs require root privileges.
However, as a system admin I would like to harden/limit the scope of 'sudo' access to specific tasks only. As far as I have analyzed, the following will need sudo/become/root privileges:
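As an illustration of the kind of hardening I have in mind, here is a minimal sketch of a sudoers drop-in, assuming Nomad is managed as a systemd unit and that operators belong to a hypothetical 'ops' group (both the file name and the group are placeholders, not anything Nomad itself requires):

```
# /etc/sudoers.d/nomad-operators  -- hypothetical drop-in file
# Allow members of the 'ops' group to manage only the nomad
# systemd unit, instead of granting unrestricted sudo.
%ops ALL=(root) NOPASSWD: /usr/bin/systemctl start nomad, \
                          /usr/bin/systemctl stop nomad, \
                          /usr/bin/systemctl restart nomad
```

This only narrows what an operator can invoke via sudo; the Nomad agent itself would still run as root under systemd, so it covers the start/restart case but not the privileges Nomad uses internally.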
Sherpa is a highly available, fast, and flexible horizontal job scaler for HashiCorp Nomad. It is capable of running in a number of different modes to suit different requirements, and can scale based on Nomad resource metrics or external sources.
Hashicorp Homelab is a collection of Nomad recipes for several open source projects that I run on my own Nomad + Consul + Vault + Intel NUC cluster.
I'm trying to follow the readme, but I'm running into lots of issues understanding it. If I make it through, I'll try to make a PR with some clarity changes, but I wanted to note some issues that I was wondering about upfront.