It’s not the worst idea ever, if they just want to dip their toes into the water and see what you can do with a setup like this. The catch, however, is that a local LLM will always be pretty limited in its capabilities. Also, I suspect that most of those “automations” could be done with a few lines of Python. Now, of course you can get your LLM to write the code for you, but if you don’t understand the output it produces, you’ll have a recipe for disaster.
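For illustration, here’s the kind of “few lines of Python” I mean. This is a hypothetical example (the task and function name are my own, not anything from their setup): a typical “sort my downloads” automation that moves each file into a subfolder named after its extension.

```python
import shutil
from pathlib import Path

def sort_by_extension(folder):
    """Move every file in `folder` into a subfolder named after its extension."""
    folder = Path(folder)
    for f in list(folder.iterdir()):
        if f.is_file():
            # Files without an extension go into a "no_ext" subfolder
            dest = folder / (f.suffix.lstrip(".").lower() or "no_ext")
            dest.mkdir(exist_ok=True)
            shutil.move(str(f), str(dest / f.name))
```

No model, no GPU, no prompt engineering; and because it’s a dozen readable lines, you can actually verify what it does before running it, which is exactly the point.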