Hi everyone! I’ve got a question about some guys at work who are pitching that we need a server. This is mostly a sanity check, but I’m pretty sure they’re full of shit.

The law firm I work for uses a CRM that doesn’t have an open API. This is important. You can access a daily-updated copy of your data from an S3 bucket, but for that reason it’s essentially read-only.

They said they want to set up a server for the firm for “automation” using n8n. We can’t use n8n because it doesn’t have a native integration with our CRM. They also set up Dropbox for the firm (prior to my start date), even though the firm already pays for OneDrive with the Microsoft suite…

Then they said they want local AI for automation, so they’ll “just put a 5070 Ti in there” as an AI agent for the entire 15-20 person firm. They never specified what said AI would actually do. I also think it’s completely not viable.

Then they said all 3 locations can just use Tailscale to access the server simultaneously. All of these people, minus me and one other, are completely non-technical. The kind of non-technical where I help them restart Excel once a week.

I cannot possibly think of a viable use case for what they’re describing. Am I cynical, or are they just looking to make some cash off a project they don’t know anything about?

  • theunknownmuncher@lemmy.world · 9 hours ago

    > 5070 Ti

    > for the entire 15-20 person firm

    Local AI is a great option to look into, but I can’t imagine that’s going to go well… 16 GB of VRAM is going to limit you to very small models and a small context size. I imagine for a law firm you’re going to want the AI to be reading lots of documents, so lots of context. Maybe it will be fine depending on how the 15-20 people access it, but I’m doubtful.
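    To put the VRAM worry in rough numbers, here’s a back-of-envelope sketch. All the architecture figures are assumptions (roughly a 14B-class model with grouped-query attention, 4-bit weights, fp16 KV cache); the real numbers depend entirely on which model and quantization they pick:

    ```python
    # Back-of-envelope VRAM estimate for a 14B model on a 16 GB card.
    # Architecture numbers below are assumptions, not any specific model's specs.

    GB = 1024**3

    params = 14e9            # assumed parameter count
    bytes_per_weight = 0.5   # 4-bit quantization
    weights_gb = params * bytes_per_weight / GB

    # Assumed: 48 layers, 8 KV heads (GQA), head dim 128, fp16 cache.
    layers, kv_heads, head_dim = 48, 8, 128
    kv_bytes_per_token = 2 * layers * kv_heads * head_dim * 2  # K and V, 2 bytes each

    context_tokens = 32_000  # ballpark for one long legal document
    kv_cache_gb = kv_bytes_per_token * context_tokens / GB

    print(f"weights ~{weights_gb:.1f} GB, 32k-token KV cache ~{kv_cache_gb:.1f} GB")
    ```

    Under those assumptions you’re already around 12 GB for a *single* user holding one long document in context, before any runtime overhead, which leaves essentially no headroom on a 16 GB card for concurrent users.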

    Is this machine just a proof of concept to start putting together a process and testing the waters? I wouldn’t call total bullshit immediately, but I’d expect you’ll eventually find that you need much more VRAM and probably a heavier development lift to integrate with n8n.

    • appauled@sh.itjust.works (OP) · 6 hours ago

      That’s what I was thinking. They were saying that a 5070 Ti would be good enough for “whatever AI automations the firm wants to build,” which is where I was calling BS.

      • theunknownmuncher@lemmy.world · 5 hours ago

        Then yeah, that’s for sure BS. For getting started and testing a PoC, it is “reasonable” depending on a lot of factors, but there’s almost no way that a 5070 Ti will be a permanent production solution.