Hey y’all.

I’ve been running an eBay-special server with a pair of Xeon E5-2687W v2s and 64GB of cheap RAM. It’s been working pretty great (other than a little instability that randomly went away after a while, but which I expect to come back at some point). I’ve had this setup for a couple of years now, and have been eyeballing a couple of different upgrade paths.

The reason I want to switch is that the eBay-special motherboard does not have enough PCIe slots for what I’m planning in the near future.

The reason I want to upgrade is that I like having nice server hardware.

So I have been looking at alternatives. Is anyone else amazed at how cheap the CPUs are, but how expensive a Socket SP3 motherboard is? The cheapest SP3 board I’ve found is more than $500, used. The CPUs that would outperform my setup in every metric cost less than $200.

This had me looking at new/used gaming hardware instead, which is cheaper, but misses out on ECC and also has a more limited number of PCIe slots (which is the primary motivation for changing my setup).

I’m trying to find a 16-core/32-thread setup with the best single-core performance available in that class. My current setup is 2× 8 cores/16 threads, and I’d like to avoid ‘downgrading’ to fewer cores. But the things I run usually do better with more single-thread performance than multi-thread performance.

I found the EPYC 7371 and the Ryzen 5950X, which meet my core requirements, but both have downsides I’m trying to mitigate (mobo price and ECC RAM, respectively).

Do y’all have any alternative options I might’ve missed?

  • Egonallanon@feddit.uk · 11 days ago

    Boards will often end up costing more than CPUs, as they fail more often, so supply for them dries up quicker.

    As for your use case, a little more detail on what you’re doing could help. Though if I were you, I’d go for the used gaming hardware if it suits your needs; the ECC and management stuff you get with server boards, while nice, isn’t really all that necessary for homelab work.

    Another avenue to look at: if your workload needs lots of individual add-in cards that don’t need much bandwidth, old mining mobos can be a decent source, though they all use consumer CPU sockets from what I’ve seen.

    • DaGeek247@fedia.io (OP) · 11 days ago

      Boards will often end up costing more than CPUs, as they fail more often, so supply for them dries up quicker.

      I was hoping you wouldn’t say that. I’ve seen this mentioned in a couple of other places too.

      I use it as my main home server: Jellyfin (with transcoding), 96TB of storage (64TB usable), the occasional game server (Minecraft, Valheim; both do better with better single-core performance), Navidrome (with conversion), and whatever other project catches my eye. Lately it’s been playing with the static site generator Hugo, which takes no resources, but prior to that it was VHS-to-digital conversions, which took all the resources.

      I’ve got a GTX 1650 and a SAS card (for extra hard drive connections) right now. I’d like to add an M.2 expansion card and a 10Gbps SFP+ card, but my current board only has two PCIe slots.

      Given what I’m after, more slots without proper bandwidth, like on a mining board, would likely result in issues I’d only notice in benchmarks, but they’d bug me regardless. If I’m spending half a grand on new parts, they might as well fully support the stuff I’m spending half a grand for.

  • MangoPenguin@lemmy.blahaj.zone · 11 days ago

    Is there a reason you’re attached to 16 cores? A modern mid-to-high-range CPU will have about 2x the single-thread performance, so even with a downgrade in core count it would still be substantially faster overall.

    For example, an i5-14600K has 14 cores with about 3x the total performance and 2x the single-thread performance, and it’s a $160 CPU. It also has an iGPU with really good transcoding ability that can run OpenVINO for stuff like object detection in Frigate or ML image search in Immich, so no external GPU is needed either.

    A setup with a 14600K would also idle at probably 20W (with no PCIe cards or HDDs), so a huge power saving over any enterprise server gear.

    Plus, even cheaper motherboards will often have 2+ NVMe slots, so you probably won’t need an add-on card for those, on top of no extra GPU, leaving you with only the 10GbE and SAS cards.

    Personally, since you’ll be transcoding and stuff, I would not go with an AMD CPU.

    • DaGeek247@fedia.io (OP) · 10 days ago

      The reason I want to upgrade is that I like having nice server hardware.

      Pride, mostly. Bigger numbers make me feel good, and I refuse to apologise for it. I may end up waiting another year if the requirements aren’t reasonable right now. This is a hobby without any real deadline, so ‘just waiting a bit’ is a decision I’m entirely willing to make to ensure I get what I’m after.

      Notably, the M.2 expansion card isn’t for operating system storage; it’s for a scratch disk for Jellyfin transcodes and in-progress torrent downloads. This is separate from my OS drive, which is also M.2, but usually plugged into the motherboard.

      Also, the EPYC 7371, which has 16 cores/32 threads and is better in every measurable way than my current setup, is about $80 on eBay (without a motherboard). It’s what started me on this topic in the first place. An Intel processor that also does that, but with fewer cores, is not as impressive in comparison.

      • MangoPenguin@lemmy.blahaj.zone · 10 days ago

        Fair enough haha

        You could combine the scratch disk and OS drives into a pair of mirrored M.2 drives. That’s what I do on my setup!

        The downside I see to the EPYC 7371 is poor single-thread performance, only about 15% better than the E5-2687W v2. I have the E5-2667 v2 in my server and it really struggles at running servers for Minecraft, Enshrouded, Arma 3, etc.

        Also worth a look is the i7-14700K; it has 20 cores and 28 threads and absolutely blows the EPYC and Xeon out of the water.

        https://www.cpubenchmark.net/compare/5719vs5720vs3387/Intel-i7-14700K-vs-Intel-i5-14600K-vs-AMD-EPYC-7371

        Personally, I’m probably going to do a build with the i3-14100 to replace my Xeon v2; I want less power usage and more single-thread performance.

        • DaGeek247@fedia.io (OP) · 10 days ago

          Thank you for the info. The Intel processors really do look like the better long-term choice for just about everything, including price.

          My only real issue is that, at best, I’ve found that all the Intel motherboards support one PCIe 4.0 x16 slot, one 3.0 x4 slot, and then a couple of 3.0 x1 slots. In short, one 32Gbps slot, one 4Gbps slot, and then three 1Gbps slots. It just isn’t enough for my use; bare minimum, I need 10Gbps for networking, and then at least a 3.0 x16 slot for the video card.

          Well, I say need, but really this is a pride issue, making sure that things aren’t limited past their requirements.

          I could maybe switch from the video card to the Intel iGPU; I’m not sure how it handles multiple 4K streams, but even if it did work, I’d still be missing out on the scratch disk and SAS card running at full speed. I would also be severely limited on any other card upgrades I could make.

          The EPYC CPUs have shittier power efficiency and single-core speeds. But dammit, three PCIe 3.0 x16 slots plus an additional three PCIe 3.0 x8 slots is just so tempting lmao.

          I was hoping for a slightly smaller compromise than I’m seeing between server-tier PCIe lane capability and consumer-tier processor speeds and efficiency. I even looked into the Threadripper series, and they’re more expensive with fewer features than the EPYC series. More performant, but still severely limited in expandability.

          • MangoPenguin@lemmy.blahaj.zone · 10 days ago

            The GTX 1650 I think is limited to 2 or 4 transcodes by Nvidia. As far as I’ve heard, the Intel iGPUs since 7th gen can handle up to 20 1080p transcodes, or around five 4K transcodes.

            I think you’re off by 8 on your PCIe bandwidth due to conversion between bytes and bits. PCIe 3.0 x1 can do 1GB/s, which is 8Gbps per direction, or 16Gbps counting both directions. PCIe 3.0 x4 is 32Gbps/64Gbps.

            So the 10GbE card would be fine in the x4 slot (and pretty close to full speed even in x1), and the SAS card in the x1 slot would get you 5 HDDs at a full speed of 200MB/s each. Not sure how many HDDs you need to hook up, though.
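
            If it helps to sanity-check the bytes-vs-bits conversion, here’s a rough back-of-the-envelope sketch in Python (approximate per-lane figures after encoding overhead; real-world numbers vary a bit with protocol overhead):

            ```python
            # Rough usable PCIe throughput per lane, per direction, in GB/s
            # (after 128b/130b encoding overhead).
            PER_LANE_GB_S = {3: 0.985, 4: 1.969}  # PCIe 3.0 ~1 GB/s/lane, 4.0 ~2 GB/s/lane

            def slot_gbit_s(gen: int, lanes: int) -> float:
                """Usable slot bandwidth in Gb/s, one direction (GB/s x 8)."""
                return PER_LANE_GB_S[gen] * lanes * 8

            print(f"3.0 x1:  {slot_gbit_s(3, 1):.1f} Gb/s")   # ~7.9 Gb/s, just shy of 10GbE line rate
            print(f"3.0 x4:  {slot_gbit_s(3, 4):.1f} Gb/s")   # ~31.5 Gb/s, plenty for a 10GbE NIC
            print(f"4.0 x16: {slot_gbit_s(4, 16):.1f} Gb/s")  # ~252 Gb/s, the GPU slot

            # How many ~200 MB/s HDDs can a SAS card in a 3.0 x1 slot feed at full speed?
            print(PER_LANE_GB_S[3] * 1000 / 200)              # ~4.9, i.e. roughly the 5 drives above
            ```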

            It also depends on the chipset; it looks like B760 maybe supports PCIe 4.0 on all the slots, so double the bandwidth for the x4 and x1 slots vs PCIe 3.0. The higher end Z790 chipsets have even more PCIe 4.0 lanes: https://www.gizmowind.com/b760-vs-z790/

            Plus, keep in mind the other reason the PCIe slots might seem limited: the onboard M.2 slots are using PCIe lanes as well, versus using an add-on card in a PCIe x16 slot. Some of the Z790 boards have 4x onboard M.2 NVMe slots.

            • DaGeek247@fedia.io (OP) · 10 days ago

              The GTX 1650 I think is limited to 2 or 4 transcodes by Nvidia

              Nvidia officially removed that limitation, and prior to that there was a workaround. I’ve been running it unlocked for a couple of years now, and it handled four 4K streams just fine, but crashed at 5 or so. Plenty for who I’m sharing with, considering 4K is an only-sometimes sort of deal.

              I think you’re off by 8 on your PCIe bandwidth due to conversion between bytes and bits

              Lmao whoops. Looks like I am. This actually puts a couple of options back on the table.

              The higher end Z790 chipsets have even more PCIe 4.0 lanes

              Oh my god this is perfect. Like, exactly what I was looking for. And it’s cheaper than an epyc motherboard. Thank you so much. Definitely putting this on the list for sure.

              • MangoPenguin@lemmy.blahaj.zone · 10 days ago

                Glad I could help!

                Double check the specific motherboard specs too; even though Z790 has more lanes, some boards may still only have one x16 slot.

  • dbtng@eviltoast.org · 11 days ago

    Have you looked into an actual used server?
    I own two Dell rack servers, each of which I got for free. The big one cost me about $500 to upgrade to 80 vCPUs and 180GB of RAM, and it has more PCIe slots than I need. I don’t even run it because my smaller server does everything I want.

    I guess it depends on exactly what you’re looking for, but at least do a scan of eBay or Amazon refurbs. Look for used Dell or HP rack servers. The parts to upgrade older machines are cheap, and an enterprise server has the capacity to deliver a hell of a lot of performance. Do some searching and see what you find.