Gosh it's hard to build a performance PC these days
-
I've come to appreciate having a tiny box on my desk, so these days I keep a NUC right next to me and RDP into my big rig, which generates a lot of heat, is loud, is big, and, most importantly, sits in another room, so it doesn't bother me at all. I have three monitors connected to the NUC, including one running at 4K, and it's got plenty of horsepower for that.
I switched over to a single 55" 4K QLED smart TV as my monitor; I hate multimon for many reasons. My PC is a relatively small cube-ish thing, a Thermaltake Level 20 VT: dimensions (H x W x D) 348 x 330 x 430 mm (13.7 x 13 x 16.9 in), or about two and a half pill bottles tall, for perspective (don't ask :laugh: ). My PC isn't very loud, even on air, unless I'm pushing the GPU, and currently I'm nowhere close to having heat issues, because the system is fairly modest compared to my target upgrade.
To err is human. Fortune favors the monsters.
-
Cooling, cooling, cooling. Things I've never had to do before:
- Add up the wattage of all my components.
- Downgrade from the processor I wanted due to wattage and heat.
- Carefully consider my airflow design, since liquid cooling won't work ideally in my case.
- Measure the height above my CPU to find a good-enough heat sink.
- Downgrade from my preferred video card so my machine doesn't actually catch fire.
- Consider the distance between my PCIe slots.

Gosh, they are really pushing the envelope in terms of the power and thermal properties of newer computer designs. You have to pick your case, fans, sinks, PSU, CPU, and everything super carefully. I hope I built in enough headroom, because I'd hate to find out one day that my little cube of computing power had turned into an expensive furnace full of slag.
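For what it's worth, the "add up the wattage" step can be sketched as a quick script. Every number here is an illustrative guess for a build like this, not a measured value, and the 50% headroom target is just one common rule of thumb:

```python
# Rough sketch of the "add up the wattage of all my components" exercise.
# All component draws below are illustrative guesses, not measured values.
components = {
    "CPU (peak turbo)": 181,
    "GPU (high-end card)": 450,
    "Motherboard + RAM": 60,
    "NVMe drives (x4)": 32,
    "Fans, pump, peripherals": 30,
}

total = sum(components.values())
headroom = 0.5  # aim to load the PSU well below its rating
recommended_psu = total * (1 + headroom)

print(f"Estimated peak draw:   {total} W")
print(f"Suggested PSU rating:  {recommended_psu:.0f} W or more")
```

With these guesses the peak draw lands around 750 W, which is why a high-end GPU pushes you into 1000 W+ PSU territory.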
To err is human. Fortune favors the monsters.
-
I'm after maximum single-core performance, and no 65W processor will net me that, much less an AMD, which is better at multithreaded than single-threaded work in terms of efficiency and performance. I'm on a Ryzen 7 4750G right now. I've considered getting a 4080 and underclocking it, because it's super efficient at 300W, but right now I have a 2080 Ti and it works great.
To err is human. Fortune favors the monsters.
You should read more reviews. Here's one: SPEC2017 Single-Threaded Results - Intel Core i9-13900K and i5-13600K Review: Raptor Lake Brings More Bite
Quote:
While we highlighted in our AMD Ryzen 9 7950X processor review, which at the time of publishing was the clear leader in single-core performance, it seems as though Intel's Raptor Lake is biting at the heels of the new Zen 4-core.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
-
Note: I have to go with the i9 numbers in that review because it doesn't cover my i5, but single-core perf is close to the same on both. After reading that review I'm still going with the Intel, as it outperforms the Ryzen on GCC benchmarks and some other single-core-heavy tasks; I'm guessing a lot of that has to do with having more on-die cache. For me this is about compile times with GCC, so anything that bests the competition there is my go-to. The other thing is that the 13th gen is new and Intel hasn't filled out the model line yet, so I may have some upgrade paths in the future. The Ryzen is a little older, and given where it sits in the AMD lineup it's probably near the top end for that generation of chip. I think. I know a lot more about Intel's habits in terms of chip development, their tick/tock cycles, etc.
To err is human. Fortune favors the monsters.
-
I currently have a 5950X at work; it writes to an NVMe drive and build times are lightning fast. I use VS2022 with about 100 or so projects and it is really quick. Have fun.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
-
I think unless you want to go full-on liquid cooling, you'll need to compromise. From what I understand, you don't need the 4090; just get a 30xx-series card (maybe a Ti one), get the best air-cooling solution you can (a double-tower cooler), and add RGB fans, because everything is cooler with RGB. If the usual ambient air temperature is relatively cool, that should be good enough (don't trust me here, I have no clue). But from what I can see, most people recommend an AIO for the i9-series CPUs.
CI/CD = Continuous Impediment/Continuous Despair
> I think unless you want to go full-on liquid cooling, you'll need to compromise.

These days you can just get a standard AIO system for the CPU (Corsair H150 and co.) and be done with it. Far less effort than going all-out, and not really an issue unless you want to start overclocking.
-= Reelix =-
-
I think I read a day or two ago that you want to stick with ITX. Why is that? Seems to me that limits your options.
Paul Sanders. If I had more time, I would have written a shorter letter - Blaise Pascal. Some of my best work is in the undo buffer.
-
I did briefly consider submerging the entire thing in a mineral oil bath and cooling it with a box fan. :laugh: Or liquid nitrogen, but that gets messy, and I have pets. It could end poorly.
To err is human. Fortune favors the monsters.
honey the codewitch wrote:
I did briefly consider submerging the entire thing in a mineral oil bath and cooling it with a box fan. :laugh:
The [Cray-2 approach](https://www.computerhistory.org/revolution/supercomputers/10/68)...
Java, Basic, who cares - it's all a bunch of tree-hugging hippy cr*p
-
No. I'm avoiding ITX. My chassis fits MicroATX.
To err is human. Fortune favors the monsters.
-
Has anyone ever made a Cray-2 simulator that runs on a modern PC? I'd be curious to know how the actual 1985 Cray-2 hardware would compare to an emulator on a top-range gaming PC of 2022!

I suspect the result might be like my alma mater's Cray-1: after a few years it was thrown out. Its processing speed was certainly high enough, but for the major tasks, weather forecasting and FEM, the volume of raw data to be processed was so immense (for the day) that the CPU sat idle, waiting for the input channels to fill up memory. The Cray-1 was replaced by a Cray-2; apparently the Cray-2's I/O capacity was a lot higher.

I wouldn't be surprised if a modern CPU/GPU could compete in processing power, but given similar weather-forecasting/FEM tasks, the bottleneck would be the I/O, just as with the Cray-1. Yes, there are datacenter-oriented versions of the top chips, with loads of PCIe lanes; they would probably come out better than the gaming-oriented chips.
-
Right... I was merely suggesting that you build a beefy system but put it in another room (so the heat/noise isn't an issue where you work), and then just RDP into it. I've been doing that for a few years now, and I wouldn't allow a loud PC back into my office.
-
Edit: My preference would be a 4090, but for the fire hazard and cooling issues. I like to build 10-year systems, and a 4090 is my best shot at that. The 2080 Ti I bought because I basically stole it at that price; I figured if I didn't use it I could resell it and easily more than double my money. I just got a 3070 Ti for free tonight too; didn't even have to pay for shipping. I do use the 2080 Ti to play Fallout 4, which I play to scratch my game-development itch (the game engine allows you to customize and otherwise modify it until it's an entirely different game). But primarily it's a development machine. The GPU is a luxury. I don't mind developing and gaming on the same machine because everything I do is in source control.
To err is human. Fortune favors the monsters.
honey the codewitch wrote:
Edit: My preferred would be a 4090 but for fire hazard and cooling issues. I like to build 10 year systems, and a 4090 is my best shot at that.
Even if you keep the core of the system for a decade, improvements in GPU perf/watt are high enough that you'd probably come out ahead at the ~5-year mark by buying the same level of performance at 80% lower power. Unless you want your gaming performance to decline from god-tier to potato over your system's lifetime, you'll need at least one, probably two, GPU swap-outs at a similar performance tier to the original card.
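To put rough numbers on that claim: assume a 450 W card kept for ten years versus swapping at year five for a hypothetical card with the same performance at 80% less power. The wattages, usage hours, and electricity price here are all illustrative assumptions:

```python
# Sketch of the perf/watt crossover argument. Power draw, daily usage,
# and electricity price are illustrative assumptions, not measurements.
old_card_watts = 450
new_card_watts = old_card_watts * 0.2   # "80% lower power" at year 5
hours_per_day = 4
price_per_kwh = 0.15                    # USD, assumed

def cost(watts, years):
    """Electricity cost of running a card `watts` W for `years` years."""
    kwh = watts / 1000 * hours_per_day * 365 * years
    return kwh * price_per_kwh

keep_ten_years = cost(old_card_watts, 10)
swap_at_five = cost(old_card_watts, 5) + cost(new_card_watts, 5)
savings = keep_ten_years - swap_at_five

print(f"Keep one card 10 years: ${keep_ten_years:.0f} in electricity")
print(f"Swap at year 5:         ${swap_at_five:.0f}")
print(f"Energy savings:         ${savings:.0f} toward the replacement card")
```

Under these assumptions the energy savings cover a meaningful chunk of a mid-tier replacement card, and heavier usage or pricier electricity tilts it further toward swapping.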
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason? Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful? --Zachris Topelius
-
I only really play Fallout 4, though I mod it to the point where it strains my 2080 Ti. I will continue to play Fallout 4 until Fallout 5 comes out; at this rate I will probably be dead or blind by the time Fallout 6 launches. :laugh: A 4090 should do it for me, I think. Though on reflection, a 3090 Ti is a safer bet for my system, and I could always swap it out later.
To err is human. Fortune favors the monsters.
-
This makes me feel better. I've been buying Precision laptops/workstations. They are gamer designs I use for development. I don't play any games, so I sacrifice the GPU a bit (as long as I can run 4 monitors, or one 55-inch 4K UHD, I am good).

That said, we've seen overheating issues. I could leave my old machine running for days; now I power off every night. I also have a piece of wood under the back of the machine to increase airflow. One of my devs has his sitting on a laptop cooler (a 5-fan design), as he doesn't have the A/C option and the summer months are a problem. I've noticed it's only getting worse.

FWIW, I agree with building machines for the long term. I shoot for 5 years. After 2 years, I usually buy a "cold spare" (a used version of the same computer, off-lease) and use it for testing/validating my backups (cloned drives). I can't fathom getting 10 years from a machine; hardware (specifically USB) seems to fail before that time. But the cost of a new machine is on par with the 80 hours it takes to move my licenses to it. In the last 2 builds I've slowly started using more VMs for various development environments; I am hoping to get this down to 40 hours. The part I truly hate is licensed software tied to the drive ID (QuickBooks): if I upgrade my hard drive, the software doesn't work until I jump through some hoops. I have about 4 of those; it just adds time to the process...

A final comment on heat: one of the devs who was NOT cooling his machine ran into NVMe issues where the drives were getting too hot and faulting. He assumed the drive was bad and went through the restore process (swapping back to the previously cloned drive, then restoring any changed files from backup). The next day he checked the drive he had replaced, and it was working fine! Scary.

I read somewhere that some building in Iceland/Greenland uses BTC miners to heat the building. Basically the BTC is a wash against the energy consumption, and the heat is now beneficial. Probably the most expensive heater ever built. (Not sure if it's a true story or just a plan, FWIW.) But I feel your heated pain!
-
I should have written that I shoot for 10 years. Realistically I end up replacing or upgrading components along the way, like adding more RAM or an NVMe (although in this case I'm starting with an NVMe system drive and RAID 1 NVMe x3 secondary storage, so I can't get much faster than that with current tech). I designed it such that my read speeds off secondary storage will saturate my PCIe 3.0 bus.
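The saturation claim checks out on paper. PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, which pins down the usable bandwidth; the per-drive read speed below is an assumed figure for a fast PCIe 3.0 NVMe drive:

```python
# Quick check of the "reads saturate PCIe 3.0" claim. The PCIe numbers
# come from the spec (8 GT/s, 128b/130b); the drive speed is assumed.
GT_PER_S = 8
ENCODING = 128 / 130                        # 128b/130b line code
lane_mb_s = GT_PER_S * ENCODING * 1000 / 8  # usable MB/s per lane

x4_link = 4 * lane_mb_s                     # one NVMe drive's x4 link
drive_read_mb_s = 3500                      # assumed sequential read speed

print(f"PCIe 3.0 x4 usable bandwidth: ~{x4_link:.0f} MB/s")
print(f"Assumed per-drive read speed:  {drive_read_mb_s} MB/s")
# A 3-way RAID 1 mirror can serve different reads from each copy, so
# aggregate read throughput can approach three drives' worth, provided
# each drive gets its own x4 link from the chipset or CPU.
```

In other words, a single fast drive already comes within ~10% of its x4 link, so three mirrored drives reading in parallel leave the per-link bandwidth, not the drives, as the ceiling.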
To err is human. Fortune favors the monsters.
-
This is why I built a Ryzen 5950X system. Best performance per watt. I am hoping power consumption will come down again before I need the next upgrade.
-
Always use the largest tower case you can find that will work. Not only will it give you a better-aerated machine, it also makes the machine much easier to work on when building or maintaining it...
Steve Naidamast Sr. Software Engineer Black Falcon Software, Inc. blackfalconsoftware@outlook.com
-
My son ordered a new high performance tower with liquid cooling for the CPU for game development, and I had to buy him a room air conditioner for it.
-
Nope. Towers mount the mobo vertically and thus suffer GPU sag. Also, my Thermaltake open-air chassis is quite good at cooling, and it could fit a 4090 if I wanted one. Towers waste space anyway; my little glass cube doubles as a shelf. =)
To err is human. Fortune favors the monsters.
-
I'm looking for the best single-core performance I can keep properly cooled, not the most efficient. My i5 slightly bests a 9xxx Ryzen 7 on GCC benchmarks.
To err is human. Fortune favors the monsters.