

The first Android phone had the nipple, so it must be the layout or something.
Every picture I’ve seen has the failure on an outside pin. So my theory is it’s the cable getting tugged for cable management, and even though it’s clipped in, it’s not making as good of contact.
That, or just a bad cable design. I’ve bought a few cables from CableMod and I’m not happy with the wiring they used. Their website says “Crafted with 16AWG wiring,” but they also brag about the flexibility of their cables, so I assume they’re using stranded wire instead of a solid core, which costs you a decent chunk of ampacity (and heat sinking).
High-end GPUs are always pushed just past their peak efficiency. If you slightly underclock and undervolt them you can see some incredible performance per watt.
I have a 4090 that’s undervolted as low as it will go (0.875V on the core, more or less stock speeds) and it only draws about 250 watts while still providing like 80%+ of stock performance. I also had an undervolt at about 0.9 or 0.925V on the core with a slight overclock, and that got stock performance at about 300 watts. Heavy RT will make consumption spike closer to the 450-watt TDP, but that just puts me back at the same performance as without the undervolt, because the card was already downclocking to those speeds anyway. About 70 of that 250 watts is my VRAM, so it could scale a bit better if I found the right sweet spot.
My GTX 1080 before that was undervolted but left at maybe 5% below stock clocks, and it went from 180W to 120W or less.
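If you want to check those numbers on your own card, here’s a minimal sketch that just logs power draw and clocks while a benchmark runs, so you can compare average watts stock vs. undervolted. It assumes an NVIDIA GPU and the nvidia-ml-py (pynvml) bindings, neither of which is from the original comment; the undervolt itself is still set with whatever tool you normally use.

```python
# Minimal sketch: sample GPU power draw and clocks once a second so you can
# compare average watts before/after an undervolt. Assumes nvidia-ml-py
# (pip install nvidia-ml-py) and a single NVIDIA GPU at index 0.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

samples = []
print("Logging... start your benchmark, Ctrl+C to stop.")
try:
    while True:
        watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0  # NVML reports milliwatts
        core = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        mem = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_MEM)
        samples.append(watts)
        print(f"{watts:6.1f} W   core {core} MHz   mem {mem} MHz")
        time.sleep(1)
except KeyboardInterrupt:
    if samples:
        print(f"Average draw: {sum(samples) / len(samples):.1f} W over {len(samples)} samples")
finally:
    pynvml.nvmlShutdown()
```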
The 8800 Ultra was 170 watts in ’07.
The GTX 280 was 230 watts in ’08.
The GTX 480/580 were 250 watts in 2010. But then we got the dual-GPU GTX 590, which more or less doubled that.
The 680 was a drop, but then they added the Tis/Titans and that brought us back up to high-TDP flagships.
These cards have always been high for their time, but that quickly became normalized. Remember when 95-watt CPUs were really high? Yeah, that’s a joke compared to modern CPUs. My laptop’s CPU draws 95 watts.
There are decades where nothing happens; and there are weeks where decades happen.
This is one.
What are you trying to run? A VPS is pennies, and a physical server isn’t much more. We have a bunch of servers that are $40 a month each and they come with 5 usable IPs, 32 gigs of RAM, a 1TB SSD, etc. The cost of getting a static IP at home will be almost as much as a server. If you want less, you can get less for a lot less money.
I’ve self-hosted my own personal website for years now and it’s not really an issue outside of the power going out and my IP changing. I just update DNS and move on. But if this is for actual work? Just pay the $10 a month; not having to worry about it is worth that money.
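For what it’s worth, the “IP changed, update DNS, move on” step is easy to script. A minimal sketch, assuming Cloudflare is the DNS provider and using placeholder token/IDs (the original comment doesn’t name a provider):

```python
# Minimal dynamic-DNS-style sketch: look up the current public IP and point an
# A record at it. Assumes Cloudflare hosts the zone; the token, zone ID, record
# ID, and hostname below are placeholders, not values from the original post.
import requests

CF_TOKEN = "your-api-token"
ZONE_ID = "your-zone-id"
RECORD_ID = "your-a-record-id"
HOSTNAME = "www.example.com"

# What the outside world currently sees as this connection's public IP.
current_ip = requests.get("https://api.ipify.org", timeout=10).text.strip()

# Overwrite the A record so the hostname follows the new IP.
resp = requests.put(
    f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/dns_records/{RECORD_ID}",
    headers={"Authorization": f"Bearer {CF_TOKEN}"},
    json={"type": "A", "name": HOSTNAME, "content": current_ip, "ttl": 300, "proxied": False},
    timeout=10,
)
resp.raise_for_status()
print(f"{HOSTNAME} -> {current_ip}")
```

Run that from cron every few minutes and the only downtime after a power outage is roughly the record’s TTL.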
The law requires things to have a backdoor (or no encryption, I guess). So it’s either sell in that market, or don’t and keep the security.