Posts

Nvidia isn't as greedy as you think

No, they just misread the room

Nvidia this past week announced the 4080 12GB (hereafter the 4080-12) for an eye-watering $899 and the 4080 16GB (4080-16) for $1,199. But wait: the 2080 was $699 and the 3080 was $699, so why are we suddenly jumping to $899 and $1,199 for the 4080, a 29% and 71% price increase respectively? This isn't even the 4080 Ti! The instinctive conclusion is that Nvidia is being greedy. For some time before the announcement, I speculated that we could see prices go up, given that Nvidia has seen what people are willing to spend when they have no other choice, since gamers kept buying GPUs during the mining rush. I pointed to how the 20 series (Turing) raised prices significantly even after the 2018 mining spike was over. I thought this would be another case of the same. Upon further reflection, I don't think it is. I think it's possible that Nvidia simply could not get away with charging much less if they want to maintain reasonable pr...
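The price jumps quoted above are easy to verify; here's a quick sketch using the MSRPs from the post (the post's figures come out of the same arithmetic, give or take rounding):

```python
# MSRPs quoted in the post (USD)
prices = {"2080": 699, "3080": 699, "4080-12": 899, "4080-16": 1199}

def pct_increase(old, new):
    """Percentage increase going from the old price to the new one."""
    return (new - old) / old * 100

for card in ("4080-12", "4080-16"):
    jump = pct_increase(prices["3080"], prices[card])
    print(f"{card}: {jump:.1f}% over the $699 3080")
```

Running this gives roughly 28.6% and 71.5%, which the post rounds to 29% and 71%.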

What to expect with a PS5 Pro... if they even make one.

Wild Speculation

I've seen a lot of speculation about what specs the PS5 Pro will have. Most of that speculation comes from people asking themselves "What specs would I like to see in a much more powerful console?" and, for the most part, describing what would amount to a new generation. I will try to bring the speculation back to reality by instead asking: what makes sense given everything we know about PlayStation development, what reasons have there been for mid-generation spec improvements in the past, and what might motivate a mid-generation upgrade this generation? Before we get into what's possible for the PS5 Pro, we need to establish some things.

Development differences between PS5 and PC

This point will be a recurring theme, but it's very important to understand early. A PlayStation is *not* a PC, and the API it uses to send commands from game engines to the hardware is *not* similar to DX12 or Vulkan, which are designe...

Intel is back! An analysis of the next few years

Image
E-Cores are love. E-Cores are life.

As many of us know, Intel was the first to launch a heterogeneous x86 architecture. What this means is that there's more than one type of core. Where previously every core in a processor was designed identically, Alder Lake has two types of cores (similar to the big.LITTLE design found in ARM-based mobile processors). The purpose of this is to maximize multithreaded (MT) performance for a given die area. Intel's large cores (P-Cores) are extremely fast. However, they're also BIG. They take up a lot of die area, and adding more of them makes the chip bigger and more expensive to manufacture. The solution is to add some small cores (E-Cores) that aren't as fast, but are much faster relative to the space they take up.

In the above image, we can see the big dark-blue cores. Those are the P-cores, and a cluster of four of the small light-blue E-cores fits in about the same die area as a single P-core. In reality, a cluster of 4 E-cores is slightly wider than a P-core ...
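The perf-per-area trade-off is easy to see with some arithmetic. The numbers below are purely illustrative placeholders, not real Intel figures: assume a P-core gives 1.0x throughput in 1.0 units of area, and an E-core gives 0.6x in 0.3 units.

```python
# Illustrative (made-up) numbers, NOT measured Intel data:
# a P-core: 1.0x throughput in 1.0 units of die area
# an E-core: 0.6x throughput in 0.3 units of die area
def perf_per_area(perf: float, area: float) -> float:
    """Throughput delivered per unit of die area."""
    return perf / area

p_core_density = perf_per_area(1.0, 1.0)   # 1.0 perf per unit area
e_core_density = perf_per_area(0.6, 0.3)   # 2.0 perf per unit area

# A cluster of 4 E-cores in roughly one P-core's worth of area:
cluster_mt_perf = 4 * 0.6   # ~2.4x the MT throughput of one P-core

print(p_core_density, e_core_density, cluster_mt_perf)
```

Under these assumed numbers, each E-core is twice as area-efficient, so swapping one P-core's area for an E-core cluster more than doubles multithreaded throughput, which is exactly the trade Alder Lake makes.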

Predicting 1440p Performance of RDNA2

RDNA2 may have a 1440p edge.

First of all, I need to stress that there's a lot of speculation here. There's also a lot of math, but I will be clear about which data I'm using and which assumptions I'm making. To start things off, I'm going to outline a few caveats.

1) I'm using Hardware Unboxed benchmarking data from their 3090 review: https://youtu.be/PTs1gHqvcjs Go watch it. It's a great review, and be sure to subscribe to their channel. I use their data because they present it in an easy-to-consume manner, and they have among the best testing methodologies in the business. However, I also encourage you to look at other benchmarks to round out the picture and get other opinions. I'm also using their screenshots without permission, but I suspect they'd be fine with the way I'm using them. If they ask me to remove them, I will.

2) ...

Announcing the Nvidia 2180...?

Or will it be the 3080?

Okay, full disclosure: I could end up looking like a complete idiot here. I have no sources, and this comes from nothing other than my own observations. But Nvidia and the entirety of the tech press are blowing my mind.

Points for the 30 series

The entirety of the tech press, from what I can see, has accepted the conclusion that Ampere will be the 30 series of GeForce GPUs. This makes a lot of sense: there was the 10 series, then there was the 20 series, so now there's going to be the 30 series, right? It also sounds better. "Thirty eighty" and "thirty ninety" sound a lot better than "twenty-one eighty" and "twenty-one ninety". There's also the Micron document that showed it being called the 30 series. That's an official document and couldn't be wrong, right? There are also cooler prototypes with the 30 series name written on them.

Img Src: Videocardz: https://cdn.videocardz.com/1/2020/06/NVID...