Danieru 2 days ago

No one appears to have mentioned the important meta game going on: Intel bidding as a credible alternative supplier.

For Intel, by bidding they get to undercut AMD's profits.

For Sony, they get a credible alternative they can pretend would be a viable choice, thus forcing a slightly better deal out of AMD.

We saw similar articles related to the Switch 2. That time it was AMD acting as spoiler to Nvidia. Nvidia reportedly got the contract. That time too we got news articles lamenting this loss for AMD.

As a gamedev I have a different perspective: Sony and Nintendo would be fools to give up backwards compatibility just for savings on chips.

Switching vendors does not just break compatibility with old games, it also requires retooling of their internal libraries. Console games, outside small or open-source engines, use proprietary graphics APIs. Those APIs are tied to the hardware. With this coming generation from Nintendo, and the "current gen" from Sony and Xbox, they've been able to reuse much of their software investment. I'd say more but this is obviously NDA; other devs should be able to confirm.

Thus I don't think AMD for the Switch 2 or Intel for the PS6 was ever a credible path. Their bids existed to keep the incumbent vendor from getting overly greedy and ruining the parade for everyone. This matters: famously, the original Xbox got hamstrung in the market by Nvidia's greed and refusal to lower prices as costs went down.

  • senkora 2 days ago

    +1. An important non-obvious detail for AMD is that they (at least in the past, I assume for this as well) have kept the instruction timings similar from generation to generation of consoles.

    Different x86 micro-architectures benefit from writing the machine code in slightly different ways. Games are highly optimized to the specific micro-architecture of the console, so keeping that stable helps game developers optimize for the console. If you suddenly changed the micro-architecture (if switching to Intel), then old games could suddenly become janky and slow even though both systems are x86.

    (This would only matter if you were pushing performance to the edge, which is why it rarely matters for general software development, but console game dev pushes to the edge)

    So it isn't just the graphics APIs that would change going from AMD to Intel, but the CPU performance as well.
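
    A toy illustration of what "optimized to the specific micro-architecture" can mean in practice (hypothetical numbers, plain C++, not from any console SDK): the best number of independent accumulators in a hot reduction loop depends on the core's floating-point latency and pipe count, so code hand-tuned for one core can leave a different core half idle.

        #include <cstddef>

        // Tuned assuming a core with ~3-cycle FP add latency and one FP pipe:
        // three independent dependency chains are enough to hide the latency.
        float sum3(const float* a, std::size_t n) {
            float s0 = 0.f, s1 = 0.f, s2 = 0.f;
            std::size_t i = 0;
            for (; i + 3 <= n; i += 3) { s0 += a[i]; s1 += a[i + 1]; s2 += a[i + 2]; }
            for (; i < n; ++i) s0 += a[i];
            return s0 + s1 + s2;
        }

        // Tuned assuming ~5-cycle latency and two FP pipes instead: you'd want
        // roughly ten chains to keep both pipes busy, so sum3() stalls that core,
        // while a ten-chain version wastes registers on the narrower one.

    Neither version is wrong; each is just shaped by timings that stop being true once the micro-architecture changes.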

    • deaddodo 2 days ago

      > Different x86 micro-architectures benefit from writing the machine code in slightly different ways. Games are highly optimized to the specific micro-architecture of the console, so keeping that stable helps game developers optimize for the console.

      While that can be true, very few gamedev companies these days optimize to that degree. They almost all use off-the-shelf middleware and game engines that are built to support all of the platforms. The companies that do go through that effort tend to have very notable releases.

      Nobody is hand-tuning assembly code these days to fit into tight instruction windows; at least, not outside of some very specific logic fragments. Instead they're all writing generic interrupt-based logic, which is fine, as that's what the newer CPUs expect and optimize for internally.

      In addition, the gap between Zen generations is as large as switching to Intel. We're talking fairly different cache coherency, memory hierarchies, CCX methodologies, micro-op and instruction timings, iGPU configurations, etc.

      That all being said, AMD was going to beat Intel regardless, because of established business relationships and because Intel's current internal struggles (both business and R&D) make it fairly difficult for them to provide an equivalent alternative.

      • soganess 2 days ago

        Asking this as an open-ended (if leading) question: I assume enough people are doing it, otherwise the PS5 Pro makes no sense... Right?

        They (AMD/Sony) shoehorned the RDNA 3/3.5 GPU architecture onto an older Zen 2 core, from a different process node, because... they felt like making a frankenAPU? Especially since these APUs are usually monolithic (vs. chiplet) in design and share a memory controller. Surely it would have been easier/cheaper to put in 8 Zen 4c/5c cores and call it a day.

        I'm pretty sure I'm just missing something obvious...

        • wmf 2 days ago

          For PlayStation APUs, it's likely that AMD presents a menu of options and Sony chooses which components they want. For PS5 Pro, the CPU is unchanged from PS5 because Sony doesn't feel the need for anything faster. A newer CPU would take more area. But Sony really wanted better raytracing and AI so they chose RDNA 3.9 or whatever for the GPU. I suspect the cores are all mostly synthesized so they can support any process and Infinity Fabric is compatible enough that you can mix and match new and old cores.

        • deaddodo 20 hours ago

          > They (AMD/Sony) shoehorned the RDNA 3/3.5 GPU architecture onto an older Zen 2 core

          The original core was already a custom configuration. I don't see why it seems odd that the new version would be a custom configuration based on the previous one.

          > with a different process node

          This doesn't apply to the PS5 SoC, but is general to AMD's methodology.

          AMD has been using an off-chip interposer setup for multiple generations now. They did this specifically to allow for different process nodes for different chips.

          It's cheaper (and there are more fab options) to produce chips at a lower process node. If there's no reason to update the CPU, it would make sense to keep it on the cheaper option.

          In regard to the PS5 and Xbox SoCs specifically:

          The entirety of the SoC is fabbed at the same process node. A core designed for a 14nm process node and then fabbed at 7nm (assuming drastic changes weren't needed to make it function at the lower node) is going to be much smaller and run cooler on that node size. This is cheaper and leaves more space in the total footprint for the GPU-specific and auxiliary logic cores. Same rule applies above, why use more if it's not needed.

          > they felt like making a frankenAPU

          All of the game console chips are "frankenAPUs".

          > Especially since the APUs are usually monolithic (vs chiplet) in design and share a memory controller.

          "Monolithic" vs "chiplet" is an arbitrary distinction, in this case. The individual logic cores are still independent and joined together with interposers and glue logic. This is clear from the die shots:

          https://videocardz.com/newz/sony-playstation-5-soc-die-pictu...

          To return to the previous point, look at the space dedicated to the CCXs. Zen 2 has ~1.9 billion transistors, Zen 3 ~4.1 billion, Zen 4 ~6.6 billion, etc. Using a newer core would double or triple that space, increasing the total die size, making each chip more expensive, and raising the defect rate.

          > Surely it would have been easier/cheaper to put in 8 zen 4c/5c cores and call it a day.

          Definitely not.

          > I'm pretty sure I'm just missing something obvious...

          Nothing about chip design is obvious.

        • pjmlp a day ago

          PS5 Pro makes no sense, yes.

          Most studios aren't even able to push current PS 5 to its limit, given current development schedules and budgets.

          PS 5 Pro is for the same target audience as PS 4 Pro: hardcore console fans that will buy whatever the console vendor puts out. And Sony needs to improve their margins.

      • MichaelZuo 2 days ago

        How would you explain cross PS5/PC releases being much more efficient on the PS5?

        e.g. Horizon Forbidden West needing a much better GPU on PC to run at the same level of fidelity as the PS5.

        If not for special tuning specific to the PS5’s differences.

        (I can imagine Windows bloat and other junk requiring an additional 10% to 20%, but not 30% to 50%.)

        • jitl 2 days ago

          The comment above is elaborating on x86 micro-architecture, the differences between how the CPU handles x86 instructions specifically.

          The overall system architecture is different between PC, which has discrete memory systems for the CPU and GPU, and a very long pathway between GPU memory and system/CPU memory, versus today's consoles which have unified memory for CPU+GPU, and optimized pathways for loading from persistent storage too.

          Consoles use their own graphics APIs, but you would have whatever vendor you contract with for graphics support your native graphics API, and everything would be "fine". PS5 games use GNM/GNMX, PlayStation's proprietary graphics APIs. Usually PC ports of console-native games re-implement the rendering engine using PC graphics APIs like DirectX or Vulkan. The re-implementation is probably less efficient and less tuned.

          • TimeBearingDown 2 days ago

            Great answer. Denuvo and other heavy anti-piracy tools are also sometimes used for releases on PCs which can seriously impact performance.

          • ac29 2 days ago

            > Usually PC ports of console native games re-implement the rendering engine using the PC graphics APIs like DirectX or Vulkan. The re-implementation is probably less efficient and less tuned.

            This was true 25 years ago, when in-house bespoke game engines were more common and consoles weren't basically PCs. In 2024, I highly doubt many cross-platform games are ported at all - it's just a different target in Unreal/Unity/etc.

            • kuschku 2 days ago

              > I highly doubt many cross-platform games are ported at all - its just a different target in Unreal/Unity/etc.

              Horizon runs on Guerrilla Games' in-house Decima Engine, which is PS5-only for production builds. Ports are handled by Nixxes.

              Kojima games previously used Konami's in-house Fox Engine, again primarily designed for PlayStation. Since Kojima left Konami, his games use the Decima Engine as well.

        • yangff 2 days ago

          Horizon Forbidden West was ported from PS to PC. Decima is an engine from Sony's first-party studio, so it's understandable that their development process would lean more towards the PS's internal architecture rather than the more common GPUs on the market. Of course, even general-purpose engines can perform better on PS5, AMD, or NV. But these engines have less information about how customers will use them, so there's less information available to optimize with. On the other side, customers using these engines often don't have enough experience to optimize sufficiently for each platform. None of this is absolute, but I think this logic is reasonable.

          For game developers using these engines, if they take optimization seriously, they typically make adjustments to lighting, LOD, loading, and model details or shaders on console platforms to achieve a similar visual effect while meeting the targeted performance goals. This is why you usually get better performance on a console at the same price point compared to a PC, aside from the subsidies provided by Sony.

        • tacticus 2 days ago

          > Horizon Forbidden West needing a much better GPU on PC to run at the same level of fidelity as the PS5.

          Not being expected to run with variable refresh rate/interleaving, and accepting 30/60 fps in best-case situations?

        • deaddodo 2 days ago

          > They almost all use off-the-shelf middleware and game engines that are built to support all of the platforms. The companies that do go through that effort tend to have very notable releases.

      • HelloNurse 2 days ago

        And, more simply, Moore's Law should ensure that in a next-generation console with a new microprocessor architecture, slowdowns in some instructions and memory access patterns are compensated for by a general speedup, limiting performance regressions to terribly unfortunate cases (which should be unlikely, and obvious enough that they get mitigated).

    • mikepavone 2 days ago

      > An important non-obvious detail for AMD is that they (at least in the past, I assume for this as well) have kept the instruction timings similar from generation to generation of consoles.

      What? The Jaguar-based CPU in the PS4 has both a much lower clock and substantially lower IPC than the Zen 2 based one in the PS5. The timings are not remotely the same and the micro-architectures are quite different. Jaguar was an evolution of the Bobcat core which was AMD's answer to the Intel Atom at the time (i.e. low cost and low-power, though it was at least an out-of-order core unlike contemporary Atoms).

      Going from GCN to RDNA on the GPU side is also a pretty significant architectural change, though definitely much less than going from AMD to Intel would be.

      • senkora 2 days ago

        I did some more research and I was wrong.

        My source was an AMD tech talk from years ago where they mentioned keeping instruction timings the same for backwards compatibility reasons.

        I believe they were talking about this for the XBox One X: https://en.wikichip.org/wiki/microsoft/scorpio_engine#Overvi... (and a similar chip for the PS4 Pro)

        So basically, they upgraded and lightly enhanced the Jaguar architecture, shrunk the process (28nm -> 16nm), but otherwise kept it the same. AMD Zen was released around this time and was far superior but they decided to stick with Jaguar in order to make sure that instruction timings were kept the same.

        I guess that they didn't want two hardware revisions of the same console generation running on different micro-architectures, but they were okay switching the micro-architecture for the next console generation.

    • jheriko 2 days ago

      you clearly haven't played a modern game :P

      developers taking care around cpu timings is 10-15 years out of date. most of them these days don't even know what a dot product is, or how to find the distance to a point or to a straight line in-between two... and the people they rely on to do this for them make horrendous meals of it.

      but yeah, sure, cpu instruction timings matter.

      • DaoVeles 2 days ago

        I was about to say. I bailed out of the industry just as the Xbox One/PS4 was coming in. Even with the 360/PS3, it was considered wise to try and steer clear of that kind of low-level stuff just for one's sanity. When the X1/PS4 came in, it was completely abandoned; it turns out x86 compilers combined with OoO execution made that kind of tinkering not just nearly pointless but sometimes actively harmful to performance.

        Nowadays, I suspect it is almost entirely in the hands of the compilers, the APIs and the base OS to figure out the gritty details.

        • xgkickt 2 days ago

          There are still manual optimizations that can be done (non-temporal writes where appropriate for example), but nothing like the painstaking removal of Load-Hit-Stores and cache control of the 360/PS3 era.
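
          A rough sketch of the non-temporal idea (x86 SSE intrinsics; a hypothetical buffer fill, not from any console SDK): streaming stores let you fill a large write-only buffer without evicting hot game data from the cache.

              #include <immintrin.h>
              #include <cstddef>

              // Copy into a buffer the CPU won't read back soon (e.g. data staged for
              // the GPU). dst is assumed to be 16-byte aligned.
              void fill_upload_buffer(float* dst, const float* src, std::size_t count) {
                  std::size_t i = 0;
                  for (; i + 4 <= count; i += 4) {
                      _mm_stream_ps(dst + i, _mm_loadu_ps(src + i));  // bypass the cache
                  }
                  for (; i < count; ++i) dst[i] = src[i];             // scalar tail
                  _mm_sfence();  // order the streaming stores before anything else reads dst
              }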

      • Meganet 2 days ago

        The new chip will be relevantly faster. I would bet that bandwidth between certain components is a lot more critical, or NUMA effects, or bandwidth between cores.

        I'm surprised that CPU instruction latency is mentioned before other factors.

    • lxgr 2 days ago

      Given the size of such a contract, wouldn't it be reasonable for Sony to just request equal or better instruction latency for everything relevant from the old CPU?

  • ksec 2 days ago

    Adding a bit more context.

    Nvidia got the bid for the Switch when they were basically dumping those unwanted Tegra chips on Nintendo for an incredibly low price.

    Xbox and PlayStation don't earn AMD much profit at all. AMD had this custom-processor segment to barely keep them surviving; people may forget AMD was only worth ~$3B in market cap in 2016. They are now at ~$250B.

    On the subject of software compatibility though, one thing I got wrong was my prediction that having AAA titles on Xbox and PS would help AMD's market share on PC, given those titles are already optimised for Xbox and PS anyway. That didn't happen at all, and Nvidia continues to dominate.

    • elzbardico 2 days ago

      Sometimes a low-margin business is all you need and have to keep the lights on, avoid hemorrhaging too many people, and stay afloat until better winds arrive.

      Traditional MBA thinking is sometimes too short-sighted. For example, PCs might not have been a cash cow for IBM, but the ThinkPad brand, the distributor relationships and the customer relationships may have helped IBM more than the cash from selling that business to Lenovo. Maybe having a healthy bridgehead with a popular brand of laptops could have helped IBM come up with some innovative way of selling the overhyped Watson.

      The same goes for AMD and video games: it paid the bills, paid salaries and left a little profit on the table to be invested. It probably helped them bridge from their hell days to what they are today.

      There's a lot of intangibles and hidden symmetries, serendipitous opportunities that are frequently overlooked by our bean-counting master race overlords.

    • sangnoir 2 days ago

      > Xbox and Playstation dont earn AMD much profits at all

      It doesn't cost them much either. Lisa Su, in an interview that was posted to HN a few months ago, said it is a deliberate strategy to repackage IP AMD has already developed. They are willing to pull designs off the shelf and customize them to meet partners' needs. Having a long tail of products adds up, and sets you up to get first dibs on higher-margin partnerships in the future.

    • derstander 2 days ago

      > Nvidia got the bid for Switch when they were basically dumping those unwanted Tegra to Nintendo for an incredibly low price.

      This seems pretty well aligned with Gunpei Yokoi’s strategy of “Lateral Thinking [with] Withered Technology”. It worked out pretty well for Nintendo in the past (e.g., Gameboy) and seems to be working out with the Switch. Even though he has passed, his Wikipedia page alleges that this philosophy has been passed on to others at Nintendo.

      • lynguist a day ago

        > Withered technology

        At the time of its release, the Nintendo Switch's CPU was only a single generation behind ARM's latest offering, and its GPU was by far the most powerful mobile GPU available. The "withered technology" label doesn't hold true for the Switch.

        What happened is that mobile compute has advanced tremendously since 2017 and Switch is stuck on technology that was leading in early 2017.

        • pjmlp 20 hours ago

          While it provides marvelous gaming experiences, faster polygons don't equate to better games. This is especially an issue on the latest-gen PlayStation and Xbox, where many games with great graphics have a lousy gameplay experience at high prices.

    • DaoVeles 2 days ago

      A few of the Playstation titles that made their way to PC do seem to have a little home field advantage on AMD chips, but not enough to sway people over to them.

    • lupusreal 2 days ago

      > having AAA titles on Xbox and PS would have helped AMD's market share on PC, given those titles are already optimised on Xbox and PS anyway. That didn't happen at all. And Nvidia continue to dominate.

      My impression is that console ports have insufficient popularity with PC gamers for them to alter their hardware purchasing habits for those games.

  • lxgr 2 days ago

    > Sony and Nintendo would be fools to give up backwards compatibility just for savings on chips.

    But would they really?

    Staying on x86-64 would take care of CPU compatibility (unless there's some exotic AMD-only instruction set extension heavily used by PS4/5 games), and a GPU emulation stack seems at least somewhat plausible.

    Sony has pulled this off multiple times before with seemingly completely incompatible architectures:

    The PS2 came with the PS1 CPU (repurposed as an IO controller, but fully available for previous-gen games) and emulated the GPU on its own. The PS3 did the reverse in its second iteration (i.e. it included the PS2's GPU and emulated the CPU). The PS Vita's SoC had the PSP MIPS CPU included on-die, which in turn is similar enough to the PS1's to allow running those games too.

    • DSMan195276 2 days ago

      For GPU emulation, I'm not super knowledgeable, but I would think the shaders are a big issue; older systems don't have that problem. Console games come with precompiled shaders, and you won't be able to reuse those between AMD and Nvidia. Certainly you can get around it, emulators for modern consoles do just that, but it's not without its issues, which might be considered unacceptable.

      That's still fixable if you're willing to ship newly compiled shaders and such, but it's a lot more work if you need some kind of per-game fix to be downloaded. This is how Xbox 360 "backwards compatibility" works, and that approach means it only works with a subset of Xbox 360 games, not all of them. It's much better than nothing, but it's not a hardware-level fix that makes the original game binaries "just work".

      As for packaging the old GPU with the new system, I think that's not really realistic anymore, since prices for old chips simply don't drop enough and the system design would be a mess (the chips are huge and you'd need cooling for both; I guess if only one is running at a time it's not as bad, but...). Separately, if you're swapping from Nvidia to AMD, then you're talking about trying to convince one of them to make a batch of old chips for you while you use their competitor's chip as the main one; they might not be willing to do it.

      • lxgr 2 days ago

        Would it not be possible to recompile all shaders at startup (or "install", i.e. first launch) time and then cache them (if runtime recompilation is even too slow in the first place)?
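
        Conceptually that's just a content-addressed cache in front of a shader translator. A minimal C++ sketch of the idea, where translate_shader() and hash_bytes() are hypothetical stand-ins for whatever the platform would actually provide:

            #include <cstdint>
            #include <filesystem>
            #include <fstream>
            #include <iterator>
            #include <string>
            #include <vector>

            using Blob = std::vector<std::uint8_t>;

            Blob translate_shader(const Blob& old_blob);   // hypothetical: recompile for the new GPU (slow)
            std::uint64_t hash_bytes(const Blob& bytes);   // hypothetical: stable hash of the old shader bytes

            Blob get_shader(const Blob& old_blob, const std::filesystem::path& cache_dir) {
                auto path = cache_dir / (std::to_string(hash_bytes(old_blob)) + ".bin");
                if (std::ifstream in{path, std::ios::binary}) {
                    // Cache hit: translation was already done at install / first launch.
                    return Blob(std::istreambuf_iterator<char>(in), {});
                }
                Blob fresh = translate_shader(old_blob);   // slow path, paid once per shader
                std::ofstream(path, std::ios::binary)
                    .write(reinterpret_cast<const char*>(fresh.data()), fresh.size());
                return fresh;
            }

        PC drivers' shader/pipeline caches work in roughly this spirit; the open question is whether the one-time translation pass is fast and reliable enough to be acceptable at install or first launch.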

  • jm4 2 days ago

    The whole article seems unfair to Intel. They didn’t lose the contract because they didn’t have it in the first place. I think your analysis is correct. They win a little if they don’t get the contract and they win a lot if they do. It was a no brainer to bid on it.

  • neighbour 2 days ago

    This is all true. Xbox always threatens to leave their current vendors, only to end up signing a renewal in the final hours of the contract.

    >As a gamedev I have a different perspective: Sony and Nintendo would be fools to give up backwards compatibility just for savings on chips.

    In your view, is this issue worse with modern consoles now that the Playstation (and possibly Nintendo) online store purchases persist across generations? Imagine a scenario where someone has a PS4 and PS5, they buy many games through the Playstation Store, then Sony selects a different chip supplier for the PS6. I'm guessing this would cause issues with games that were designed for the older consoles, breaking backwards compatibility.

    I'd imagine that if the console manufacturers cared about backwards compatibility, which I think they do, the likelihood of them switching chip providers would decrease with each generation.

    • wmf 2 days ago

      Microsoft maintained backwards compatibility across Intel+Nvidia, IBM+ATI, and AMD+AMD so it's possible. Sony hasn't invested as much in compatibility, instead just keeping the same architecture for PS4/5.

      • lxgr 2 days ago

        Sony has historically invested a lot into backwards compatibility, going as far as shipping the previous gen's GPU and/or CPU with the PS2, initial PS3 models, and the PS Vita.

        PS3 compatibility on the PS4 was notably absent, though.

        • nottorp 2 days ago

          Historically. But not presently.

          They could include a software emulator on the PS5, at least for the PS2 (not the PS1, because afaik the drive in the PS5 does not read CDs), and let people use old discs. But they don't, and instead re-sell old games packaged with the emulator in their online store.

          • Yeul 2 days ago

            I doubt there is a lot of money in PS2 games. Anyone who really wants to play those games can emulate them on PC.

            • lxgr 2 days ago

              Arguably, there not being a lot of money in them would be a point in favor of Sony shipping an emulator (as a minor perk/nod to long-time ecosystem fans), not against it (which would allow them to keep selling "HD remakes" etc.)

      • neighbour 2 days ago

        True, but if you're referring to the fact that you can play Xbox and Xbox 360 games on newer hardware, I believe Microsoft has a team that has to patch these games individually to get them working.

        Sony does something similar I believe with their new Classics Catalogue as part of their most premium PS Plus tier.

        • jamesfinlayson 2 days ago

          Yeah I remember the Xbox 360 being hit and miss with backwards compatibility - their FAQs said that most of the time the people working on it had to just look at the raw assembly of games they were trying to get running to figure out what went wrong.

      • etempleton 2 days ago

        Most games were not backwards compatible between Xbox and Xbox 360. They had to do work to make each game run and prioritized the most popular games, most notably Halo. With that said, certain features did not work properly. There was a Halo 2 map they took out of the online pool because it used a heavy fog effect that would not render on the 360.

        From 360 to Xbox One there was a similar situation where they would patch individual games to work, but because it was at least partially emulated, publishers had to sign off on allowing their game to be backwards compatible.

    • lxgr 2 days ago

      There was no backwards compatibility between the PS3 and PS4 whatsoever (except for PS Plus allowing cloud-based game streaming of some PS3 titles), and Sony survived that as well.

      What they did was offer some old PS2 games for purchase, though, which allowed them to tap into that very large back catalog. I could see something like this happen for a hypothetical Intel PS6 as well (i.e. skipping PS5 backwards compatibility and tapping into the large catalog of PS4 and PS4/PS5 games).

  • aurareturn 2 days ago

    I’m pretty sure PS5 runs x86 and Vulcan. Both are standardized. That’s why PS5 games can be easily ported to PCs running Intel and Nvidia.

    So I’m not buying that going Intel would lose backwards compatibility.

    • pjmlp 2 days ago

      I am quite sure the PS5 doesn't do Vulkan at all, and you don't even need NDA access to know that; there are enough GDC talks and SCEE presentations on what APIs PlayStations support.

    • mastax 2 days ago

      It’s not clear to me that the PS5 supports Vulkan at all (excluding third party translation layers). I would be happy to see any evidence. In any case I’m confident the large majority of PS5 games use its native api GNM.

      GNM could certainly be implemented for Intel GPUs, but it’s an additional cost to account for.

      • aurareturn 2 days ago

        Yes, you are correct. I was wrong.

  • johnnyanmac 2 days ago

    Yeah, this was rigged from the start. If Sony did want to take up Intel next gen, they'd need to do a lot of work on backwards compatibility with the PS5 on the PS6. Whereas I imagine the PS6 being a "PS5 Pro Pro" at this rate.

    I suppose it can be seen as controlling rampant greed (especially for Nvidia), but it feels like the consoles dealt the cards here. There would have needed to either be some revolutionary tech or an outright schism to make a business steer an otherwise smooth ship that way.

    >As a gamedev I have a different perspective: Sony and Nintendo would be fools to give up backwards compatibility just for savings on chips.

    I agree that both are probably playing it safe this time. But as a devil's advocate: both Sony and Nintendo are not strangers to ditching the previous gen if they don't want to compromise their next gen. At this point Nintendo is skewed towards ditching (SNES/N64/Gamecube/Switch vs. Wii/WiiU).

    Sony tried and almost failed hard with the PS3 (kind of before with the whole SKU debacle, and then ditched after) but is otherwise usually consistent on BC. Well, that and the Vita. But I don't think anyone missed the UMD (it was still backwards compatible digitally, though).

    • philistine 2 days ago

      > At this point Nintendo is skewed towards ditching (SNES/N64/Gamecube/Switch vs. Wii/WiiU).

      Ultimately, a company is its people. And the management class at Nintendo is famously new. Everybody is expecting them to focus on robust backwards compatibility as part of their new, exciting development.

    • Tuna-Fish 2 days ago

      > Whereas I imagine the PS6 being a "PS5 Pro Pro" at this rate.

      I think there will be sufficient time between now and PS6 release that they will be able to support full RTRT.

    • ac29 2 days ago

      > Yeah, this was rigged from the start. If Sony did want to take up Intel next gen, they'd need to do a lot of work on backwards compatibility with the PS5 on the PS6. Whereas I imagine the PS6 being a "PS5 Pro Pro" at this rate.

      Why would they need to do a lot more work on compatibility if they'd picked Intel vs AMD?

      Either CPU is presumably going to be x86_64. The GPU is almost certainly going to be much different between AMD's PS5 GPU and AMD's PS6 GPU, so the graphics driver will need work either way.

      • yangff 2 days ago

        They could have AMD provide a compatibility layer for the GPU (although this might be a bad idea), but implementing an AMD compatibility layer on Intel/NV clearly seems like an even worse idea. But at least you might be able to run the already compiled shaders in compatibility mode?

  • Meganet 2 days ago

    Those IPs are bought from expert companies right?

    I would assume that if Intel can make ARM and x86, it can do whatever Sony needs.

    Or is AMD's architecture THAT special? My assumption is that the PS3's streaming processor was so different that it would have mattered, but with the PS4 and 5?

    You could also patch PS5 games if you need to. The ecosystem is closed.

  • zelon88 a day ago

    > Switching vendors does not just invalidate old games compatibility, it also requires retooling for their internal libraries.

    This is a red herring. The hardware is x86-64, all the game engines are made on x86-64, and all the games are compiled on, you guessed it, x86-64. That's why they stopped using PowerPC, Motorola, or other non-x86 architectures: to simplify backwards compatibility and actually get value comparable to a decently performing system.

    So when they tell you there is a cost overhead associated with switching vendors, that is BS. However long it takes to port your desktop driver package is how long it would take to get all of this working on different hardware.

    Seriously, if someone in a basement in Arkansas can get Windows to run on a PowerPC PS3, Sony can figure out how to make x86-64 AMD games work on an x86-64 Intel chip. Anyone saying otherwise has an incentive not to make it happen.

  • smcl 2 days ago

    I'm not convinced, this feels like those "actually this is good for bitcoin" replies that are popular with cryptobros anytime some bad news hits. Intel have lost out on a big, high-profile contract - this cannot be something they are happy with and any explanation to the contrary is, as the kids say, "cope"

bangaladore 2 days ago

Maybe I'm misinformed, but I could never see Intel getting this contract.

AMD has extensive experience with high-performing APUs, something Intel, at least in my memory, does not have. The chips on modern high-end consoles are supposed to compete with GPUs, not with integrated graphics. Does Intel even have any offerings that would indicate they could accomplish this? Intel has ARC, which presumably could be put in a custom "APU"; however, their track record with that is not stellar.

  • janice1999 2 days ago

    Intel has Battlemage [1]. Presumably that would be the basis of the console APU. Their iGPU performance is actually getting good now. [2]

    [1] https://www.pcgamer.com/hardware/graphics-cards/embargo-no-p...

    [2] https://www.tomshardware.com/pc-components/cpus/lunar-lake-i...

    • Scramblejams 2 days ago

      > Their iGPU performance is actually getting good now.

      I've only been waiting for Intel to ship a compelling iGPU since, I dunno, their "Extreme Graphics" in 2001? What on earth have their iGPU teams been doing over there for the last 20+ years?

      I guess the OEMs were blinkered enough not to demand it, and Intel management was blinkered enough not to see the upside on their own.

      • windowsrookie 2 days ago

        The Intel Iris Pro graphics from about 10 years ago were actually ok. I believe they were matching the lower-end dedicated laptop GPUs of that era. The problem was Apple was the only company willing to pay for the Iris Pro Chips.

        • kcb 2 days ago

          The other problem is Intel's graphics drivers for 3d gaming are a distant 3rd place. Games just haven't historically targeted their GPUs. We've had like 2 decades of games that for the most part have tested compatibility with Nvidia and AMD.

        • onli 2 days ago

          Those even came to the desktop, for a very short while, with the Broadwell processors i5-5675C and i7-5775C. They were stronger than the FM2+ APUs AMD had released earlier, which Intel otherwise could not beat for years, just weaker than the following Ryzen APUs.

          Of course, they were gone in the next generation. But having those widely available might have changed things.

      • DaoVeles 2 days ago

        I think what they have been doing is focusing on what 95% of people use these things for: just basic utility tasks. The most complex thing most people will render is Google Earth. I would not be surprised if that is the kind of workload Intel's iGPU performance metrics are focused on.

      • deelowe 2 days ago

        Intel didn't take gaming seriously until very recently. They stayed focused on productivity applications well past the time when netbooks became viable for most use cases.

    • adastra22 2 days ago

      Intel’s absolute best integrated GPU being roughly comparable to a lower end model from the competition is not “getting good.”

    • bangaladore 2 days ago

      The "Intel Core Ultra 7 258V" is at least 2-3x slower than the GPU within the PlayStation 5. It is not even close, and that's last gen. Again, the APUs within modern consoles compete with desktop grade GPUs. In the case of the PS5 its roughly comparable to an RTX 2070 or Rx 6700 (better analog).

      • aurareturn 2 days ago

        GPUs can be scaled with more cores and higher bandwidth memory. I assume had Intel won the contract, they would have done so.

      • wmf 2 days ago

        Multiple commenters here are forgetting about discrete Battlemage.

        • berbec 2 days ago

          And that's telling, isn't it? Even in this space, Intel's iGPUs are totally ignored or dismissed out of hand. I say it's because they have an unending string of broken promises, saying "this'll be the time we get integrated graphics right" over and over. It's never been true, and I for one have totally wiped them from my view because of that.

  • pknomad 2 days ago

    Ditto. AMD also reliably delivered on CPUs for the past 2 iterations of both Xbox and PS. AMD feels like the only choice for consoles at this point.

    • coder543 2 days ago

      Well, Nvidia has powered a much more popular console... the Nintendo Switch, and Nvidia looks set to power the Switch 2 when it launches next year. So, AMD is clearly not the only choice.

      • mdasen 2 days ago

        The problem with choosing Nvidia is that they can't make an x86 processor with an integrated GPU. If you're looking to maintain backward compatibility with the Playstation 5, you're probably going to want to stick with an x86 chip. AMD has the rights to make x86 chips and it has the graphics chips to integrate.

        Nvidia has graphics chips, but it doesn't have the CPUs. Yes, Nvidia can make ARM CPUs, but they haven't been putting out amazing custom cores.

        AMD can simply repackage some Zen X cores with RDNA X GPU and with a little work have something Sony can use. Nvidia would need to either grab off-the-shelf ARM Cortex cores (like most of their ARM CPUs use) or Sony would need to bet that Nvidia could and would give them leading-edge performance on custom designed cores. But would Nvidia come in at a price that Sony would pay? Probably not. AMD's costs are probably a lot lower since they're going to be doing all that CPU work anyway for the rest of their business.

        For Nintendo, the calculus is a bit different. Nintendo is fine with off-the-shelf cores that are less powerful than smartphones and they're already on ARM so there's no backward incompatibility there. But for Sony whose business is different, it'd be a huge gamble.

        • coder543 2 days ago

          I think changing from AMD GPUs to Nvidia GPUs by itself has a good chance of breaking backwards compatibility with how low level and custom Sony's GPU API apparently is, so the CPU core architecture would just be a secondary concern.

          I was not saying Sony should switch to Nvidia, just pointing out that it is objectively incorrect to say that AMD is the only option for consoles when the most popular console today does not rely on AMD.

          I also fully believe Intel could scale up an integrated Battlemage to meet Sony's needs, but is it worth the break in compatibility? Is it worth the added risk when Intel's 13th and 14th gen CPUs have had such publicly documented stability issues? I believe the answer to both questions is "probably not."

          • qwytw a day ago

            > incorrect to say that AMD is the only option for consoles

            It's a bit of an apples-to-oranges comparison though, even if all 3 devices are technically consoles. The Switch is basically a tablet with controllers attached and a tablet/phone CPU, while the PS5/Xbox are just custom-built PCs.

            • coder543 a day ago

              The only reason I can see that it would matter that the Switch is a low-end console is if you think Nvidia is incapable of building something higher end. Are you saying that Nvidia couldn't make more powerful hardware for a high end console? Otherwise, the Switch just demonstrates to me that Nvidia is willing to form the right partnership, and reliably supply the same chips for long periods of time.

              I'm certain Nvidia would have no trouble doing a high end console, customized to Microsoft and/or Sony's exacting specs... for the right price.

              • qwytw a day ago

                > Are you saying that Nvidia couldn't make more powerful hardware for a high end console?

                Hard to say. It took Qualcomm years to make something that was superior to standard ARM designs. The GPU is of course another matter.

                > I'm certain Nvidia would have no trouble doing a high end console,

                The last mobile/consumer CPU (based on their own core) that they released came out in 2015, and they have been using off-the-shelf ARM core designs for their embedded and server stuff. Wouldn't they effectively be starting from scratch?

                I'm sure they could achieve that in a few years, but do you think it would take them significantly less time than it did Apple or Qualcomm?

                > Nvidia is incapable of building something higher end

                I think it depends more on what Nintendo is willing to pay, I doubt they really want a "high-end" chip.

                • coder543 a day ago

                  > I think it depends more on what Nintendo is willing to pay, I doubt they really want a "high-end" chip.

                  In this thread, we were talking about what Sony and Microsoft would want for successors to the PS5 and XSX, not Nintendo. Nintendo was just a convenient demonstration that Nvidia is clearly willing to partner with console makers like Sony and Microsoft.

                  > Hard to say. It took Qualcomm years to make something that was superior to standard ARM designs.

                  > The last mobile CPU

                  I wasn't talking about Nvidia custom designing an ARM core, although they have done that in the past, and again, this wouldn't be mobile hardware. Nvidia is using very powerful ARM cores in their Grace CPU today. They have plenty of experience with the off-the-shelf ARM cores, which are very likely good enough for modern consoles.

                  • qwytw a day ago

                    > Nvidia is using very powerful ARM cores in their Grace CPU today

                    I'm not sure Neoverse is particularly (or at all) suitable for gaming consoles. Having 60+ cores wouldn't be particularly useful and their single core performance is pretty horrible (by design).

                    > which are very likely good enough for modern consoles

                    Are they? The Cortex-X4 has barely caught up with Apple's M1 (from 2020). What other options are there? ARM just doesn't seem to care that much about the laptop/desktop market at all.

                    • coder543 a day ago

                      The Neoverse cores are substantially more powerful than something like Cortex-X4. Why would they not be suitable? It's hard to find benchmarks that are apples-to-apples in tests that would be relevant for gaming, but what little I've been able to find shows that the Neoverse V2 cores in Nvidia's Grace CPU are competitive against AMD's CPUs. I hate to draw specific comparisons, because it's very easy to attack when, as I already said, the numbers are hard to come by, but I'm seeing probably 20% better than Zen 3 on a clock-for-clock, single core basis. The current-generation PS5 and XSX are based on Zen 2. Zen 3 was already a 10% to 30% jump in IPC over Zen 2, depending on who you ask. Any hypothetical Nvidia-led SoC design for a next-gen console would be pulling in cores like the Neoverse V3 cores that have been announced, and are supposedly another 15% to 20% better than Neoverse V2, or even Neoverse V4 cores which might be available in time for the next-gen consoles.

                      These gains add up to be substantial over the current-gen consoles, and as an armchair console designer, I don't see how you can be so confident they wouldn't be good enough.
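
                      To put rough numbers on "adds up" (using the midpoints of the admittedly uncertain figures above): ~1.2x (V2 over Zen 3 per clock) times ~1.2x (Zen 3 over Zen 2) is ~1.44x Zen 2 per clock, and another ~1.15-1.2x for V3 lands around 1.65-1.75x before any clock-speed gains.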

                      The CPU cores Nvidia has access to seem more than sufficient, and the GPU would be exceptional. AMD is clearly not the only one capable of providing hardware for consoles. Nvidia has done it, will do it again, and the evidence suggests Nvidia could certainly scale up to much bigger consoles if needed.

                      One problem is certainly that Nvidia is making bank off of AI at the moment, and doesn't need to vie for the attention of console makers right now, so they aren't offering any good deals to those OEMs. The other problem is that console makers also don't want any break in compatibility. I've already addressed these problems in previous comments. It's just incorrect to say that the console makers have no other choices. They're just happy with what AMD is offering, and making the choice to stick with that.

                      Nintendo will be happy using hardware made on a previous process node, so it won't interfere with Nvidia's plan to make insane money off of AI chips the way that next-gen console expectations from Sony or Microsoft would. I'm happy to admit that I'm being speculative in the reasons behind these things, but there seem to be enough facts to support the basic assertion that AMD is not the only option, which is what this sub-thread is about.

                      Since you seem so confident in your assertions, I assume you have good sources to back up the claim that Neoverse V2/V3/V4 wouldn't be suitable for gaming consoles?

                      • qwytw 18 hours ago

                        > Nvidia's Grace CPU are competitive against AMD's CPUs

                        I don't think PS/Xbox are using AMDs 64+ core server chips like Milan etc.

                        > I assume you have good sources to back up the claim that Neoverse V2/V3/V4

                        These are data center CPUs designed for very different purposes. Neoverse is only used in chips that target very specific, highly parallelized workloads. The point is having a very high number (64-128+) of relatively slow but power-efficient cores and extremely high bandwidth.

                        E.g. Grace has single-thread performance comparable to a Ryzen 7 3700X (a 5-year-old chip). Sure, MT performance is 10x better, but how does that matter for gaming workloads?

                        I assume you could boost the frequency and build an SoC with several times fewer cores than all recent Neoverse chips (if ARM lets you). Nobody has done that or publicly considered doing it. I can't prove that it's impossible, but can you provide any specific arguments for why you think that would be a practical approach?

                        > substantially more powerful than something like Cortex-X4.

                        Of course it's just rumors, but Nvidia seems to be going with the ARM A78C, which is a tier below the X4. That's not particularly surprising, since Nintendo would rather spend money on other components / target a lower price point. As we've agreed, the GPU is the important part here; the CPU will probably be comparable to an off-the-shelf SoC you can get from Qualcomm or even MediaTek.

                        That might change in the future, but I don't see any evidence that Nvidia is somehow particularly good at building CPUs, or is close to being in the same tier as AMD, Intel, or Qualcomm (maybe even Ampere, depending on whether they finally deliver what they have been promising in the near future).

                        The same applies to Grace: the whole selling point is integration with their datacenter GPUs. For CPU workloads it provides pretty atrocious price/performance, and it would make little sense to buy it for that.

        • kmeisthax 2 days ago

          Emulating x86 would be an option - though given Sony's history, I doubt they'd consider it seriously.

          For context...

          - PS1 BC on PS2 was mostly hardware but they (AFAIK?) had to write some code to translate PS1 GPU commands to the PS2 GS. That's why you could forcibly enable bilinear filtering on PS1 games. Later on they got rid of the PS1 CPU / "IO processor" and replaced it with a PPC chip ("Deckard") running a MIPS emulator.

          - PS1 BC on PS3 was entirely software; though the Deckard PS2s make this not entirely unprecedented. Sony had already written POPS for PS1 downloads on PS2 BBN[0] and PSP's PS1 Classics, so they knew how to emulate a PS1.

          - PS2 BC on PS3 was a nightmare. Originally it was all hardware[1], but then they dropped the EE+GS combo chip and went to GPU emulation, then they dropped the PS2 CPU entirely and all backwards compatibility with it. Then they actually wrote a PS2 emulator anyway, which is part of the firmware, but only allowed to be used with PS2 Classics and not as BC. I guess they consider the purchase price of the games on the shop to also pay for the emulator?

          - No BC was attempted on the PS4 at all, AFAIK. The PS3 is a weird basket case of an architecture, but even PS1 or PS2 games aren't supported.

          At some point Sony gave up on software emulation and decided it's only worth it for retro re-releases where they can carefully control what games run on the emulator and, more importantly, charge you for each re-release. At least the PS4 versions will still play on a PS5... and PS6... right?

          [0] A Japan-only PS2 application that served as a replacement for the built-in OSD and let you connect to and download software demos, game trailers, and so on. Also has an e-mail client.

          [1] Or at least as "all hardware" as the Deckard PS2s are

          • lxgr 2 days ago

            > Then they actually wrote a PS2 emulator anyway, which is part of the firmware, but only allowed to be used with PS2 Classics and not as BC.

            To be fair, IMO that was only 80-90% of a money grab; "you can now run old physical PS2 games, but only these 30% of our catalog" being a weird selling point was probably also a consideration.

            > Sony had already written POPS for PS1 downloads on PS2 BBN[0] and PSP's PS1 Classics, so they knew how to emulate a PS1.

            POPS on the PSP runs large parts of the code directly on the R4000 without translation/interpretation, right? I'd call this one closer to what they did for PS1 games on the (early/non-Deckard) PS2s.

          • MadnessASAP 2 days ago

            > No BC was attempted on PS4 at all, AFAIK. PS3 is a weird basketcase of an architecture, but even PS1 or PS2 aren't BC supported.

            To Be Faiiiirrrrrr, that whole generation was a basket case. Nintendo with the motion controls. Microsoft with a console that internally was more PC than "traditional" console (and HD-DVD). Sony with the Cell processor and OtherOS™.

            I do have fond memories of playing around with Linux on the PS3. Two simultaneous threads! 6 more almost cores!! That's practically a supercomputer!!!

            • anonfordays 2 days ago

              I remember the hype around the Cell processor being so high around the release of the PlayStation 3. It was novel for the application, but it still fizzled out even with the backing it had.

            • lxgr 2 days ago

              In what sense would you say the Xbox 360 was more "PC-like" than "console-like"?

              • kmeisthax 2 days ago

                I'll try to answer in the parent commenter's place.

                Prior generations of consoles were true-blue, capital-E "embedded". Whatever CPU they could get, graphics hardware that was custom built for that particular machine, and all sorts of weird coprocessors and quirks. For example, in the last generation, we had...

                - The PlayStation 2, sporting a CPU with an almost[0] MIPS-compatible core with "vertex units", one of which is exposed to software as a custom MIPS coprocessor, a completely custom GPU architecture, a separate I/O processor that's also a PS1, custom sound mixing hardware, etc.

                - The GameCube, sporting a PPC 750 with custom cache management and vector instructions[1], which you might know as the PowerPC G3 that you had in your iMac. The GPU is "ATI technology", but that's because ATI bought out the other company Nintendo contracted to make it, ArtX. And it also has custom audio hardware that runs on another chip with its own memory.

                - The Xbox, sporting... an Intel Celeron and an Nvidia GPU. Oh, wait, that's "just a PC".

                Original Xbox is actually a good way to draw some red lines here, because while it is in some respects "just a PC", it's built a lot more like consoles are. All games run in Ring 0, and are very tightly coupled to the individual quirks of the system software. The "Nvidia GPU" is an NV2A, a custom design that Nvidia built specifically for the Xbox. Which itself has custom audio mixing and security hardware you would never find in a PC.

                In contrast, while Xbox 360 and PS3 both were stuck with PPC[2], they also both had real operating system software that commercial games were expected to coexist with. On Xbox 360, there's a hypervisor that enforces strict code signing; on PS3 games additionally run in user mode. The existence of these OSes meant that system software could be updated in nontrivial ways, and the system software could do some amount of multitasking, like playing music alongside a game without degrading performance or crashing it. Y'know, like you can on a PC.

                Contrast this again to the Nintendo Wii, which stuck with the PPC 750 and ArtX GPU, adding on a security processor designed by BroadOn[3] to do very rudimentary DRM. About the only thing Nintendo could sanely update without bricking systems was the Wii Menu, which is why we were able to get the little clock at the bottom of the screen. They couldn't, say, run disc games off the SD card or update the HOME Menu to have a music player or friends list or whatever, because the former runs in a security processor that exposes the SD card as a block device and the latter is a library Nintendo embedded into every game binary rather than a separate process with dedicated CPU time budgets.

                And then the generation after that, Xbox One and PS4 both moved to AMD semicustom designs that had x86 CPUs and Radeon GPUs behind familiar APIs. They're so PC like that the first thing demoed on a hacked PS4 was running Steam and Portal. The Wii U was still kind of "console-like", but even that had an OS running on the actual application processor (albeit one of those weird designs with fixed process partitions like something written for a mainframe). And that got replaced with the Switch which has a proper microkernel operating system running on an Nvidia Tegra SoC that might have even wound up in an Android phone at some point!

                Ok, that's "phone-like", not "PC-like", but the differences in systems design philosophy between the two is far smaller than the huge gulf between either of those and oldschool console / embedded systems.

                [0] PS2 floating-point is NOWHERE NEAR IEEE standard, and games targeting PS2 tended to have lots of fun physics bugs on other hardware. Case in point: the Dolphin wiki article for True Crime: New York City, which is just a list of bugs the emulator isn't causing. https://wiki.dolphin-emu.org/index.php?title=True_Crime:_New...

                [1] PPC 750 doesn't have vector normally; IBM added a set of "paired single" instructions that let it do math on 32-bit floats stored in a 64-bit float register.

                [2] Right after Apple ditched it for power reasons, which totally would not blow up in Microsoft's face

                [3] Which coincidentally was founded by the same ex-SGI guy (Wei Yen) who founded ArtX, and ran DRM software ported from another Wei Yen founded company - iQue.

          • philistine 2 days ago

            Considering how the winds are blowing, I'm going to guess the next consoles from Sony and Microsoft are the last ones to use x86. They'll be forced to switch to ARM for price/performance reasons, with all x86 vendors moving upmarket to try to maintain revenues.

        • alexjplant 2 days ago

          > Nvidia has graphics chips, but it doesn't have the CPUs. Yes, Nvidia can make ARM CPUs, but they haven't been putting out amazing custom cores.

          Ignorant question - do they have to? The last time I was up on gaming hardware it seemed as though most workloads were GPU-bound and that having a higher-end GPU was more important than having a blazing fast CPU. GPUs have also grown much more flexible rendering pipelines as game engines have gotten much more sophisticated and, presumably, parallelized. Would it not make sense for Nvidia to crank out a cost-optimized design comprising their last-gen GPU architecture with 12 ARM cores on an affordable node size?

          The reason I ask is because I've been reading a lot about 90s console architectures recently. My understanding is that back then the CPU and specialized co-processors had to do a lot of heavy lifting on geometry calculations before telling the display hardware what to draw. In contrast I think most contemporary GPU designs take care of all of the vertex calculations themselves and therefore free the CPU up a lot in this regard. If you have an entity-based game engine and are able to split that object graph into well-defined clusters you can probably parallelize the simulation and scale horizontally decently well. Given these trends I'd think a bunch of cheaper cores could work about as well as higher-end ones, for less money.

          • toast0 2 days ago

            I think a PS6 needs to play PS5 games, or Sony will have a hard time selling them until the PS6 catalog is big; and they'll have a hard time getting 3rd party developers if they're going to have a hard time with console sales. I don't think you're going to play existing PS5 games on an ARM CPU unless it's an "amazing" core. Apple does pretty good at running x86 code on their CPUs, but they added special modes to make it work, and I don't know how timing sensitive PS5 games are --- when there's only a handful of hardware variants, you can easily end up with tricky timing requirements.

            • bigstrat2003 2 days ago

              I mean, the PS4 didn't play PS3 games and that didn't hurt it any. Backwards compatibility is nice but it isn't the only factor.

              • bigfishrunning 2 days ago

                The first year of PS4 was pretty dry because of the lack of BC; It really helped that the competition was the Xbox One, which was less appealing for a lot of reasons

                • MBCook 2 days ago

                  At this point people have loved the PS5 and Xbox Series for having full backwards compatibility. The Xbox goes even further through software. People liked the Wii’s backwards compatibility and the Wii U (for those who had it).

                  And Nintendo’s long chain of BC from the GB to the 3DS (though eventually dropping GB/GBC) was legendary.

                  The Switch was such a leap over the 3DS and Wii U that Nintendo got away with it. It's had such a long life that the lack of BC could have been a huge hit if the Switch 2 didn't have it.

                  I think all three intend to try to keep it going forward at this point.

                  • pjmlp 20 hours ago

                    Which is also the reason why many games on PS5 and Xbox Series are kind of lame, as studios want to keep PS4 and XBone gamers in the sales loop, and why the PS5 Pro is more of a scam aimed at hardcore fans who will buy anything a console vendor puts out.

                • nottorp 2 days ago

                  One data point: there was no chip shortage at the PS4 launch, but I still waited more than a year to get one because there was little to play on it.

                  While with the PS5 I got one as soon as I could (that still took more than a year since launch, but for chip shortage reasons) because I knew I could simply replace the PS4 with it under the TV and carry on.

              • philistine 2 days ago

                We're not in 2012 anymore. Modern players don't just want a clean break and new AAA games every month; they also want access to a large indie marketplace, the games they play every day, and better performance in the games they already have.

          • wmf 2 days ago

            PS5 had Zen 2 which was fairly new at the time. If PS6 targets 120 fps they'll want a CPU that's double the performance of Zen 2 per thread. You could definitely achieve this with ARM but I'm not sure how new of an ARM core you would need.
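
            Back-of-the-envelope arithmetic behind that (mine, not from the article): at 60 fps the CPU gets 1/60 s ≈ 16.7 ms per frame, at 120 fps only 1/120 s ≈ 8.3 ms. If the per-frame workload stays the same, every thread has to finish its work in half the time, which is roughly where "double the performance of Zen 2 per thread" comes from.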

            • t-3 2 days ago

              Is there a need to ever target 120 fps? Only the best-of-best eyes will even notice a slight difference from 60.

              • MBCook 2 days ago

                Yes.

                You say that, but you can absolutely notice. Motion is smoother, the picture is clearer (higher temporal resolution), and input latency is half what it is at 60.

                Does every game need it? Absolutely not. But high-speed action games and driving games can definitely benefit. Maybe others. There’s a reason the PC world has been going nuts with frame rates for years.

                We have 120 fps on consoles today on a few games. They either have to significantly cut back (detail, down to 1080p, etc) or are simpler to begin with (Ori, Prince of Persia). But it’s a great experience.

              • smolder 2 days ago

                My eyes are not best-of-best but the difference between 60 and 120hz in something first-person is dramatic and obvious. It depends on the content but there are many such games for consoles. Your claim that it's "slight" is one that only gets repeated by people who haven't seen the difference.

                • t-3 2 days ago

                  Honestly, I can't even tell the difference between 30 and 60. Maybe I'm not playing the right games or something but I never notice framerate at all unless it's less than 10-20 or so.

                  • smolder a day ago

                    I would guess it's partly the games you play not having a lot of fast motion and maybe partly that you're not really looking for it.

              • wmf 2 days ago

                I don't think my TV can display 120 fps and I'm not buying a new one. But they promise 4K 60 (with upscaling) on the PS5 Pro, so they have to have something beyond that for PS6.

                • MBCook 2 days ago

                  They have 120 today, it’s just not used much.

                  Even if people stick to 4K 60, which I suspect they will, the additional power means higher detail and more enemies on screen and better ray tracing.

                  I think of the difference between the PS3 games that could run at 1080p and PS4 games at 1080p. Or PS4 Pro and PS5 at 4K or even 1440p.

        • kcb 2 days ago

          Nvidia has very little desire to make a high-end, razor-thin-margin chip of the kind consoles traditionally demand. This is what Jensen has said, and it makes sense when there are other areas the silicon can be directed to with much greater profit.

        • FileSorter 2 days ago

          >The problem with choosing Nvidia is that they can't make an x86 processor with an integrated GPU

          Can't and not being allowed are two very different things

      • pinewurst 2 days ago

        That's not an apples-to-apples comparison. The Switch is lower price, lower performance by design, and used, even originally, a mature NVIDIA SoC, not really a custom design.

      • qwytw a day ago

        > the Nintendo Switch, and Nvidia looks set to power the Switch 2

        Which runs a very old mobile chip that was already outdated when the Switch came out. Unless Nintendo is planning to go with something high-end this time (e.g. to compete with the Steam Deck and other more powerful handhelds), whatever they get from Nvidia will probably be more or less equivalent to a mid-tier off-the-shelf Qualcomm SoC.

        It's interesting that Nvidia is going along with that; it will just depress their margins. I guess they want to re-enter the mobile CPU market and need something to show off.

        • coder543 a day ago

          We already have a good sense of what SoC Nintendo will likely be going with for the Switch 2.

          Being so dismissive of the Switch shows the disconnect between what most gamers care about, and what some tech enthusiasts think gamers care about.

          The Switch 1 used a crappy mobile chip, sure, but it was able to run tons of games that no other Tegra device could have dreamed of running, due to the power of having a stable target for optimization, with sufficiently powerful APIs available, and a huge target market. The Switch 1 can do 90% of what a Steam Deck can, while using a fraction of the power, thickness, and cooling. With the Switch 2 almost certainly gaining DLSS, I fully expect the Switch 2 to run circles around the Steam Deck, even without a “high end chip”. It will be weaker on paper, but that won’t matter.

          I say this as someone who owns a PS5, a Switch OLED, an ROG Ally, and a fairly decent gaming PC. I briefly had an original Steam Deck, but the screen was atrocious.

          Most people I see talking about Steam Deck’s awesomeness seem to either have very little experience with a Switch, or just have a lot of disdain for Nintendo. Yes, having access to Steam games is cool… but hauling around a massive device with short battery life is not cool to most gamers, and neither is spending forever tweaking settings just to get something that’s marginally better than the Switch 1 can do out of the box.

          The Switch 1 is at the end of its life right now, but Nintendo is certainly preparing the hardware for the next 6 to 8 years.

          • qwytw a day ago

            > Being so dismissive of the Switch shows the disconnect between what most gamers care about, and what some tech enthusiasts think gamers care about.

            What makes you think I am? Hardware-wise it's the equivalent of an unremarkable, ancient Android tablet, yet it's pretty exceptional what Nintendo managed to achieve despite that.

            > The Switch 1 can do 90% of what a Steam Deck can

            That's highly debatable and almost completely depends on what games specifically you like/play. IMHO PC gaming and Nintendo have relatively little overlap (e.g. compared to PS and Xbox at least).

            > Steam Deck’s awesomeness

            I never implied that the Switch was/is/will be somehow inferior (besides potentially having a slower CPU & GPU).

            > but Nintendo is certainly preparing the hardware for the next 6 to 8 years

            It's not obvious that they did the first time and they still did fine, so why would they change their approach this time? (Albeit there weren't necessarily that many options on the market back then, but it was still a ~2-year-old chip.)

            • coder543 a day ago

              > IMHO PC gaming and Nintendo have relatively little overlap (e.g. compared to PS and Xbox at least).

              That was true back in the Wii era, because there was nothing remarkable about the Wii apart from its input method. It was "just another home console" to most developers, so why bother going through the effort to port their games from more powerful consoles down to the Wii, where they will just look bad, run poorly, and have weird controls?

              With the Nintendo Switch, Nintendo found huge success in third party titles because everyone who made a game was enthusiastic about being able to play their game portably, and the Switch made that possible with hardware that was architecturally similar to other consoles and PCs (at least by comparison to previous handheld gaming consoles), which made porting feasible without a complete rewrite.

              In my opinion, basically the only console games that aren't available on Switch at this point are the very most recent crop of high-end games, which the Switch is too old to run, as well as certain exclusives. If the Switch were still able to handle all the third party ports, then I don't even know if Nintendo would be interested in a Switch 2, but they do seem to care about the decent chunk of money they're making from third party games.

              The overlap with PC is the same as the overlap between PC and other consoles... which is quite a lot, but doesn't include certain genres like RTSes. They've tried bringing Starcraft to console before, and that wasn't very well received for obvious reasons, haha

              > It's not obvious that they were the first time and still did fine, why would they change their approach this time

              I'm not sure I was saying they would change their approach... the Switch 1 is over 7 years old at this point. I was just saying they're preparing the next generation to last the same amount of time, which means finding sufficiently powerful hardware. The Switch 1 was sufficiently powerful, even for running lots of "impossible" ports throughout its lifetime: https://www.youtube.com/watch?v=ENECyFQPe-4

              All of that to say, I am a big fan of the Switch OLED. I'm honestly ready to sell my ROG Ally, just because I never use it. But, being a bit of a contradiction, I am also extremely interested in the PS5 Pro. My PS5 has been a great experience, but I wish it had graphical fidelity that was a bit closer to my PC... without the inconvenience of Windows. For a handheld, I care a lot about portability (small size), battery life, and low hassle (not Windows, and not requiring tons of tweaking of graphics settings to get a good experience), and the Switch OLED does a great job of those things, while having access to a surprisingly complete catalog of both first-party and third-party games.

              • pjmlp 20 hours ago

                Actually there is one remarkable thing about the Wii, though it hardly matters in this context: it was one of the very few consoles out there that actually had something that relates to OpenGL, namely the shading language and how the API was designed.

                Many keep thinking that Khronos APIs have any use on game consoles, which is seldom the case.

      • dathinab 2 days ago

        > much more popular console

        which isn't a useful metric, because "being a good GPU" wasn't at all why the Switch became successful; you could say it became successful even though it had a pretty bad GPU. Bad only in the performance sense, though: as far as I can tell, back then AMD wasn't competitive on an energy-usage basis and maybe not on a price basis either, as the Nvidia chips were a by-product of Nvidia trying to enter the media/TV add-on/handheld market with stuff like the Nvidia Shield.

        But yes, AMD isn't the only choice. IMHO, contrary to what many people seem to think, Intel is a viable choice too for the price segment most consoles tend to target. But then we are missing the relevant insider information to properly judge that.

  • dathinab 2 days ago

    > Intel has ARC, which presumably could be put in a custom "APU"; however, their track record with that is not stellar.

    I wouldn't exactly agree with that. Arc GPUs aren't really bad; sure, when they were new there were driver issues for quite some time, but those have mostly been ironed out and were more in "expected issues with a first non-iGPU" territory than "Intel being very bad at their job" territory.

    Also, GPUs in consoles (ignoring the Switch) sit in the lower mid-range today, and that's unlikely to change with future consoles, so that is a segment Intel should be able to compete in. I mean, console GPUs are more like big iGPUs than dedicated GPUs.

    The main issue would be that, whether it's Intel, Nvidia or AMD, their drivers have subtle but sometimes quite important differences in performance characteristics, meaning optimizations for one are sometimes de-optimizations for another, plus similar interoperability issues. And those seem more likely with Intel, as there is just much less history between the larger game engines and Arc GPUs.

    So IMHO Intel would have had to offer a lower price to be viable, to compensate for the extra backward-compatibility issues. But if they were in a much better financial situation at the moment, I believe they would have had a good chance of getting it by subsidizing it a bit so they could get a foothold in the market and compete without that drawback next generation.

  • anemic 2 days ago

    Maybe the deal went south because Intel wanted it to be called Playstation 6 with Intel Integrated Graphics.

    And with a sticker on the front, of course.

johnklos 2 days ago

And they rightly deserve to lose the business to AMD.

Intel to Apple: "We're too big to deliver what you want for cell phones." Apple: "Ok. We'll use ARM."

Intel to Sony: "We're too big to commit to pricing, compatibility and volume." Sony: "Ok. We'll keep using AMD."

It's interesting that Intel keeps trying to ship "features", some of arguable utility but others decently helpful, like AVX-512, which AMD now delivers and Intel does not. I'm sure Sony didn't want a processor that can't properly and performantly run older and current titles.
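
For reference, AVX-512 is the sort of thing bulk float math in game and media code can lean on. A minimal, generic sketch (not tied to any particular title or engine):

    // Add two float arrays 16 lanes at a time using AVX-512.
    // Needs a CPU with AVX-512F and something like: g++ -O2 -mavx512f
    #include <immintrin.h>
    #include <cstddef>

    void add_arrays(const float* a, const float* b, float* out, std::size_t n) {
        std::size_t i = 0;
        for (; i + 16 <= n; i += 16) {
            __m512 va = _mm512_loadu_ps(a + i);   // unaligned 512-bit loads
            __m512 vb = _mm512_loadu_ps(b + i);
            _mm512_storeu_ps(out + i, _mm512_add_ps(va, vb));
        }
        for (; i < n; ++i)                        // scalar tail
            out[i] = a[i] + b[i];
    }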

  • tester756 2 days ago

    >Intel to Apple: "We're too big to deliver what you want for cell phones." Apple: "Ok. We'll use ARM."

    Reality:

    “We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we’d done it. The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do… At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn’t see it. It wasn’t one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.”

    • fhdsgbbcaA 2 days ago

      This is from the horse's mouth, and reliable as such. However, it does give the impression that they weren't sufficiently interested to think more creatively about cost optimization, because they were riding the gravy train of Wintel ruling the world. So I think the root comment isn't too far off.

      • MBCook 2 days ago

        Right. It’s an accurate quote but that doesn’t mean it’s an accurate analysis.

        Not only did they not seem to understand the possibilities in front of them, their chips were not well positioned at all to win. They were too hot and too power-hungry because Intel didn’t care much about efficiency at the time.

        They were taking the "shrink a big chip" path. Apple, using ARM from Samsung and then their own designs, ended up taking the "grow a little chip" path.

        Which is a little bit ironic, because Intel made their fortune on the "little" desktop processor that grew up to take over all the servers from mainframes and the "big boy" server chips like SPARC and Alpha.

        They became the big boys and history started repeating.

        • nxobject 2 days ago

          I'm surprised they didn't learn the same lesson from the P4/NetBurst vs. Pentium M/Banias fiasco: the smaller but scalable architecture somehow always wins – first in power/perf, and then more generally.

          (Actually, I need to check the timing of whether the "oh shit" moment for NetBurst happened before or after the development of the iPhone...)

          • MBCook 2 days ago

            The Core line (2006), when they started to swing back away from "make fast furnaces", came just one year before the iPhone (2007). So the NetBurst debacle had already happened.

            But that was desktops. I wonder if they really realized how much of a problem that was in mobile. I also think I remember a discussion of that quote from a few weeks ago where someone said the real problem for Intel in the iPhone wasn't heat but power draw.

            I don’t think they ever really got the religion. Apple’s M1 sort of seems like a repeat of this whole thing. Intel still didn’t get it at that point. Still too hot. Still not efficient enough.

            The switch from NetBurst to Core seems more like a direction switch because they hit a wall, not a recognition of what the problem actually was. A change from ultra-fast single core to fast multi-core.

            • mjevans 2 days ago

              The NetBurst line also had a _terribly_ deep pipeline. I can't remember the number of stages offhand, but it was _massive_ for the era, in an attempt to keep growing single-core, single-state-machine performance (more MHz). Pipeline stalls made for very erratic and very power-hungry bursts of performance, followed by rewinding CPU state to take the correct branch.
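
              If memory serves (treat these as rough numbers), Willamette was around 20 stages and Prescott around 30, versus roughly a dozen for its contemporaries, so every branch mispredict threw away on the order of 20-30 cycles of in-flight work before the pipeline could refill.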

      • silvestrov 2 days ago

        > no one knew what the iPhone would do

        When you are the CEO of Intel you should be able to see/forecast what smartphones would do in the market.

        The iPhone wasn't completely new. Nokia already had some "little smart" phones on the market.

        The only real surprise was Apple's ability to get a US phone company on board with selling the iPhone while giving up its grip on what software was installed on the phones.

        • polar 2 days ago

          > Nokia already had some "little smart" phones on the market already.

          So did other hardware/software vendors, and many of them were a lot smarter than the iPhone.

    • windowsrookie 2 days ago

      Intel made ARM chips, then sold that portion of the company in 2006, shortly before the iPhone was announced.

      https://en.wikipedia.org/wiki/XScale

      It was incredibly bad timing. If intel had continued making ARM chips they could be in an entirely different position today.

      • tester756 a day ago

        >It was incredibly bad timing. If intel had continued making ARM chips they could be in an entirely different position today.

        How so?

        ARM (ISA) doesn't imply performance characteristics nor significant advantage over x86

    • toast0 2 days ago

      IMO, more interesting than Intel not doing the iPhone is Intel ending atom for phones right before Microsoft demoed Continuum for Windows Mobile 10. That would have been a much different product on an x86 phone, IMHO. Maybe it would have been enough of an exciting feature that Microsoft would have not botched the Windows Mobile 10 release.

    • klelatti 2 days ago

      Otellini was not a dispassionate observer at the time he said this and there are very good reasons to believe that isn’t an accurate portrayal of what happened - including the fact that Otellini had just sold Intel’s smartphone SoC business and no x86 design was remotely suitable.

      https://thechipletter.substack.com/p/how-intel-missed-the-ip...

    • jiqiren 2 days ago

      The key in this quote is: "in hindsight, the forecasted cost was wrong"

      100% intel screwup.

epolanski 2 days ago

Not sure the title has the right framing.

It's hard to compete with AMD, which is the only tech company offering both x86 and a solid GPU technology to go with it.

On top of that you have backwards compatibility woes and the uncertainty around Intel being able to deliver on its foundry.

All in all, this win would've been a great deal for Intel's foundry in PR, but money wise those were never going to be huge sums.

  • jeroenhd 2 days ago

    Intel's Arc GPUs are quite competent (especially with the highly necessary driver updates). If Battlemage fixed the hardware scheduling design flaw, Intel has a decent shot at competing with AMD.

    If AMD continues to lose ground on the desktop market and Intel continues to advance with Arc, there's a chance the PS6/Xbox Series 360 will run on Intel instead of AMD.

  • 0xcde4c3db 2 days ago

    Backward compatibility guarantees are a significant factor, I think. A lot of the QA process for console games is predicated on testing against a fixed set of hardware configurations, and various race conditions and other weirdness can start crawling out of the woodwork even with modest changes. This has been seen on many games running on emulators, on hacked console firmwares that allow overclocking (e.g. by running the CPU at the "native" clock speed in backward compatibility mode), or with framerate unlocking patches.
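
    A concrete (and entirely hypothetical) illustration of the framerate-unlock case: logic that bakes the console's fixed frame time into its simulation step goes wrong the moment the loop runs faster than the hardware it was tuned for.

        // Hypothetical sketch, not from any real title: the per-frame step
        // assumes the console's original 30 fps. Unlock the frame rate (or
        // run the loop faster on BC hardware) and everything driven this
        // way happens twice as fast.
        #include <cstdio>

        int main() {
            const float ASSUMED_DT = 1.0f / 30.0f;  // baked-in frame time
            float door_angle = 0.0f;                // degrees

            for (int frame = 0; frame < 60; ++frame) {  // imagine vsync drives this
                door_angle += 90.0f * ASSUMED_DT;       // 90 deg/s * assumed dt
            }
            // At a true 30 fps these 60 frames take 2 seconds; at 60 fps they
            // take 1 second, so the door swings open twice as fast as intended.
            std::printf("door at %.1f degrees after 60 frames\n", door_angle);
            return 0;
        }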

  • hypercube33 2 days ago

    AMD also has a track record with Sony and consoles in general, dating back to the GameCube, of delivering successfully. Maybe not the fastest thing, but one that works and is reliable. Nvidia, IBM and Intel don't exactly deliver on the full suite either.

  • ChocolateGod 2 days ago

    > both x86 and a solid GPU technology that comes with it

    If only Project Denver had kept its original goal

    • wmf 2 days ago

      Transmeta and Denver never had great performance. If you want an x86 CPU it's so much safer to go with AMD.

      • pinewurst 2 days ago

        Plus Denver was constrained in x86 compatibility by Intel patents.

whalesalad 2 days ago

Intel hasn't made a console CPU/GPU since... the original Xbox?

AMD has done: GameCube, Wii, Xbox 360 (GPU, not CPU), Xbox One, PS4, PS5 ...

  • snitty 2 days ago

    Just to clarify, the GameCube and Wii CPUs were IBM PowerPC chips. The GPU in both was from ATI, which was later acquired by AMD.

    • christkv 2 days ago

      I think the GameCube GPU was from ArtX, which was bought by ATI, and then ATI was bought by AMD.

      • aappleby 2 days ago

        I was there, this is correct.

    • whalesalad 2 days ago

      Ya, should have clarified GPU-only. I'm lumping ATI in with AMD.

  • kcb 2 days ago

    Even then the GPU was Nvidia.

mastazi 2 days ago

According to the writer everything in tech is AI. It bothers me and makes it difficult to take the article seriously.

> Similar to how big tech companies like Google and Amazon rely on outside vendors to help design and manufacture custom AI chips

> Having missed the first wave of the AI boom dominated by Nvidia and AMD, Intel reported a disastrous second quarter in August.

  • MBCook 2 days ago

    > It bothers me and makes it difficult to take the article seriously.

    But if you’re in the chip game, AI is the big thing of the last 10 years. It’s driven a huge chunk of new sales and demand for upgrades beyond what they likely would have seen otherwise.

    Having missed out on AI in many ways (nVidia was perfectly positioned, AMD better than Intel) they need stuff to keep growing.

    Their current business is looking shakier than any time in recent history. ARM is getting pretty realistic on the desktop. Apple proved it and now Samsung and Qualcomm have parts for Windows users that perform well enough (compared to the failure of early ARM on Windows).

    They’re behind on selling silicon for AI to business and it’s not clear consumers care enough to upgrade their PCs. And when consumers upgrade they have not only great options from AMD, doing better than ever, but the ARM threat.

    They’re being squeezed on all sides. The PS6 wouldn’t make them dominant but it would have been a very steady and reliable revenue stream for years and a chance at parlaying that into additional business. “See what we did for Sony? We can do that for you.”

    The article seemed rather well done to me. I think you’re being too dismissive in this case.

    • mastazi 2 days ago

      > But if you’re in the chip game AI is the big thing of the last 10 years.

      IMHO, AMD having done well despite being woefully unprepared for the recent AI wave suggests that AI is not the only big thing

      (edit: grammar)

  • deelowe 2 days ago

    I mean, it's very likely next gen consoles will feature AI hardware. The PS5 pro is already touting it.

    • xcv123 2 days ago

      The GPU is "AI hardware", and current PS5 already has it.

      • j_maffe 2 days ago

        Custom architecture optimized for ML is a thing.

      • deelowe 2 days ago

        That's a bit reductionist.

apexalpha 2 days ago

Title is a bit weird; AMD has been the supplier for PS4 and PS5 already and will continue to supply the PS6.

I guess Intel lost the bidding process but they never had the 'Playstation business' in the first place.

Nevertheless, an interesting read.

eigenform 2 days ago

I wonder if Sony having to adapt their DRM/platform security strategy into Intel-world would've introduced a lot of friction.

This kind of thing is probably part of the motivation behind Intel splitting out a "Partner Security Engine."

nottorp 2 days ago

> Intel and AMD were the final two contenders in the bidding process for the contract.

That's an interesting question. Will either Sony or MS break backwards compatibility by going away from x86 again in the future? Definitely not with the next console generation.

On the CPU side, MS does have good x86-on-ARM emulation from their brand-new Windows on ARM, so it's conceivable. Not sure how bad it would be on the GPU side.

  • ThatPlayer 2 days ago

    Games aren't running on GPUs directly; they're using APIs like DirectX for Xbox. As long as the GPU implements the APIs properly it should be fine. RISC-V Linux with a desktop PCI-E AMD GPU and Linux kernel drivers can run games already: https://youtu.be/qHLKB39xVkw, limited by the power of the RISC-V CPU here.

    I'm wondering if they would still aim for a single chip when moving to ARM. AMD (and Intel) don't make ARM chips. Nvidia does, and is probably what the next Switch will use. Qualcomm does have proper DX12 support on Windows ARM, but who knows how that'll scale since they make mostly mobile GPUs. Intel had similar problems scaling their iGPUs for Arc.

langsoul-com 2 days ago

Why did AMD win the console business? It seems that even though they weren't number 1, they were always in most consoles.

  • JonChesterfield 2 days ago

    I believe x64 won the PS4 era because game devs were deeply sick of targeting a special-purpose architecture for the console and also x86 for the PC port. At the time all desktop computers for games were x86-based IBM-clone things.

    I don't know why AMD ended up with the PS4 and the Xbox as opposed to Intel getting either, but x86 was probably inevitable. I wonder if these days something architecturally similar to the Mac ARM systems would be a reasonable alternative.

    • MBCook 2 days ago

      You’re right they didn’t want to target weird things like the PS3 anymore, that was a huge pain.

      But you have to remember the other side: there were no other options than x86.

      ARM wasn’t powerful enough for one of the high-end consoles at the time. Trying to have such a chip designed would be a lot more expensive than just choosing a premade design and tweaking it a little. It worked for Nintendo, but they had different goals. I’m not sure ARM could’ve been used for the PlayStation 4 or Xbox One.

      The previous supplier, IBM PowerPC, had basically given up. Apple switched off them for the same reason. It wasn’t getting much faster and IBM only really seemed interested in server chips. I think it’s reasonable to assume they wouldn’t have tried very hard to win a chance to make a faster console chip.

      If you don’t want to design your own, that’s all the major players. x86-64 is all that’s left.

      Which is not to say that was a bad option. Developers are extremely familiar with it, there’s a metric ton of tools available, it makes game porting to and from PCs easier, it was the highest performance option, and there are two big suppliers that you can play against each other. So even if you went with Intel and something happened you could switch to AMD.

      When IBM decided they didn’t care about the market, you just had to leave the PowerPC.

      Why AMD over Intel? They were probably hungrier since they were in second place. They had a competitive GPU business, which Intel didn’t. Single supplier + they could do everything on one chip. And if AMD makes both parts they can help optimize the hell out of it.

      Microsoft got screwed by nVidia on the XBox. I don’t think they’d want to do that again. Sony would absolutely know that happened and be wary.

      Honestly it’s not clear to me that nVidia cared too much. But maybe I’m just reading it wrong. Nintendo went with them because they had an all-in-one system-on-a-chip that they were willing to dump for cheap and that, thanks to its mobile heritage, developers were already familiar with.

      Also since AMD was the little guy (compared to Intel) they could really use the sales and the revenue. It would be a bigger percentage of their total income than Intel, meaning it was more important to win that contract.

      So in the end I think it makes a lot of sense that PlayStation and Xbox ended up in AMD land.

    • philistine 2 days ago

      > [..] something architecturally similar to the mac arm systems would be a reasonable alternative.

      It's called the Nintendo Switch, and it's the second-best-selling console of all time.

      • MBCook 2 days ago

        It was also targeting a very different capability level from the PS4/XBox One.

    • someNameIG 2 days ago

      ARM would be a reasonable alternative. Unity/UE5 already support it due to mobile and the Nintendo Switch, and consoles are usually more power/thermally constrained than desktops, so ARM in many ways would be a better alternative than x64.

      Plus PlayStation is big enough that if they went ARM, game devs would have to follow.

      • MBCook 2 days ago

        But were there fast/powerful enough ARM chips to be competitive with what ended up in the PS4/XBox One?

        They certainly exist today. But could Sony and Microsoft have chosen them or would they have had to have them developed?

        • someNameIG 2 days ago

          I'm not sure; the Jaguar CPUs in them were pretty underpowered at the time too, at tablet/netbook level. I think they were in some ways a bit of a downgrade in CPU performance compared to the Cell in the PS3.

          The biggest issue at the time would have been finding an ARM CPU paired with a decent GPU, if they wanted an SoC instead of having them on separate dies.

  • netcoyote 2 days ago

    This is just a hypothesis, but I wonder if it’s simply that AMD was willing to accept lower margins to keep their business going, whereas INTC wasn’t willing to compete because they were comparing the contract to the higher-margin sales they made in the PC business?

    I mean, sure, technical issues and such too, but mature businesses have a hard time accepting lower margins because it hurts their stock market metrics.

jandrese 2 days ago

It's a little vague what the "6 chips" would have been. CPU obviously. Probably some southbridge equivalent, but then what? A NIC? Was Intel going to supply the graphics chip too? That would have been a real turnaround for their GPU division.

  • bhouston 2 days ago

    There weren't 6 chips. I believe it refers to "PlayStation 6", the version that comes after "PlayStation 5".

    I think the main thing Intel was competing for was CPU + GPU, given they have the new Xe graphics architecture, which is decent. But I guess they could have just gone after the CPU, with NVIDIA then likely supplying the GPU as it did in the first Xbox.

    • jandrese 2 days ago

      Oh derp. You are right. I totally misread that.

    • deelowe 2 days ago

      I bet Intel's hubris is too strong to allow them to do that these days.

andrewstuart 2 days ago

Everything is about GPUs these days.

Be it little GPUs inside the CPU package or be it consumer GPUs or big GPUs in data centers.

Unless Intel can start to get its GPU act together, it won't be leading the industry again in a hurry.

  • MBCook 2 days ago

    They’re getting better. But as the article mentions, backwards compatibility is a huge lock-in factor. It’s way easier for AMD to achieve it than it would be for Intel (or anyone else) following the PS5.

criticalfault 2 days ago

If we are talking foundry, Intel could still manufacture AMD chips in their own fabs instead of TSMC. Given that 18A is good...

This would be strange, but it would show Intel will do what it takes.

  • bgnn 2 days ago

    They have to do that to stay in the fab business. Their design business is gonna be sold off I think, like HP/Agilent.

    • ac29 2 days ago

      Intel announced today they are moving their foundry to a subsidiary, so if anything it's the fab business that will be spun out.

motbus3 a day ago

Since Intel got fat government and cloud contracts, nothing else matters.

voytec 2 days ago

s/Playstation/PlayStation 6/ - next generation

lapinovski 2 days ago

playstation 6 already?

  • wmf 2 days ago

    It will probably be released around 2027-2028.

    • MBCook 2 days ago

      And with the PS5 Pro announced we know the PS5 is basically dead inside Sony’s engineering organization.

      Other than a die shrink or chip reduction or something there’s not much else to do. The designs are basically done.

      So all hardware work would currently be focused on the PS6. At least for the home console line.

      • Narishma 2 days ago

        The PS5 is definitely not dead, it's their main console. The Pro is a niche machine targeted at wealthy enthusiasts. It will only do a fraction of the regular PS5 sales numbers.

        • j_maffe 2 days ago

          I think GP means dead as in there'll be no more design efforts targeted at it.

          • MBCook 2 days ago

            Right. It will live on. I love it, plan to buy a Pro.

            But for their hardware engineers they’re likely done and moved on.

            • pjmlp 2 days ago

              It will be like the PS4 Pro, something to capitalise on hardcore customers and that is it.

              However, most games nowadays can't even take full advantage of the PS5 and Xbox Series X, so it is quite questionable what they will do with the next generation.

              The Switch might be old hardware, but it clearly shows how well a console can sell when more attention is given to gameplay than to polygons per second.

jheriko 2 days ago

non-news.

signed. gamedev.