PDA

View Full Version : PS4 vs Xbox One



Winjer
31-10-14, 12:05
This topic arrives a bit late, but after seeing this article on Techspot, I thought it would be interesting to discuss it a little.

The Cornerplay: Xbox One's struggles are traceable to one bad decision (http://www.techspot.com/news/58624-cornerplay-xbox-one-struggles-traceable-one-bad-decision.html)



Microsoft made two important announcements recently. The first was about Office 365 becoming a game changer (http://cornerplay.com/2014/10/29/microsoft-office365-onedrive/). Today, I'll touch on the $50 price drop for the Xbox One. From November 2nd to January 2nd, you can get any Xbox One SKU for $50 off, which makes the entry-level version $350. That's cheaper than the Playstation 4 at $400. Microsoft is marketing this as a temporary promotion for the holidays, but that's just marketing. I have a hard time believing the Xbox One will go back up to $400.
The price drop is long overdue. The Playstation 4 is outselling the Xbox One by a significant margin -- Ars Technica (http://arstechnica.com/gaming/2014/10/analysis-worldwide-ps4-sales-at-least-40-percent-better-than-xbox-one/) estimates by at least 40% -- and the entire gap can be traced to one crucial decision. That's how thin the line is between success and failure in the console market. You can have a fantastic brand, recruit third-party support, obtain exclusives, introduce innovations, ensure wide distribution, spend a lot of money on marketing... and still fail because of one bad decision.
That's not to say Microsoft hasn't made its fair share, but can you guess which bad decision I'm referring to? It wasn't bundling the Kinect, though that was rough because of the $100 price premium. It wasn't the DRM policies, or the always-online requirement either. I think Microsoft reversed those early enough.
It was Microsoft's decision to go with 8GB of DDR3-2133 RAM plus 32MB of eSRAM for the Xbox One, while Sony opted for 8GB of 5500 MT/s GDDR5 RAM for the Playstation 4. This was terrible judgment on Microsoft's part, and if they lose the console war they can point to that decision as the cause.
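As a quick sanity check on those memory figures (a minimal sketch; the 256-bit memory interface on both consoles is the commonly reported spec), peak bandwidth is just transfer rate times bus width, which reproduces the familiar 68GB/s and 176GB/s numbers:

#include <cstdio>

// Peak theoretical bandwidth = transfers per second * bus width in bytes.
// Both consoles use a 256-bit (32-byte) memory interface.
int main() {
    const double busBytes = 256.0 / 8.0;               // 32 bytes per transfer
    const double xboxDdr3 = 2133e6 * busBytes / 1e9;   // DDR3-2133  -> ~68 GB/s
    const double ps4Gddr5 = 5500e6 * busBytes / 1e9;   // GDDR5-5500 -> ~176 GB/s
    std::printf("Xbox One DDR3: %.1f GB/s\n", xboxDdr3);
    std::printf("PS4 GDDR5:     %.1f GB/s\n", ps4Gddr5);
}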

What's the difference between these two memory systems? The bottom line is that the Playstation 4 is effectively more powerful than the Xbox One because of it. Microsoft will argue that the gap can be lessened with smart use of the embedded SRAM, a marginally faster CPU and built-in cloud capabilities, but the truth is that few if any third-party developers will go to those lengths to optimize for the Xbox One.
It's easier to simply lower the game's resolution on the Xbox One and call it a day.
You can see why this is a huge problem for early adopters and game enthusiasts.

Two choices: One costs $400 and has better graphics, the other costs $500 with extra hardware you may want but don't need. Exclusives are a wash. Which would you choose?
Two choices: Both cost $400 except one has better graphics. Exclusives are still mostly a wash. Which would you choose?
Most will choose the machine with better graphics.
Xbox fans will argue the visual differences are imperceptible. That may be true but beside the point -- perception is reality and the perception is that the Playstation 4 is more powerful than the Xbox One.

The problem goes deeper for Microsoft.
Let's say you purchased both consoles. A big third party game like Assassin's Creed comes out and the Playstation 4 version runs at a higher resolution. Do you get it for the Playstation 4 or the Xbox One? Most will choose the former.
The game's publisher sees that the Playstation 4 version sold better and concludes it should put more resources behind the winning platform. Those resources translate into better games, which gives consumers even more reason to choose Sony. Microsoft, meanwhile, earns less software revenue to offset its loss leader: the console itself.
It's a death cycle that Microsoft is in danger of falling into.
Those who follow the console market will point out that weaker machines haven't always lost. Indeed, the Sega Genesis held up admirably against the Super Nintendo, as did the Xbox 360 against the Playstation 3. There's one key difference however. Those weaker machines also cost less to make and thus were sold at cheaper prices. Gamers don't mind weaker machines as long as they are cheaper as well.

Unfortunately, the Xbox One costs just as much to manufacture as the Playstation 4. Research firm IHS estimated (http://press.ihs.com/press-release/design-supply-chain/microsoft-xbox-one-hardware-cost-comes-below-retail-price-ihs-tear) that the core of the console, what's responsible for the graphics (CPU, GPU, RAM and other electronics), costs the same for both the Xbox One and the Playstation 4: $263.
How is that possible? Again, it comes down to memory architecture. The eSRAM on the Xbox One means a larger die is required (die size is a huge driver of silicon cost) -- and that negates the cost advantage of the Xbox One's cheaper, less powerful DDR3 memory. As Anandtech (http://www.anandtech.com/show/7528/the-xbox-one-mini-review-hardware-analysis/2) pointed out:
It turns out that Microsoft’s silicon budget was actually a little more than Sony’s, at least for the main APU. The Xbox One APU is a 363mm^2 die, compared to 348mm^2 for the PS4’s APU. Both use a similar 8-core Jaguar CPU (2 x quad-core islands), but they feature different implementations of AMD’s Graphics Core Next GPUs. Microsoft elected to implement 12 compute units, two geometry engines and 16 ROPs, while Sony went for 18 CUs, two geometry engines and 32 ROPs. How did Sony manage to fit in more compute and ROP partitions into a smaller die area? By not including any eSRAM on-die.

So why would Microsoft opt for a system that costs the same but is less powerful? At the time, there were concerns about the availability of GDDR5, the kind of memory used in the Playstation 4. DDR3 memory, by comparison, was readily available. Microsoft worried it wouldn't be able to secure enough GDDR5 supply and thus opted for DDR3, with eSRAM to make up the difference.
Fortunately for Sony, GDDR5 supply didn't turn out to be a bottleneck. In fact, Sony was able to launch in many more countries with more consoles ready for sale than Microsoft. Unfortunately for Microsoft, it also meant it had a console that appeared weaker graphically -- a sin for hardcore gamers, who are the first to buy new, expensive consoles.
Microsoft can't cut the price much further, either -- not without losing a lot of money, a scenario it wants to avoid. Shareholders and financial analysts rightly view the Xbox as adjacent to Microsoft's mobile-first, cloud-first strategy; why then invest (http://metro.co.uk/2014/02/11/investors-call-for-microsoft-to-abandon-xbox-4299255/) so much in a potentially unprofitable business?
And that, dear readers, is why the Playstation 4 is ahead of the Xbox One. It reminds me of Al Pacino's speech in the movie Any Given Sunday. In the console wars, it truly is a game of inches.


Can you imagine if Microsoft had just opted for GDDR5 memory? The Xbox One and the Playstation 4 would have the exact same hardware. There would be no "resolutiongate", and it wouldn't have been so easy for hardcore gamers to choose which to support. It would be a race to secure the better exclusives, provide a better network environment, and so on. Sure, Microsoft would still have made its earlier snafus, but those were all reversible. A bad hardware decision is not.
For Microsoft to make a comeback, it must sell the Xbox One at a lower price than the Playstation 4. That means Microsoft will make dramatically less money even if it does win out, but that's the price you pay for bad decisions.

Minion
31-10-14, 12:09
That's not even up for debate; the PS4 is bad, but it's still one step ahead of the Xbox.

Jorge-Vieira
31-10-14, 12:11
I had already posted this news in the Xbone topic.

Viriat0
31-10-14, 12:17
Consoles without their exclusive games are useless!

tiran
31-10-14, 12:24
Consoles...

https://i.imgur.com/CakjE.png

Winjer
31-10-14, 12:25
The topic was really meant to discuss the hardware more than the games themselves.

Are there really that many people who look at two consoles at the same price and choose the one with better performance and graphics?
It's just that right now there aren't many exclusives on either console for that to be a deciding factor.

Viriat0
31-10-14, 12:34
It's just that right now there aren't many exclusives on either console for that to be a deciding factor.

Forza and Gran Turismo. I think most people choose the PS, and it's all down to the marketing Sony does. "Playstation" and "PS" are on everyone's lips.

The last console I bought was a PS2. I had a PS3, but it came from a trade, and I ended up swapping it for a DSLR a good few years ago.
I also had an Xbox, a 360...


Cheers

Jorge-Vieira
31-10-14, 13:21
The topic was really meant to discuss the hardware more than the games themselves.

Are there really that many people who look at two consoles at the same price and choose the one with better performance and graphics?
It's just that right now there aren't many exclusives on either console for that to be a deciding factor.

I think performance and graphics are a deciding factor for anyone buying a console who has some knowledge of the subject.
Around here it's Sony; like it or not, Sony is a very well-known brand, and together with the marketing that always carries weight in the choice. With a superior console, there aren't many reasons to go for the Xone.

Then exclusive games like GT also carry weight, and there, once again, the PS has the advantage.

Other than that, in hardware terms both are weak compared with any mid-range PC. Could they have been better? Yes, they could; more advanced hardware existed at the time of development. Why that wasn't done... I don't know; cost, perhaps.
The worst of it all is the delay this is causing and the ridiculous scenario the devs have fallen into, with claims about 30fps and "cinematic quality".

Winjer
31-10-14, 13:29
Yes, it was a question of cost.
Four years ago we already had GPUs more powerful than the Xbox One's, with DX11 capabilities and memory with the bandwidth they use now.
PC gamers have had the performance level of an Xbox One or PS4 for almost half a decade. The difference is that, apart from a dozen games, that power was either wasted or spent on AA, filters, PhysX and a few extra effects.

LPC
31-10-14, 13:29
Hi!
Terrible hardware for a "next gen" console. Both of them use fairly weak APUs.

They should have been bolder and fitted a more powerful configuration from the start, even if it made the console more expensive.
Visual quality is everything when it comes to marketing and sales...

Nobody watches the trailers for the gameplay; what matters is the WOW factor.

Of a bad bunch, the PS4 is the least bad...
Still, it's ridiculous that new-generation consoles struggle to reach the now-old Full HD standard, which is getting on for 10 years old...

I wouldn't buy either of them...

Regards,

LPC

Viriat0
31-10-14, 13:34
Our bottleneck as PC users is the consoles!

Winjer
31-10-14, 13:41
Bottleneck is an understatement; it's more like a ball and chain, prison-style.

Right now we see the PC pushing complex effects like FlameWorks, PhysX, TressFX, HairWorks, 4K and 120/144Hz, while the consoles can barely hit 1080p at 30fps.

Jinx
31-10-14, 13:47
Why improve performance if it keeps selling well? That's all that matters!!
It's all a numbers game...
People who like consoles keep buying the new models,
even when the performance doesn't justify the investment compared with the previous one...

Jorge-Vieira
31-10-14, 14:23
If performance doesn't improve, and we already have miserable games on the PC and ridiculous claims, what will it be like a year from now?
These consoles are the worst thing the computing world, and gamers in particular, have ever seen.
Either of the two is terrible when you compare the graphics effects a mid-range PC can manage against either console.
There's no justification, even on cost, given that much better hardware existed, far better prepared for the future.
Consoles are bought by schoolkids... a true gamer has the best gaming platform there is: a PC.

Winjer
31-10-14, 16:29
Fortunately, many gamers have seen the light and are starting to move to the PC.
Right now the PC gaming market is already worth more than the console market: PS4, PS3, Xbox One, Xbox 360 and Wii U all combined.
But there are still game genres that are more popular on consoles, which means the PC comes second for some companies.

If the PC market keeps growing, it may force many companies to shift their focus from consoles to the PC and, as a result, make better games.
It may also force MS and Sony into a shorter life cycle, launching new consoles within 5 years or less.

Jorge-Vieira
31-10-14, 16:54
What feeds most console sales right now is games like FIFA and PES, and GT-style games.
Beyond that, the PC is superior in everything: hardware, games and exclusives.
What worries me is the lifespan these consoles were designed for: 8 years!!! If they're already in this state after so little time, and what reaches the PC already arrives cut down, what will it be like when we have more graphics processing power, DX12...

The lifespan of these consoles urgently needs to be revised, with a new, more capable generation launched quickly, one prepared to hold up for a few years without major problems. It's not easy, because those who bought the current consoles will feel cheated, but the pace of the industry demands it, and depreciation is a constant in this field; just look at what happened recently with nVidia's latest graphics cards.

Winjer
14-12-14, 17:40
Which is the better media player? PlayStation 4 and Xbox One revisited (http://www.eurogamer.net/articles/digitalfoundry-2014-which-is-the-better-media-player-ps4-xbox-one-revisited)

Winjer
11-01-15, 12:24
A good read on how the Xbox One is evolving.

The evolution of Xbox One - as told by the SDK leak (http://www.eurogamer.net/articles/digitalfoundry-2015-evolution-of-xbox-one-as-told-by-sdk-leak)
It should be said that the Xbox One has always had a slight CPU advantage over the PS4, from having a few extra MHz and from using DDR3 with lower latency than the PS4's GDDR5.
But with the freeing up of the X1's 7th core, we should see this difference grow.
In games that are CPU-bottlenecked, the Xbox One may show better performance than the PS4.

Dape_1904
11-01-15, 12:28
And with DX12 helping the console's CPU even more, that could become even more evident.

Winjer
11-01-15, 12:37
DX12 is just a low-level API. Sony also has a low-level API on the PS4.
That's not where the X1 gains an advantage.

Winjer
14-01-15, 11:11
Xbox One SDK & Hardware Leak Analysis CPU, GPU, RAM & More Part One – Tech Tribunal (http://www.redgamingtech.com/xbox-one-sdk-hardware-leak-analysis-cpu-gpu-ram-more-part-one-tech-tribunal/)

When the hacker group H4LT leaked the Xbox One's SDK and its accompanying documentation, we gamers and journalists were given a fantastic insight into the hardware and software Microsoft's Xbox One is comprised of. When the Xbox One's SDK leak first hit, gaming news headlines primarily focused on the revelation that the seventh CPU core of the Xbox One (well, up to 80 percent of it anyway) was now usable by game developers. This change further extended the CPU performance lead the Xbox One has over Sony's Playstation 4 (thanks to the Xbox One's higher CPU clock speed). But in reality, there's a lot more revealed inside the documentation than just that. For example, if you've ever wondered why Xbox One games are more likely to experience frame rate drops while an "Achievement Unlocked" notification pops up on screen, you'll have your answer soon enough. It's our mission, starting with this first in a series of articles, to take you through the various improvements and changes in the Xbox One's architecture, SDK and development cycle; explaining the language and providing insights into Compute, eSRAM usage, APIs and just about everything else that makes Microsoft's next-gen console tick.
If you're intimately familiar with the Xbox One's hardware specs, feel free to skip this paragraph – if you're not, we'll go over a basic crash course. The Xbox One uses a custom-built 28nm APU, co-designed by AMD and Microsoft. In this APU package sits a plethora of different components, including an x86-64 CPU (the AMD Jaguar, with eight cores) running at 1.75GHz. For all intents and purposes, at the clock speed Microsoft is running it, the CPU puts out 112 GFLOPS of computing power (total performance, across all eight cores). If we count only the performance available to developers, however, the number goes down to about 95 GFLOPS for the Xbox One. The eSRAM is also included on-die, and is used by the Xbox One's GPU like a fast cache. Despite the theoretical peak of the eSRAM's performance hitting 200GB/s, in the real world Microsoft suggests you assume you'll be rendering a scene with 102GB/s available – though your mileage may vary by up to an additional 20 – 30 percent. An AMD-based GPU, featuring 12 GCN compute units running at 853MHz, provides the Xbox One's graphics. Finally, off-package, there's the 8GB of DDR3-2133 RAM, giving up to 68GB/s of memory bandwidth.
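Those CPU numbers check out if you assume 8 single-precision FLOPs per cycle per Jaguar core (an assumption on my part, but one consistent with the figures quoted above):

#include <cstdio>

// Reproducing the article's CPU figures: GFLOPS = cores * GHz * FLOPs/cycle.
int main() {
    const double perCore = 1.75 * 8.0;  // 14 GFLOPS per core at 1.75GHz
    std::printf("All 8 cores:          %.0f GFLOPS\n", 8.0 * perCore); // 112
    std::printf("Games (6 + 0.8 of 7): %.1f GFLOPS\n", 6.8 * perCore); // 95.2
}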
http://media.redgamingtech.com/rgt-website/2015/01/xbox-one-operating-system-architecture-diagram-sdk-leak.jpg (http://media.redgamingtech.com/rgt-website/2015/01/xbox-one-operating-system-architecture-diagram-sdk-leak.jpg)
Dual Graphics Drivers & Operating System

The Xbox One doesn't run a single operating system; rather, three OSes operate simultaneously. The "Host OS" is a hypervisor: it is relatively lightweight and controls and runs the other two operating systems. Microsoft labels these ERA (Exclusive Resource Allocation) and SRA (Shared Resource Allocation). As one can see in the diagram, the Exclusive Partition is what eats up the bulk of the Xbox One's resources, and is the OS responsible for running games. The Xbox One's Shared Partition (which runs on a Windows 8 core) meanwhile runs other functions, such as system services. These range from being able to message your friends to performing background updates and other functions which aren't related to the game.
The ERA has multiple states, as we discussed back in June 2014 with our Microsoft Developer Day (http://www.redgamingtech.com/inside-xbox-one-amd-microsoft-developer-day-analysis-breakdown-tech-tribunal-part-1/) analysis. "Full Screen is the first, meaning all of the resources are available to games, and this applies even when the application is 'snapped'. The second is 'constrained' – while the RAM allocation doesn't shrink, CPU and GPU resources are reduced slightly; as there's no user input to the game, a slight drop in performance won't impact things. Finally, there's suspended. This state means the game is effectively halted on the CPU and GPU (it's using zero resources), but it's still resident in memory and using the same amount of RAM."
If you've played an Xbox One title, you'll notice one area where the game can experience slowdown (frame-rate drops) is when you earn an Achievement. Microsoft specifically points the finger at this in its own documentation: "Cross-OS calls can take a significant amount of time, up to several milliseconds," says Microsoft. This means you can't (or at least, it's not good practice to) call these functions multiple times per frame.
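One plausible way to live with that constraint is to funnel cross-OS work through a small per-frame budget rather than calling it inline. The sketch below is purely illustrative – CrossOsCallQueue and its methods are hypothetical, not an XDK API:

#include <functional>
#include <queue>

// Hypothetical pattern: defer expensive cross-OS calls out of the hot path
// and drain at most a couple per frame, so a burst of achievement unlocks
// can't inject several milliseconds into a single frame.
class CrossOsCallQueue {
    std::queue<std::function<void()>> pending_;
public:
    void Enqueue(std::function<void()> call) { pending_.push(std::move(call)); }

    // Call once per frame with a small budget.
    void DrainSome(int maxCalls) {
        while (maxCalls-- > 0 && !pending_.empty()) {
            pending_.front()();   // the actual multi-millisecond cross-OS call
            pending_.pop();
        }
    }
};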
http://media.redgamingtech.com/rgt-website/2015/01/branching-vs-integer-mask.jpg
Xbox One has dual graphics drivers? You might ask yourself – and indeed the answer is yes. Remember that the GPU (Graphics Processing Unit) inside the Xbox One is a Radeon video card using the GCN architecture. When the first development kit PCs rolled out of their alpha states, back in April 2012, Microsoft was using a "Generic DirectX 11 Manufacturer Supplied Driver". But it had also included a "Specific Graphics Driver developed by Microsoft". A few months later, in July 2012, Microsoft had gotten a lot of the Durango User-Mode Driver (UMD) functional. This release of the driver supported important features such as tessellation. The driver slowly evolved over the coming months, improving features, and went hand in hand with PIX (Performance Investigator for Xbox) to help get games running on the hardware as quickly as possible. It makes logical sense – particularly if you believe the murmurings from developers prior to the Xbox One's launch. If you recall, there were a lot of complaints that the early Xbox One SDK was awkward and unfriendly. It would tally with those rumors and demonstrate Microsoft was trying to knock this on the head – but at the same time, the cost was a driver that wasn't optimized for raw performance.
Things changed considerably in July 2013, however. Microsoft added a preview version of the Direct3D Monolithic Runtime. The primary difference between Monolithic Direct3D and regular Direct3D (the kind you're running on, say, your Windows 8 PC) is that it's been optimized for a fixed spec of hardware. Therefore, Microsoft could remove a layer of abstraction, which in turn increases the performance of the Xbox One's version of Direct3D. Useless functionality that wasn't relevant to a closed system was removed, and the Direct3D runtime and UMD started to merge together.
Over a year later (and after numerous rather large performance improvements to the Xbox One's drivers), it was obvious the Direct3D Monolithic Driver was the direction Microsoft was headed. As you read over the APIs, it's little surprise that in May 2014 Microsoft officially led with the line (under the "what's new" section): "Stock Direct3D support has been removed in the May 2014 XDK, in favor of Monolithic Direct3D". Microsoft also announced a few other rather large changes, including that Asynchronous Compute is no longer in preview mode, which means developers are able to leverage the compute potential of the Xbox One's GPU (more on this later).
http://media.redgamingtech.com/rgt-website/2015/01/xbox-one-gpu-block-architecture.jpg (http://media.redgamingtech.com/rgt-website/2015/01/xbox-one-gpu-block-architecture.jpg)The Xbox One’s GPU block that’s located on the APU.

Xbox One GPU reserves and Allocation

Reading through the SDK, just as the deprecation and eventual retirement of the User-Mode Driver (UMD) was evident, the change in policy regarding Kinect is also hiding in plain sight. As you'll likely know, a fair chunk of the memory bandwidth (of the system's DDR3-2133 RAM), CPU performance (hence the recent revelation of access to the Xbox One's seventh CPU core) and GPU were cordoned off for the purposes of running the Xbox One's Kinect. If a title uses these Kinect functions, 91.5 percent of the Xbox One's GPU reserve is available for running the game, but should developers not use certain features, up to almost 100 percent of GPU performance is available.
Game OS titles can access the full 1.31 TFLOPS of GPU power if developers meet a few conditions. The NUI, or Natural User Interface, is what allows users to control apps using gestures rather than the Xbox One's controller; Microsoft originally called it the "preferred user interface technology" for the Xbox One, and it has a GPU reserve of about 4.5%. If a game doesn't use the NUI, developers can allocate that reserve to the game – but only during gameplay. Microsoft specifically prohibits its usage during menus and lobbies; the thinking behind this is to still allow biometric sign-in to function. Another caveat: it doesn't matter if we as users disconnect the Kinect. A game title must specifically request this reserve.
The remaining four percent of the GPU reserve is allocated to the Extended System Reserve (ESR). This is used for system UI rendering and for message dialogs. In other words, everything the system has to render (for example, a chat invite) asks for its GPU pound of flesh. Because of this uncertainty, Microsoft asserts that a title must be able to tolerate a variance of up to 3%. Furthermore, you as a user can have a profound effect on the reserve. In some instances, the ESR will force the reserve to 4 percent of GPU time for a frame. In other words, for certain actions you take, or for notifications and other prompts, the game will not have access to 100 percent of the GPU (even if it normally does) and instead has to operate with only 96% of the GPU available to it while it's drawing that particular frame of animation.
http://media.redgamingtech.com/rgt-website/2015/01/Xbox-one-fill-mode-snap-mode.jpg (http://media.redgamingtech.com/rgt-website/2015/01/Xbox-one-fill-mode-snap-mode.jpg)The Xbox One Screen Display Showing off the areas and resolutions when the title is running in “Fill Mode”

While it might not sound like a lot, that's about 50 GFLOPS of computing power the game isn't able to use. Because of this, Microsoft suggests that you alter how the game runs while this happens, to fit into the lower performance budget. It suggests either downscaling the internally rendered image – its example is 1440×810, a total of 1,166,400 pixels, compared to 1080p's native 2,073,600 – and then upscaling it, or alternatively reducing effects quality (so, perhaps rendering things in the distance with a lower level of detail, or cutting back on certain shadow details). This will generally happen in situations where the title is in fill mode and other menus or items are on the screen, so users won't notice the drop in quality.
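The arithmetic behind those two suggestions is easy to verify (a minimal check, taking the 1.31 TFLOPS figure from later in the article): 4% of the GPU is roughly the quoted 50 GFLOPS, and an 810p internal buffer has only 56.25% of 1080p's pixels, comfortably covering the lost 4%:

#include <cstdio>

int main() {
    // 4% ESR slice of a 1.31 TFLOPS GPU.
    std::printf("ESR slice: %.1f GFLOPS\n", 1.31e3 * 0.04);              // ~52.4

    // Pixel cost of the suggested 1440x810 buffer versus native 1080p.
    const double native = 1920.0 * 1080.0;                               // 2,073,600
    const double scaled = 1440.0 * 810.0;                                // 1,166,400
    std::printf("810p/1080p pixels: %.2f%%\n", 100.0 * scaled / native); // 56.25%
}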
This corresponds to the multiple view states that the Xbox One has: Full View (where the application fills the screen at 1920×1080), Fill View (where an exclusive application shares screen space with a snapped app) and Not Visible (in other words, it's in the background).
CPU & Memory Allocation Info

When a game is running in the "Full" state (simply put, meaning the game is active and the user is playing it), the Xbox One unsurprisingly devotes most of the system's performance to it. As mentioned earlier, the Xbox One uses eight AMD Jaguar CPU cores running at 1.75GHz. Six of these cores are normally available to games (with 100 percent of their resources available to the game), and should the resource configuration for the game request it, 50 – 80 percent of the resources of the seventh CPU core can also be used. In addition, 5GB of the Xbox One's 8GB is available to game developers, meaning in effect the Xbox One has 3GB of RAM reserved for the operating system. Further, there is no disk paging available for this RAM once it's consumed; this means developers have to be on their toes when allocating memory resources.

Available Resource        Full            Full Extended                  Constrained
CPU Cores                 6 CPU cores     6 CPU cores + 50-80% of 7th    4 CPU cores
% GPU Time                91.5 percent    Almost 100 percent             45 percent
Available System Memory   5 GB            5 GB                           5 GB
While we've heavily discussed Microsoft increasing CPU allocation by freeing up the seventh CPU core (http://www.redgamingtech.com/microsoft-frees-seventh-cpu-core-xbox-one-developers/), it's vital to remember that the proportion of CPU time available on the seventh core can vary drastically based on what's happening with the Xbox One. Microsoft assures developers they can count on having at least 50 percent of the core available at all times, but when the system must process certain commands spoken by the user (say, "Xbox go to friends"), 50 percent of the seventh core is required to run that task. So in effect, in the worst-case scenario, developers gain 50 percent of a single CPU core. If these commands are not running (which should be the majority of the time), this rises to 80% of CPU time available for the game. In effect, this means the amount of CPU performance available on the seventh core can vary by 30 percent, and currently the title isn't informed that previously available CPU performance is about to be snatched away from it. This clearly isn't an ideal situation and Microsoft admits as much, pointing out that an updated SDK release will fix this issue and notify the game. Also, optimization isn't easy at the moment, because performance counters aren't providing details of what's happening on the Xbox One's seventh CPU core. This means developers can't really profile the performance and optimize as well as they should – again, Microsoft is keen to stress this will also be fixed ASAP.
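Until the promised notification arrives, the conservative approach implied here is to schedule must-finish work against the guaranteed 50% floor and treat the extra 30% as opportunistic. A purely illustrative sketch (these names are mine, not an XDK API):

// Hypothetical planning helper: only the 50% floor of core 7 is guaranteed;
// the remaining 30% may vanish whenever a system voice command runs.
struct Core7Budget {
    static constexpr double kFrameMs         = 33.3; // one 30fps frame
    static constexpr double kGuaranteedShare = 0.50; // always available
    static constexpr double kBestCaseShare   = 0.80; // no voice commands active

    // Core-7 CPU time per frame that can be relied on for required work.
    static double GuaranteedMs() { return kFrameMs * kGuaranteedShare; }

    // Bonus time: schedule only skippable work against this.
    static double OpportunisticMs() {
        return kFrameMs * (kBestCaseShare - kGuaranteedShare);
    }
};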
http://media.redgamingtech.com/rgt-website/2014/06/inside-xbox-one-CPU-martin-fuller-microsoft-dev-day.jpg (http://media.redgamingtech.com/rgt-website/2014/06/inside-xbox-one-CPU-martin-fuller-microsoft-dev-day.jpg)
The other problem with this is that custom voice commands are no longer going to work. So, while your “Xbox record that” will not skip a beat, the specific voice commands developers could create for their games will no longer function. Obviously, this won’t make a difference in certain titles – but the voice commands that you’d use to issue support orders in say Ryse: Son of Rome are an example of the types of things that go bye-bye if developers opt to use the seventh CPU core. No such thing as a free lunch!
The other side effect of all this is that not only do developers get the additional CPU core, they're also afforded an additional 1GB/s of precious memory bandwidth from the system's DDR3 RAM. Since there's a lot of concern over the memory bandwidth of the Xbox One, this change is just as important.
The Xbox One's DDR3-2133 RAM can theoretically push 68GB/s (that number is the sum of both read and write to the DRAM). In reality, Microsoft concedes that this number isn't achievable, and developers will hit about 80 – 85% of memory bandwidth; or roughly 54.4GB/s to 57.8GB/s if you prefer. While we're on the topic of memory bandwidth: back in August 2014, Microsoft's SDK update increased GPU DDR3 bandwidth by 1.5 percent by "tuning system bandwidth consumers".

Xbox One cache hit type   Latency (lower is better)
Local L1 hit              3 cycles for 64-bit values, 5 cycles for 128-bit values
Local L2 hit              Approximately 30 cycles
Remote L2 hit             Approximately 100 cycles
Remote L1 hit             Approximately 120 cycles
Regarding processor core allocation and cache information: as we've stated a few billion times in this document, the Xbox One contains 8 AMD Jaguar CPU cores. They're built from two processor modules, with each module housing four cores. Each module has its own level 2 cache, which its four cores must share, and they also share the bandwidth to main memory. So, for example, core 0 and core 2 share the same cache (as they're on the same module) and must use the same bus to access the DDR3 memory. But CPU core 5 and CPU core 2 don't share the same L2 cache, because they're not on the same module.
If a CPU core wants to access the other module's level 2 cache (for example, if CPU core 1, housed in module A, wishes to access the cache housed in module B), it'll be considerably slower, since the data has to be transferred between modules; the same goes for a remote level 1 cache hit (which is slower still). Thus, it's better for developers to plan their CPU threads accordingly. If a thread is going to interact with OS shared apps, Microsoft rather obviously suggests developers run that code on either core 4 or 5.
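To make the module layout concrete: on a Win32-style API, thread affinity masks are the usual tool for keeping cooperating threads on the same Jaguar module. This is an illustrative desktop-Windows sketch (the XDK's exact affinity mechanism may differ):

#include <windows.h>

// Keep two threads that share data on module A (cores 0-3) so they hit the
// same L2 instead of paying the ~100+ cycle cross-module penalty.
void PinProducerConsumerToModuleA(HANDLE producer, HANDLE consumer) {
    SetThreadAffinityMask(producer, 1ull << 0); // core 0, module A
    SetThreadAffinityMask(consumer, 1ull << 2); // core 2, same module, same L2
}

// A thread that talks to OS shared apps belongs on core 4 or 5 (module B),
// per Microsoft's guidance above.
void PinOsFacingThread(HANDLE osThread) {
    SetThreadAffinityMask(osThread, (1ull << 4) | (1ull << 5));
}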
Microsoft also adds that while the level 2 cache has doubled in comparison to the Xbox 360, the number of cores has also increased by a factor of 2.6. When you couple this with the Xbox One's pointers consuming twice the memory, on a cache-per-core basis the Xbox One actually has less level 2 cache available than the Xbox 360.
Furthermore, DRAM (main system RAM, in other words the 8GB of DDR3 memory) contention is a bigger issue than it was on the Xbox 360. It's much easier for the Xbox One's CPU to adversely impact the GPU's performance, or vice-versa. Thus, Microsoft says: "Optimizing to reduce memory bandwidth is a key strategy for Xbox One".
http://media.redgamingtech.com/rgt-website/2015/01/xbox-one-peak-memory-bandwidth-251x300.jpg (http://media.redgamingtech.com/rgt-website/2015/01/xbox-one-peak-memory-bandwidth.jpg)Xbox One Peak Memory Bandwidth – Click For Larger Image

The Xbox One has two buses available for the GPU to access main system memory. The first of these is GARLIC, which is designed to reduce latency. It's a four-channel bus, running at a peak bandwidth of 68GB/s (limited by DRAM memory bandwidth). These four GARLIC channels connect directly to each of the four DRAM controllers in the Xbox One. For reference, the GARLIC bus on the Playstation 4 is also limited by its own DRAM, but because it uses the faster GDDR5 RAM, its limit is 176GB/s.
The second bus is ONION, and it's the coherent one. It's a two-channel bus, capable of 24GB/s read or 16GB/s write (peak); this limitation is due to the North Bridge. The two ONION controllers connect to the memory controller, and thus the total coherent memory bandwidth through the North Bridge is limited to 30GB/s.

Winjer
22-01-15, 22:50
Xbox One SDK Leak | GPU Architecture Overview & Analysis Part 2 | Tech Tribunal (http://www.redgamingtech.com/xbox-one-sdk-leak-gpu-architecture-overview-analysis-part-2-tech-tribunal/)

About a week ago, we took a rather in-depth overview of the Xbox One's SDK leak. The leak, for those unfamiliar with it, provided a glimpse of what lies inside the Xbox One's silicon, and how the machine has evolved both over the years Microsoft spent designing it and since its release back in November 2013. It was evident Microsoft had not only changed the focus of the machine (by lessening the focus on Kinect and pushing more of those resources to game developers) but had also made rather large improvements to the machine's drivers and SDKs (completely re-writing the console's graphics driver to be more efficient). In this article, we'll be focusing on GPU performance, tackling things such as the two Graphic Command Processors present inside the Xbox One, and analyzing their potential use and purpose. To discuss this we'll also need to dive a bit into how various components function, and of course we'll do a nice breakdown to simplify things as much as possible. Without wasting further word count, let's begin, eh?
We've discussed the basics of the hardware inside Microsoft's Xbox One several times by now, and if you require a refresher I'd suggest you read the first part of our SDK leak analysis (http://www.redgamingtech.com/xbox-one-sdk-hardware-leak-analysis-cpu-gpu-ram-more-part-one-tech-tribunal/). I'd also suggest reading it if you haven't and you want a more complete understanding of the graphics APIs, software, buses and bandwidth of the system (which we'll touch upon in this article, but which won't be the focus).
The Xbox One GPU is based on an AMD GCN (Graphics Core Next) DirectX 11.1 GPU; the closest desktop variant would be Bonaire (more specifically, the Radeon 7790 – albeit with some customization). Its API is a heavily modified version of DirectX, known as the Monolithic Driver (more info) (http://www.redgamingtech.com/xbox-one-sdk-hardware-leak-analysis-cpu-gpu-ram-more-part-one-tech-tribunal/), that's been designed to be more console-like and to do away with (among other things) layers of abstraction, thanks to the fixed hardware of the Xbox One.
http://media.redgamingtech.com/rgt-website/2015/01/xbox-one-esram-die-stack-explained.jpg (http://media.redgamingtech.com/rgt-website/2015/01/xbox-one-esram-die-stack-explained.jpg)Microsoft’s PR slide to ‘explain’ the esram inside the console.

To understand Microsoft's GPU modifications, it's vital we establish a basic understanding of AMD's GCN architecture, which is the basis not only of the Xbox One's GPU but also the PS4's. At the highest level, the Xbox One's GPU is comprised of 12 Compute Units, or CUs for short (but be aware, Microsoft doesn't stick with AMD's official naming convention, and instead refers to the Compute Units as Shader Cores). Technically, there are 14 CUs on the Xbox One's die (and 20 CUs on the PS4's), but two have been disabled on both consoles to increase yields, leaving 12 and 18 CUs for each console respectively.
A Compute Unit is partitioned into four separate SIMD units (each of these can run between 1 and 10 wavefronts, which we'll discuss in another article). Each SIMD contains 16 ALUs (Arithmetic Logic Units, sometimes referred to as shaders). In addition to the four SIMD units, you'll find an L1 cache, an LDS (Local Data Share; Microsoft changed the convention to LSM, or Local Shared Memory, in its documentation), 4 texture units and a scalar unit. The scalar unit handles operations the ALUs can't or won't handle (for example, conditional IF/WHEN statements).
In addition to the CUs, the Xbox One's GPU features 512KB of level 2 cache. L1 cache is faster but serves just its own CU, whereas the L2 cache is a shared resource for the entire GPU. Finally, there are 16 ROPs (Raster Operators, the final stage in a scene's rendering, which "assemble" the scene). All told, the Xbox One's GPU contains 768 ALUs, which put out a combined performance of about 1.31 TFLOPS of computing power (not accounting for any performance allocated to the OS).
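The 1.31 TFLOPS figure falls straight out of the counts just described, assuming each ALU retires one fused multiply-add (two FLOPs) per cycle at the 853MHz clock:

#include <cstdio>

int main() {
    const double alus   = 12.0 * 4.0 * 16.0;         // 12 CUs x 4 SIMDs x 16 ALUs = 768
    const double tflops = alus * 2.0 * 853e6 / 1e12; // FMA = 2 FLOPs per cycle
    std::printf("%.0f ALUs -> %.2f TFLOPS\n", alus, tflops); // 768 -> 1.31
}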
http://media.redgamingtech.com/rgt-website/2015/01/xbox-one-gpu-bus-overview.jpg (http://media.redgamingtech.com/rgt-website/2015/01/xbox-one-gpu-bus-overview.jpg)
So what's different with the Xbox One GPU?

The Xbox One features two ACEs (Asynchronous Compute Engines) and two Graphic Command Processors which, along with the 8 graphic contexts, have spurred a flurry of gaming websites to report the news – but this raises several questions, which we'll attempt to answer in this very article. The first question: what resources are available to developers? After all, we know there's 8GB of RAM in the Xbox One, but 3GB is allocated purely to OS use, leaving just 5GB for games. The second question: while it might sound impressive, how "different" is it from either the PS4 or plain regular GCN architecture? The third question (and the one I suspect most of you will care about): what does it do for the games? Will they have better detail, run at a higher resolution... a higher frame rate?
First, unfortunately, we need to discuss what Graphic Contexts, ACEs and Graphic Command Processors (GCPs) are. If you've a good understanding of these, you can skip the next few paragraphs! If you're unfamiliar, then consider the information below a very basic primer to give you an understanding of what's going on.
http://media.redgamingtech.com/rgt-website/2015/01/xbox-one-gpu-block-architecture1.jpg (http://media.redgamingtech.com/rgt-website/2015/01/xbox-one-gpu-block-architecture1.jpg)The Xbox One GPU Block Diagram and architecture

Graphic Context: Think of a graphics context as a drawing state that can be saved and then resumed at a later date. It defines basic drawing attributes of a scene, such as the colors to use, the basic layout of the scene and so on. The next logical thing to discuss is context switching, which is business as usual for a GPU; its purpose is to keep GPU utilization as high as possible. The reason is that GCN is an in-order processor (instructions are fetched, executed and completed in the order they are issued; if an instruction stalls, it causes the instructions "behind it" to stall as well), and thus ensuring the pipeline runs smoothly is critical for best performance.
GCP / Graphic Command Processor: Their job (remember, there are two in the Xbox One) is to communicate with the host CPU (the AMD Jaguar CPU, in the case of the X1), keep track of the various graphics states and read commands from the command buffer. In other words, its job is to tell the GPU to "draw stuff" and to keep track of the various bits of data. The Graphic Command Processor tries to run ahead of the GPU as much as possible so that it's better able to know what's coming up and delegate work accordingly. For a very simplified example: if the GPU's shaders are processing instruction 25, the GCP would like to be at instruction 30.
http://media.redgamingtech.com/rgt-website/2015/01/playstation-4-gpu-queue-overview.jpgThe Playstation 4’s GPU overview – notice the 2 GCP’s

ACE / Asynchronous Compute Engine: Their job is similar to the GCP's, but for compute work – in other words, when developers use the GPU for physics or other such purposes. It dispatches compute tasks to the CUs, manages resources and naturally interprets instructions.
Now that we've got that out of the way, let's start with the GCP, of which there are two inside the Xbox One. Unfortunately, because we're still missing certain documentation from the SDK leak, we can only make a few educated guesses as to the second GCP's usage, but there are a couple of leading theories. Desktop GPUs (such as the Radeon 7970) feature only one GCP, but according to rumors and leaks, the Playstation 4 does indeed feature two GCPs too. The leading theory is that the second GCP is for OS tasks – for example, running Snap (and other OS displays). While the Xbox One does have a percentage of its CPU locked for OS use only (one full core plus 20 percent of another in extended mode, or two cores in regular mode, depending on whether developers wish to make use of the custom Kinect voice commands), the visual display still needs to be rendered on screen.
http://media.redgamingtech.com/rgt-website/2015/01/SIMD2.png (http://media.redgamingtech.com/rgt-website/2015/01/SIMD2.png)
It's also possible that the second GCP is responsible for helping issue additional graphic contexts (which we'll discuss soon), but the first explanation is more likely. Throughout the SDK documentation there's only reference to a single GCP, which likely means developers simply do not have access to the second one (or one would assume it'd be mentioned, along with how best to leverage the performance of both). We can speculate that there's a good chance Microsoft did indeed customize the command processor(s) to an extent, because of a quote from one of the Xbox One's architects, Andrew Goossen, to Eurogamer: "We also took the opportunity to go and highly customise the command processor on the GPU. Again concentrating on CPU performance… The command processor block's interface is a very key component in making the CPU overhead of graphics quite efficient. We know the AMD architecture pretty well – we had AMD graphics on the Xbox 360 and there were a number of features we used there. We had features like pre-compiled command buffers where developers would go and pre-build a lot of their states at the object level where they would [simply] say, "run this". We implemented it on Xbox 360 and had a whole lot of ideas on how to make that more efficient [and with] a cleaner API, so we took that opportunity with Xbox One and with our customised command processor we've created extensions on top of D3D which fit very nicely into the D3D model and this is something that we'd like to integrate back into mainline 3D on the PC too – this small, very low-level, very efficient object-orientated submission of your draw [and state] commands."
Just how Microsoft "highly customised" the GCP is, of course, up for debate. It's possible we'll see some DirectX 12 functionality, such as draw bundles – but it's too difficult to know for certain. Many zeroed in on the phrase "In particular, compute tasks can leapfrog past pending rendering tasks, enabling low-latency handoffs between CPU and GPU" – but in reality that's pretty much business as usual for the GCN architecture.
It's always possible that the customizations were blown out of proportion by Microsoft, but the rumor behind the scenes is that they did implement some changes... it'd make sense, given that the Xbox One's Monolithic Driver supposedly helped inspire DX12 (well, that and a nice dose of AMD's Mantle technology – once again, if rumors are accurate).
http://media.redgamingtech.com/rgt-website/2015/01/pc-directx11-no-deferred-driver-stall.jpg (http://media.redgamingtech.com/rgt-website/2015/01/pc-directx11-no-deferred-driver-stall.jpg)Trouble in DX11 PC paradise – nothing is deferred

Regarding the number of graphic contexts, let's first read over what Microsoft says on the matter: "The Xbox One GPU has eight graphics contexts, seven of which are available for games. Loosely speaking, a sequence of draw calls that share the same render state are said to share the same context. Dispatches don't require graphics contexts, and they can run in parallel with graphics work." In a different part of the leaked document it says: "The number of deferred contexts you want to create during initialization time depends on the maximum number of parallel rendering tasks the engine needs to perform anytime during rendering. Although the system allows a maximum of 48 deferred contexts to exist at any one time, in general you shouldn't create more than six deferred contexts at once, because that's how many cores you have in the game OS. Of course, it is up to you to precisely tailor your thread usage for maximum efficiency. For example, if one deferred-context thread is waiting for a direct memory access (DMA) operation, you can swap in another deferred context thread to use the otherwise-wasted CPU time on the same core. In this case, having more than one deferred context and deferred context threads per core prevents a CPU bubble."
So what is a "deferred context"? The keyword is "deferred" – it means that calls (think instructions) aren't executed straight away; instead, they are sent to a command list to be executed later.
http://media.redgamingtech.com/rgt-website/2015/01/gpu-cpu-deferred-context-jobs-example.jpg (http://media.redgamingtech.com/rgt-website/2015/01/gpu-cpu-deferred-context-jobs-example.jpg)An example job flow of contexts between CPU / GPU

You'll notice that the number 48 divides rather beautifully by 6 (the number of CPU cores developers have by default, unless they opt to free up some of the 7th core from its Kinect and OS masters). Because of this, only 6 (or possibly 7) of the cores are able to send work "per cycle". But remember, some instructions take longer than others to execute. If you only ever work on "current" contexts, CPU time will go idle while waiting, which is clearly far from ideal – you want to hit as close to 100% usage across all available cores as you can. Thus, if you're waiting on a slow operation (in Microsoft's example, a slow memory operation), another deferred context can start up on the same core (the one currently waiting for the memory operation).
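On desktop D3D11, the pattern Microsoft is describing looks like the snippet below: record on a deferred context (typically on a worker thread), then play the command list back on the immediate context. The Xbox One's Monolithic Direct3D differs in detail, but the shape is the same:

#include <d3d11.h>

void RecordAndPlayback(ID3D11Device* device,
                       ID3D11DeviceContext* immediate,
                       ID3D11RenderTargetView* rtv) {
    // Worker thread: create a deferred context and record commands.
    ID3D11DeviceContext* deferred = nullptr;
    device->CreateDeferredContext(0, &deferred);

    const float black[4] = { 0.f, 0.f, 0.f, 1.f };
    deferred->ClearRenderTargetView(rtv, black);
    // ... draw calls sharing this context's render state ...

    // Nothing has executed yet; bake the recording into a command list.
    ID3D11CommandList* commandList = nullptr;
    deferred->FinishCommandList(FALSE, &commandList);

    // Main thread: execute the recorded work on the immediate context.
    immediate->ExecuteCommandList(commandList, FALSE);

    commandList->Release();
    deferred->Release();
}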
We don't have enough information to make a guess about the Playstation 4, but we can make a guess about the PC thanks to a variety of documents that have been released, including the Southern Islands programming guide. On page 14: "MAX_CONTEXT; Maximum Context in Chip. Values are 1 to 7. Max context of 0 is not valid since that context is now used for the clear state context. For example, 3 means the GPU uses contexts 0-3, i.e., it utilizes 4 contexts."
According to the documentation, the typical desktop GCN architecture processes a single graphics context at a time. Naturally (as we've just seen above), it's possible to operate on multiple contexts, but to do this you'll need to run them serially and context switch. GCN also processes compute, and can handle as many compute contexts as there are ACEs available. Typically, in CPUs, the length of time an operation is given is known as a "time slice". The length of each time slice can be critical to balancing system performance against process responsiveness – if the time slice is too short, the scheduler consumes too much processing time, but if it's too long, processes take longer to respond to input.
A lot will obviously change in the future with DirectX 12 – but how this integrates with the Xbox One isn't known. In the current DirectX 11 model, the CPU talks to the GPU one core at a time (in other words, not in parallel). In the DX12 future this will change, because every core can issue instructions to the GPU and talk with it simultaneously. How much of a difference this makes in GPU-bound scenarios (particularly for the Xbox One) remains to be seen, but in CPU-bound scenarios (particularly on Windows) it'll be a nice performance increase.
http://media.redgamingtech.com/rgt-website/2015/01/Xbox-one-gpu-command-processors.png (http://media.redgamingtech.com/rgt-website/2015/01/Xbox-one-gpu-command-processors.png)
Compute on the Xbox One runs in parallel with graphics workloads; a set of "fence" APIs synchronizes execution between these contexts and the CPU. L1 cache is shared between both compute and graphical data. We do know that Microsoft didn't implement a "volatile bit" inside the GPU – which would allow the application to invalidate (for instance) a single cache line. Mark Cerny discussed this back with Gamasutra (http://www.gamasutra.com/view/feature/191007/inside_the_playstation_4_with_mark_.php?print=1), where he said "…to support the case where you want to use the GPU L2 cache simultaneously for both graphics processing and asynchronous compute, we have added a bit in the tags of the cache lines, we call it the 'volatile' bit. You can then selectively mark all accesses by compute as 'volatile,' and when it's time for compute to read from system memory, it can invalidate, selectively, the lines it uses in the L2. When it comes time to write back the results, it can write back selectively the lines that it uses".
We know this isn't implemented in the Xbox One because the SDK documentation specifically says: "Note that cache flushes affect the entire cache; range-based cache flushes are not supported by the hardware. This may affect any GPU work executing on the graphics context at the same time." This means that full cache flushes are the only way to resolve certain situations, which obviously carries a performance penalty (as Microsoft has just described).
Another slight problem with the Xbox One's GPU (compared to, say, the Playstation 4's, or indeed a more modern PC GPU such as the R9 290) is that the total number of ACEs is lower. While the two ACEs on the Xbox One can handle eight queues each, the Playstation 4 (and modern desktop GPUs) support 8 ACEs. In the case of the PS4, this means the GPU can handle a total of 64 compute queues, which, combined with the level 2 volatile bit, certainly gives the PS4 a bit of a helping hand in certain situations.

Winjer
05-02-15, 13:20
PS4 continues to lead console market, Xbox One is catching up (http://www.techspot.com/news/59648-ps4-leads-console-market-xbox-one-catching-up.html)



A new batch of sales data for the current generation of consoles is in, and unsurprisingly, the PlayStation 4 still leads the Xbox One in global shipments. However, a strong fourth quarter of 2014 for Microsoft indicates that it is catching up in the console race. Lifetime sales data for all three consoles, as compiled and analyzed by Ars Technica (http://arstechnica.com/gaming/2015/02/ps4s-sales-dominance-slackens-slightly-in-holiday-quarter/), shows the PS4 firmly in the lead with just shy of 20 million total worldwide shipments. The Xbox One follows with between 11 and 13 million shipments, and the Wii U is in third place with 9.3 million shipments despite launching a year earlier.
While Sony has shipped roughly 66% more current-gen consoles than Microsoft in total, the Xbox One is catching up in recent quarters. Between October and December 2014, the critical holiday period, Sony shipped 6.4 million PS4s to Microsoft's 4.5-5.5 million Xbox Ones. Through this time period, the PS4 had a much narrower lead of around 30%.
Increased shipments and sales for Microsoft's console line up pretty well with decisions to cut its price. The Xbox One's price temporarily decreased to $349 (http://www.techspot.com/news/58586-xbox-one-gets-holiday-inspired-price-cut-now.html) through the 2014 holiday season, and again in January in an ongoing promotion. The previous price of the entry-level Xbox One SKU was identical to the current price of the PlayStation 4, at $399.
Sony has no plans to drop the price of their console to match the Xbox One, insisting that their $50 more expensive console is still the preferred choice of buyers and gamers. And they'd generally be right: the PS4 commands around 47% of the current-gen console market, with the Xbox One on around 31% and the Wii U trailing with a 22% share.

Sony will still have to respond with a PS4 price cut, or it risks losing the console lead.

Jorge-Vieira
05-02-15, 17:49
The successive Xbox price cuts are starting to bear fruit; Sony will have to do something if it wants to keep leading.

Winjer
15-02-15, 17:19
Xbox One SDK Leak Part 3 | Move Engines & Memory Bandwidth Performance | Tech Tribunal (http://www.redgamingtech.com/xbox-one-sdk-leak-part-3-move-engines-memory-bandwidth-performance-tech-tribunal/)

Winjer
26-02-15, 13:35
Digital console games made $263M in January — and most of that was from PlayStation (http://venturebeat.com/2015/02/25/digital-console-games-made-263m-in-january-and-most-of-that-was-from-playstation/)

Digital downloads are making up a bigger piece of the market on consoles, and intelligence firm SuperData Research is shedding some light on exactly how big it is. Digital console games generated $263 million in January, according to SuperData. That's up 14 percent year over year. Grand Theft Auto and several Call of Duty games were the primary drivers of this growth, but SuperData chief executive and lead analyst Joost van Dreunen notes that horror games Resident Evil 4 HD Remaster and Dying Light were also big contributors. Sony's PlayStation 4 and PlayStation 3 consoles also dominated, making up 63 percent of digital sales on console. Digital games on console, PC, and mobile are now a $49 billion category, and it's only getting bigger – and Sony, Microsoft, and Nintendo are all trying to get as big a piece of that as possible.
For the first time, SuperData is not just sharing how much money digital console games made, but it has broken down the top 10 highest-earning releases during the month.
Check it out:
http://1u88jj3r4db2x4txp44yqfj1.wpengine.netdna-cdn.com/wp-content/uploads/2015/02/digitalconsoleTEST.jpg (http://venturebeat.com/wp-content/uploads/2015/02/digitalconsoleTEST.jpg)
Obviously, GTA V is the big winner here. This includes both people paying to purchase the digital version of the game as well as content for GTA Online. The same is true for the three Call of Duty games on the list, which all have downloadable content and purchasable cosmetic items.
SuperData also notes that the top 10 games generated $133 million in revenue by themselves, which is just slightly more than half of all digital console spending.
“What allows us to publish this list today is that we’ve reached critical mass with our data collection,” said van Dreunen. “With the help of our client network and data providers, the data necessary to release a reliable monthly overview of key performers on digital console is now available. We’ve been providing our customers with similar information on free-to-play, mobile, and PC for years, but being able to now do the same for digital console is a historic moment.”

Winjer
13-03-15, 23:43
ESRAM Performance Improves By 15% & DX12 Info For Xbox One | Analysis (http://www.redgamingtech.com/esram-performance-improves-15-dx12-info-xbox-one-analysis/)
Performance improvements are always welcome, especially on weaker hardware.

Jorge-Vieira
14-07-15, 08:13
Technomancer Dev: Xbox One’s 7th CPU Core Will Allow the Game to Deliver Smoother Frame Rates

Microsoft has revised its eighth-generation Xbox One video game console quite a few times since its launch. After announcing a Kinect-less version (http://wccftech.com/xbox-bundle-kinect-reality-399/) of the console last year, the company temporarily slashed the price of the console to $349 during the holiday season, giving a much-needed boost to sales. However, the Xbox One experienced a more important type of upgrade earlier this year, when Microsoft decided to release the reserved seventh CPU core (http://wccftech.com/xbox-one-gpu-contexts-command-streams/) of the console to game developers.
http://cdn3.wccftech.com/wp-content/uploads/2015/07/The-Technomancer-635x357.jpg (http://cdn3.wccftech.com/wp-content/uploads/2015/07/The-Technomancer.jpg)
Spiders CEO Says Access to the Xbox One's Seventh CPU Core Will Allow its Upcoming Mars-set RPG to Deliver Better FPS

Over the Christmas break last year, some important leaked documents revealed (http://wccftech.com/xbox-one-gpu-contexts-command-streams/) that with the November 2014 Xbox One SDK update, Microsoft boosted the gaming performance of the Xbox One and allowed video game developers to gain access to the console's seventh CPU core, which, along with the eighth core, was previously reserved for Kinect and operating-system tasks. The additional CPU power offers several benefits that games can use to perform better on the Xbox One.

To see if the increased CPU performance is really helping developers achieve their targets, GamingBolt (http://gamingbolt.com/xbox-one-extra-cpu-core-will-allow-smoother-frame-rates-technomancer-dev) talked with Jehanne Rousseau, CEO of Spiders, a French video game development studio currently working on a new Mars-set post-apocalyptic RPG known as The Technomancer, and asked how the Xbox One's seventh CPU core is allowing the development team to add more to their upcoming game. Rousseau replied:

“We are not using it yet but this is only a matter of time! This is very interesting for us to have access to this processor. We will use it to do a part of our computations to smooth the frame rate. In fact we designed our new engine to be very flexible about the number of processors. With mainly in mind the numerous PC designs, but this will also be useful for this 7th processor.”
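Being "very flexible about the number of processors", as Rousseau puts it, usually comes down to sizing the worker pool at startup instead of hard-coding it. The following is a minimal C sketch of that idea using POSIX threads; do_engine_work() and the overall shape are illustrative assumptions, not Spiders' actual engine code.

#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

/* Hypothetical job entry point; in a real engine each worker would
 * pull tasks (animation, AI, physics, ...) from a shared queue. */
static void *do_engine_work(void *arg)
{
    long id = (long)arg;
    printf("worker %ld running\n", id);
    return NULL;
}

int main(void)
{
    /* Ask the OS how many cores are usable instead of assuming six.
     * On Xbox One this would be 6 or 7 depending on the SDK version. */
    long cores = sysconf(_SC_NPROCESSORS_ONLN);
    if (cores < 1)
        cores = 1;

    pthread_t *workers = malloc(sizeof(pthread_t) * (size_t)cores);
    for (long i = 0; i < cores; i++)
        pthread_create(&workers[i], NULL, do_engine_work, (void *)i);
    for (long i = 0; i < cores; i++)
        pthread_join(workers[i], NULL);
    free(workers);
    return 0;
}

An engine built this way picks up the seventh core automatically once the platform exposes it, which matches Rousseau's "only a matter of time" remark.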
Microsoft has been pushing to enable video game developers to deliver an experience on the Xbox One that is equivalent to the PlayStation 4's. A number of major triple-A multiplatform games, such as Dying Light, have reportedly (http://wccftech.com/xbox-one-sdk-performance-api-esram/) performed better on Microsoft's latest home console thanks to the upgrades the company has been shipping with its software development kit. We have yet to see, however, whether the seventh CPU core and the updated SDKs will help the Xbox One keep pace with the PlayStation 4 in terms of performance.

Source:
http://wccftech.com/technomancer-xbox-7th-cpu-core-frame-rate/#ixzz3fqr8fSID

Jorge-Vieira
14-12-15, 14:19
PS4 Kernel Exploit Allegedly Allows RAM Dumping & More

A programmer claims to have a working PS4 kernel exploit, which allows RAM dumping from other processes.
http://cdn.wccftech.com/wp-content/uploads/2015/12/ps4_apu.jpg (http://cdn.wccftech.com/wp-content/uploads/2015/12/ps4_apu.jpg)
Programmer CTurt (http://cturt.github.io/index.html) took to Twitter to share that he has finally created a working PS4 kernel exploit. According to CTurt, this might imply that Sony's PS4 has officially been 'jailbroken'.


CTurt @CTurtE (https://twitter.com/CTurtE): PS4 kernel exploit finally working! Thanks to everyone involved!
7:16 PM - 6 Dec 2015 (https://twitter.com/CTurtE/status/673581693207502849)

The kernel exploit allows CTurt to dump RAM from other processes, such as SceShellUI, using ptrace. He is currently working on patching RAM, according to his tweets.
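The technique the tweets below describe, dumping another process's memory with ptrace, comes from the FreeBSD lineage of the PS4's operating system. As a rough illustration, here is a generic FreeBSD C sketch using the PT_ATTACH and PT_IO requests; it is not CTurt's exploit code, and on a retail PS4 it would only be possible from the exploit's privileged context.

#include <sys/types.h>
#include <sys/ptrace.h>
#include <sys/wait.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s <pid> <hex-address>\n", argv[0]);
        return 1;
    }
    pid_t pid = (pid_t)atoi(argv[1]);
    void *target = (void *)strtoul(argv[2], NULL, 16);
    unsigned char buf[64];

    if (ptrace(PT_ATTACH, pid, NULL, 0) == -1) {
        perror("PT_ATTACH");
        return 1;
    }
    waitpid(pid, NULL, 0);              /* wait for the target to stop */

    struct ptrace_io_desc io = {
        .piod_op   = PIOD_READ_D,       /* read the target's data space */
        .piod_offs = target,            /* address inside the target    */
        .piod_addr = buf,               /* destination in our process   */
        .piod_len  = sizeof(buf),
    };
    if (ptrace(PT_IO, pid, (caddr_t)&io, 0) == -1)
        perror("PT_IO");
    else
        for (size_t i = 0; i < io.piod_len; i++)
            printf("%02x%s", buf[i], (i % 16 == 15) ? "\n" : " ");

    ptrace(PT_DETACH, pid, NULL, 0);    /* let the target run again */
    return 0;
}

PT_IO copies a range of the target process's address space into a local buffer, which is the "RAM dumping" the tweets refer to; patching RAM would be the same call with PIOD_WRITE_D.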



CTurt @CTurtE (https://twitter.com/CTurtE): Just broke WebKit process out of a FreeBSD jail (cred->cr_prison = &prison0). Guess you could say the PS4 is now officially "jailbroken" :P
2:28 PM - 12 Dec 2015 (https://twitter.com/CTurtE/status/675683629977178112)

CTurt @CTurtE (https://twitter.com/CTurtE): Can successfully dump RAM from other processes (like SceShellUI) using ptrace! Next step: patching RAM...
7:40 PM - 12 Dec 2015 (https://twitter.com/CTurtE/status/675762173583454209)
endrift @endrift (https://twitter.com/endrift): @CTurtE @frwololo Does this involve a jail vuln in FreeBSD proper, or just the PS4?

CTurt @CTurtE (https://twitter.com/CTurtE): @endrift @frwololo This isn't a jail vulnerability. It's only possible because I'm executing in kernel mode.
10:44 AM - 13 Dec 2015 (https://twitter.com/CTurtE/status/675989705373151232)






At the time of writing, the programmer is playing with the RAM of other processes on the console, and he hopes to release a full write-up and video soon.
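The cred->cr_prison = &prison0 line in the jailbreak tweet above refers to FreeBSD's credential structures: every process credential points at a prison, and prison0 describes the unjailed host system itself. The following self-contained C sketch mocks those structures (the real ones live in sys/ucred.h and sys/jail.h and also involve reference counting, omitted here) to show why a single pointer write, given kernel-mode execution, escapes the jail.

#include <stdio.h>

/* Mock versions of FreeBSD's structures, so the idea compiles
 * standalone; field names mirror the real kernel headers. */
struct prison { const char *pr_name; };
struct ucred  { struct prison *cr_prison; };

/* In FreeBSD, prison0 is the "prison" describing the host system;
 * unjailed processes point their credential at it. */
static struct prison prison0 = { "host" };
static struct prison webkit_jail = { "webkit-sandbox" };

int main(void)
{
    struct ucred cred = { &webkit_jail };
    printf("before: process is in jail '%s'\n", cred.cr_prison->pr_name);

    /* The one-line escape from the tweet: with kernel-mode code
     * execution, overwrite the credential's prison pointer so the
     * kernel treats the process as part of the host. Real code would
     * also need to fix the prison reference counts. */
    cred.cr_prison = &prison0;

    printf("after:  process is in jail '%s'\n", cred.cr_prison->pr_name);
    return 0;
}

In the actual exploit the write targets the WebKit process's real credential in kernel memory, reached from the exploit's kernel-mode payload.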




Jamie Adams @Jamie_Adams17 (https://twitter.com/Jamie_Adams17): @CTurtE what's your main goal in "jail braking"?

CTurt @CTurtE (https://twitter.com/CTurtE): @Jamie_Adams17 At the moment I am playing with the RAM of other processes, (think cheats). Hoping to make a nice PoC video some time.
10:24 PM - 13 Dec 2015 (https://twitter.com/CTurtE/status/676165930083307520)

PS4 kernel exploit allows system access

Basically, jailbreaking a system means the 'hacker' bypasses the system's DRM restrictions in order to run "unauthorized" software. When jailbroken, a user could technically install custom software and make tweaks to the system's operating system.
At the moment, the exploit from CTurt is said to only work on PS4 firmware 1.76 (http://wololo.net/2015/12/13/ps4-jailbreak-possible-cturt-confirms-ram-dump/). Most PS4 owners are currently on firmware version 3.11. Firmware 1.76 is rumoured to be installed on consoles from 'The Last of Us' PS4 bundle, but this hasn't been confirmed yet.
The exploit doesn't necessarily mean that the PS4 is vulnerable to piracy. Nothing has been announced regarding breaking encryption, but kernel access to the system does allow programmers to analyse the PS4's internals and search for more vulnerabilities.

Source:
http://wccftech.com/ps4-kernel-exploit-said-to-be-working/#ixzz3uIwp8cBq

Jorge-Vieira
27-01-16, 17:47
Unlocked PS4, Xbox One cores add 'nice reserve of CPU power', says dev



Microsoft and Sony have both opened up 7 of the 8 CPU cores that power their respective consoles, giving devs a bit more processing power to play with. Now that developers are starting to harness that extra power, it's time to put the boost into perspective and see whether the added core makes a difference.


http://imagescdn.tweaktown.com/news/4/9/49975_1_unlocked-ps4-xbox-one-cores-add-nice-reserve-cpu-power-dev.jpg (http://www.tweaktown.com/image.php?image=imagescdn.tweaktown.com/news/4/9/49975_1_unlocked-ps4-xbox-one-cores-add-nice-reserve-cpu-power-dev_full.jpg)

According to the devs at Techland, who used the extra cores while developing The Following DLC for Dying Light, the 7th core adds a "good reserve of CPU power" for demanding situations. Given the limited nature of both the Xbox One and PlayStation 4, devs are keen to use any extra power they have access to.

"Dying Light was made and optimized to work on six cores since that's what was available when we made the game. So the opening up of the 7th core CPUs on both platforms simply gave us a good reserve of processing power. It essentially gave us a helping hand in dealing with more processor intensive situations, but given when Dying Light was developed, it simply means we use the additional CPU power as a nice to have and not something we need have to rely on," Techland dev Tymon Smektala told Gamingbolt (http://gamingbolt.com/dying-light-the-following-using-7th-cpu-core-of-ps4-and-Xbox-one-dev-talks-about-cloud-gaming).

Some developers didn't find any real performance differences while using the Xbox One's unlocked 7th core. While developing Divinity: Original Sin for consoles, Larian co-founder Swen Vincke said that the unlocked core provided only slight performance boosts. "Yes, we are using [the seventh core]. There are not a lot [of benefits] apparently but we are using it. You can only use 60 or 70% of it, so that is not big of a difference. Essentially it won't make much of an impact."
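Whatever the size of the gain, the usage pattern Techland describes is opportunistic: treat the seventh core as overflow headroom rather than part of the frame budget. Here is a hypothetical C sketch of that pattern with POSIX threads; the names and structure are illustrative, not Techland's or Larian's actual code.

#include <pthread.h>
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical: set at startup depending on whether the platform
 * exposes the extra (7th) core. Illustrative only. */
static bool have_reserve_core = true;

/* Stand-in for a processor-intensive batch (extra physics, AI...). */
static void heavy_task(int i)
{
    (void)i;
}

static void *reserve_worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < 100; i++)
        heavy_task(i);  /* overflow work runs off the six cores the
                           game was originally tuned for */
    return NULL;
}

int main(void)
{
    pthread_t reserve;
    bool spawned = false;

    /* The frame is budgeted for six cores; the seventh is a bonus
     * used only when a demanding scene needs the headroom. */
    if (have_reserve_core) {
        pthread_create(&reserve, NULL, reserve_worker, NULL);
        spawned = true;
    } else {
        reserve_worker(NULL);  /* fall back: do the work inline */
    }

    /* ... the rest of the frame's work would run here ... */

    if (spawned)
        pthread_join(reserve, NULL);
    printf("frame complete\n");
    return 0;
}

Vincke's caveat applies to any scheme like this: since games only get 60 to 70% of the unlocked core, work placed on it should be lower priority than the budgeted frame work.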

It'll be interesting to see what other developers say about the unlocked PlayStation 4 and Xbox One cores in the future. As all games and devs are different, we should see more teams utilize the extra CPU boosts in various ways, especially when the games themselves are built from the ground up using the new cores rather than being ported from existing titles.




Source:
http://www.tweaktown.com/news/49975/unlocked-ps4-xbox-one-cores-add-nice-reserve-cpu-power-dev/index.html