THREE OS's? Boy, worse than I thought. More or less guarantees at least one core is reserved for the OS, possibly even two.
Or the separate custom ARM chip that's designed to run the OS could be handling it...at least, that's my guess. Mark Cerny also said that any additional OS load the custom chip couldn't handle would be offloaded onto the 64-thread-capable compute portion of the GPU.
EDIT: I bet that's at least part of the reason AMD went after a license to make ARM architecture...the other end of that being ARM servers.
For the record, aside from one statement from one official, there has been ZERO talk of the ARM chip. I'm 99% sure it doesn't exist at this point. Never mind that you would need an OS that could support two totally different instruction sets (CISC versus RISC) on the fly, which no one has come close to doing. Never mind that the kernel for such an OS would be HUGE. So...no.
Few details are known about the CPU in the Xbox, but I think it's safe to expect about the same performance as the PS4. The PS4 has a CPU that delivers 102.4 GFLOPs. For the sake of comparison, the i7-3770K delivers 112 GFLOPs. Moreover, the PS4 has an HSA design, which means the GPU can assist the CPU under heavy loads, unlike a traditional CPU made by Intel, which bottlenecks under heavy loads.
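Those peak figures fall out of a simple cores × clock × FLOPs-per-cycle calculation. A minimal sketch, assuming the commonly reported specs (8 Jaguar cores at 1.6 GHz with 8 single-precision FLOPs per cycle for the PS4's CPU; 4 cores at 3.5 GHz with AVX for the i7-3770K):

```python
# Theoretical peak single-precision GFLOPS = cores * GHz * FLOPs per cycle per core
def peak_gflops(cores, ghz, flops_per_cycle):
    return cores * ghz * flops_per_cycle

ps4_cpu = peak_gflops(8, 1.6, 8)    # 8 Jaguar cores, 1.6 GHz, 8 SP FLOPs/cycle
i7_3770k = peak_gflops(4, 3.5, 8)   # 4 Ivy Bridge cores, 3.5 GHz, 8 SP FLOPs/cycle

print(ps4_cpu)   # 102.4
print(i7_3770k)  # 112.0
```

Note these are theoretical peaks; sustained throughput in real code is always lower.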
The issue with offloading the OS itself is one of latency: nothing kills the user experience like an OS that takes half a second to respond to the user. So I doubt the core OS could be offloaded; more likely, individual tasks get offloaded. And in the case of games, the GPU is already going to be overworked.
If we COULD run an OS on a GPU with acceptable performance, CPUs would be dead by now, given how GPUs are orders of magnitude more powerful than CPUs already.
What is up with TH's questionable testing methodology? Anyway, my point is that TH has the 780 well below the Titan, while other sites, including TPU, have the GTX 780 on par or slightly quicker. That's what I was told last week, and it subsequently turns out the 780 is about on par with the Titan, which is going to piss a lot of people off. I did say at the Titan's launch to wait for the 780 and save $400. Nvidia is a mess.
There's nothing questionable about it. TechPowerUp's review used a Gigabyte 780 with a core clock ~100 MHz higher than the reference 780 in Tom's review. See this?:
A stock 780 is ~10% slower than a Titan, similar to Tom's 10%.
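A quick back-of-envelope check shows why a factory overclock closes that gap: GPU performance scales roughly with core clock, and ~100 MHz on top of the 780's commonly quoted 863 MHz reference base clock (my assumption for the exact figure) is itself a double-digit percentage bump:

```python
# Rough estimate: how much headroom does a ~100 MHz factory OC add over reference?
# 863 MHz is the commonly quoted GTX 780 reference base clock (an assumed figure).
ref_780_mhz = 863
oc_780_mhz = ref_780_mhz + 100

clock_gain = oc_780_mhz / ref_780_mhz - 1
print(f"{clock_gain:.1%}")  # 11.6%
```

So a card clocked ~11-12% higher can plausibly erase a ~10% deficit to the Titan, which is exactly the discrepancy between the two reviews.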
Okay, we know the problem; what is the fix? Or are we going to be in 2099 still talking about the same problem?
You can't make serial tasks parallel. Even breaking up tasks that can run in parallel doesn't always yield performance increases. What you are going to see is that stuff that CAN easily be made parallel (large datasets, physics, graphics) will be offloaded to some sort of GPU-like architecture, leaving just a few serial tasks running on the CPU. That's where we are currently headed. You will see CPU usage go up in the short term due to new things being done on the CPU that weren't done before (more physics, for instance), but not due to any real change in how threading is handled. Long term, though, I expect most of those tasks to get offloaded from the CPU.
If we want "optimized" threading on a per-application basis, the best place to do it is during compilation, by the compiler. The problem is, you need a compiler VERY tightly coupled with the OS to do this [and it becomes impossible if different schedulers can be used, as under Linux/Unix-based OSes], and every attempt to do this so far has ended in failure. It will happen eventually, though, just like it did for register assignment and the like. OpenMP is already doing this for basic things such as FOR/WHILE loops, where the compiler threads a loop based on whether or not it will yield a performance benefit.
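The serial-versus-parallel distinction above can be sketched in a few lines. This is a rough Python analogue (not OpenMP itself): a loop whose iterations are independent can be chunked across workers, while a loop with a carried dependency cannot, because each iteration needs the previous result:

```python
# Data-parallel vs. serial: a sketch of why only some loops can be split up.
from concurrent.futures import ThreadPoolExecutor

data = list(range(1, 9))

# Data-parallel: each element is independent, so the loop can be divided
# among workers (conceptually what OpenMP's "parallel for" does).
with ThreadPoolExecutor(max_workers=4) as pool:
    squared = list(pool.map(lambda x: x * x, data))

# Serial: each step depends on the previous result, so no splitting is possible;
# iteration i cannot start until iteration i-1 has finished.
acc = 0
for x in data:
    acc = acc * 2 + x  # loop-carried dependency on acc

print(squared)  # [1, 4, 9, 16, 25, 36, 49, 64]
print(acc)
```

The first loop is what gets offloaded to GPU-like hardware; the second is the kind of task that stays on the CPU no matter how many cores you throw at it.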
The SERVER uses Linux; the PCs that access it are ALL WinNT based.
*Works on contract for the DoD*
*Worked for the NSA for 3 years as a Computer Engineer*
I stand by what I said...I was at Ft. Meade, MD for a long time, and not on contract; I was a US DoD employee. I think I know what they were doing...my sub-unit was one of the first to go to RH desktop back in 2000. Nearly all of Ft. Meade was scheduled to be converted by the time I left in 2001.
Few details are known about the CPU in the Xbox, but I think it's safe to expect about the same performance as the PS4. The PS4 has a CPU that delivers 102.4 GFLOPs. For the sake of comparison, the i7-3770K delivers 112 GFLOPs. Moreover, the PS4 has an HSA design, which means the GPU can assist the CPU under heavy loads, unlike a traditional CPU made by Intel, which bottlenecks under heavy loads.
The issue with offloading the OS itself is one of latency: nothing kills the user experience like an OS that takes half a second to respond to the user. So I doubt the core OS could be offloaded; more likely, individual tasks get offloaded. And in the case of games, the GPU is already going to be overworked.
If we COULD run an OS on a GPU with acceptable performance, CPUs would be dead by now, given how GPUs are orders of magnitude more powerful than CPUs already.
You consistently dismiss the ARM chip designed to run the OS...why?
@gamerk316
For the record, aside from one statement from one official, there has been ZERO talk of the ARM chip. I'm 99% sure it doesn't exist at this point. Never mind that you would need an OS that could support two totally different instruction sets (CISC versus RISC) on the fly, which no one has come close to doing. Never mind that the kernel for such an OS would be HUGE. So...no.
That official just happened to be the project lead and the head of PS4 development himself...what further proof did you want? Google Mark Cerny's name...he is THE man to talk to about the PS4...he was behind designing it. Who better than him would know what's in it?
EDIT: Where are you getting the idea that two different instruction sets can't interact? x86 and RISC devices interact daily...mobile devices all over the world talk to x86 machines...how do you find this so difficult?
Additionally...who says the OS is going to be doing any scheduling or interfering with the CISC architecture anyway? This isn't a PC...though it's very similar, and versions of the Linux kernel can run perfectly fine on different architectures.
Released in late 2011, ARMv8 represents a fundamental change to the ARM architecture. It adds a 64-bit architecture, dubbed 'AArch64', and a new 'A64' instruction set. Within the context of ARMv8, the 32-bit architecture and instruction set are referred to as 'AArch32' and 'A32', respectively. The Thumb instruction sets are referred to as 'T32' and have no 64-bit counterpart. ARMv8 allows 32-bit applications to be executed in a 64-bit OS, and for a 32-bit OS to be under the control of a 64-bit hypervisor.[1] Applied Micro, AMD, Broadcom, Calxeda, HiSilicon, Samsung, ST Microelectronics and other companies have announced implementation plans.[49][50][51][52] ARM announced their Cortex-A53 and Cortex-A57 cores on 30 October 2012.[23]
What is up with TH and retarded testing methodology but anyways my point is TH has the 780 well below the Titan, while other sites including TPU have the GTX 780 on par or slightly quicker which was what I was told last week and it subsequently turns out that the 780 is about par with the Titan and thats going to piss a lot of people off I did say on Titan's lauch wait for the 780 and save $400. Nvidia is a mess.
Well, I like TechPowerUp because they test with a LOT of games. And it is behind the Titan, and only 16% faster than the 680. Move along, nothing to see here. Now I want to see my potential future card, the 9970.
Although the Xbox One is using a more efficient GPU than the PS4, the PS4 is going to have a stronger GPU, probably by around 10-15%. Not only that, but the PS4 uses GDDR5 memory versus the Xbox One's DDR3, which would cut the Xbox's memory bandwidth roughly in half. Lazy ports aside, the PS4 is going to be clearly more powerful.
More like 50%. 12 CU (Xbox) vs 18 CU (PS4)
As reference the HD 7850 has 16 CU, and 7870 has 20 CU.
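The 50% figure follows directly from peak-throughput arithmetic: each GCN CU has 64 shaders doing 2 FLOPs per cycle (fused multiply-add). Assuming both chips run at the same clock (800 MHz here is an assumption; clocks weren't confirmed), the CU counts alone set the ratio:

```python
# Peak GCN throughput in TFLOPS: CUs * 64 shaders/CU * 2 FLOPs/cycle (FMA) * clock.
# The 800 MHz clock is assumed; only the CU counts (12 vs 18) were known.
def gcn_tflops(cus, mhz=800):
    return cus * 64 * 2 * mhz / 1e6

xbox = gcn_tflops(12)  # ~1.23 TFLOPS
ps4 = gcn_tflops(18)   # ~1.84 TFLOPS
print(f"PS4 advantage: {ps4 / xbox - 1:.0%}")  # PS4 advantage: 50%
```

Note the clock cancels out of the ratio, so the 50% advantage holds at any shared frequency; it only changes if the two chips are clocked differently.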
They're claiming they'll deliver ARM Opterons to commercial customers in Q1 2014...why wouldn't they be able to do it in a console on a slightly tighter timeframe? (Q4 2013)
Where does it say Q1? It says " AMD begins production... in 2014". Recent history shows production to availability at 5-6 months.
Also, consider that even the US Government has conceded Linux is worthy by switching over. Many public- and private-sector entities are switching because it's more efficient and has better security than Windows.
Not just no, but hell no. The US Government most certainly isn't using Linux as a desktop OS, and NT (Server 2003/2008/2012) is the prevalent server OS. You only see Linux in net-appliance-style devices (McAfee security devices, RPAs, IDS, and various other utility systems) or where they deploy ESXi. For heavy processing, Solaris is the preferred choice, previously on SPARC, but they're now moving to x86 due to cost. This is pretty much the same thing you see in corporate America, and for a very good reason.
Microsoft provides an extremely wide range of management services and solutions, along with best practices and a very robust credentialing system. That all works to reduce the man-hour requirements for administering IT services, which is the largest driver of IT cost. A handful of college students with Linux knowledge may be able to provide services for small businesses, but that absolutely doesn't work for corporate solutions. The predominant Linux for big corporate environments isn't Ubuntu but RHEL and its twin sibling CentOS. Anyone planning on working with Linux in big IT needs to be intimately familiar with RHEL.
In July 2001[1] the White House started moving their computers to a Linux platform based on Red Hat Linux and Apache HTTP Server.[2] The installation was completed in February 2009.[3][4] In October 2009 the White House servers adopted Drupal, an open source content management system software distribution.[5][6]
The United States Department of Defense uses Linux - "the U.S. Army is “the” single largest install base for Red Hat Linux"[13] and the US Navy nuclear submarine fleet runs on Linux.[14]
In April 2006, the US Federal Aviation Administration announced that it had completed a migration to Red Hat Enterprise Linux in one third of the scheduled time and saved 15 million dollars.
The US National Nuclear Security Administration operates the world's tenth fastest supercomputer, the IBM Roadrunner, which uses Red Hat Enterprise Linux along with Fedora as its operating systems.[34]
In June 2012 the US Navy signed a US$27,883,883 contract with Raytheon to install Linux ground control software for its fleet of vertical take-off and landing (VTOL) Northrup-Grumman MQ8B Fire Scout drones. The contract involves Naval Air Station Patuxent River, Maryland, which has already spent $5,175,075 in preparation for the Linux systems.[45]
The government hasn't what? If the White House, the DoD, all military installations, and congressional offices use Linux...then what part of the government is left that wouldn't be using it?
Even the major stock exchanges run Linux:
The New York Stock Exchange uses Linux to run its trading applications.[86]
The London Stock Exchange uses the Linux based MillenniumIT Millennium Exchange software for its trading platform and predicts that moving to Linux from Windows will give it an annual cost savings of at least £10 million ($14.7 million) from 2011-12[87][88]
EDIT: Of course it's RHEL, and they were changing the DoD over to Linux in pieces before I stopped working for them 13 years ago...that was the "test bed" for the rest of the government offices, to my knowledge. The logic was that Linux is free and can be shaped to their security requirements much more easily than Windows.
"It's been coming" has been said for 20 years now; it's still not here. And you should go reread what was posted: I made a special note that the anomaly in Linux adoption is web servers, due to how much better Apache is than IIS. Though if you want real web-app power, you're talking about something like Oracle WebLogic (they bought BEA WLS), though that can get crazy expensive with all the components needed to make it work.
Linux will never see desktop prevalence in corporate IT, and definitely not on DoD systems. I can speak with the utmost authority on the DoD side. The central issue is automated management, something RHEL (CentOS) is so far behind in that it's just not a contest. Thing is, Linux is not a full operating system; it's only a kernel and a set of standards for inter-compatibility. RHEL, CentOS, SUSE, and Ubuntu are the actual operating systems as used in practice. Due to their open nature, there is limited incentive for heavily monetized R&D work. You don't spend 11 million USD developing a solution just to have your competitor swipe it and use it for free. The Linux community demands openness and thus creates its own barrier to growth. It takes a company like Apple to create a feasible, mass-marketable product out of "open source" (BSD in their case).
Anyhow, the cost of the "OS" is absolutely nothing compared to the cost of support, Installation & Engineering (I&E), and Operations & Maintenance (O&M). If anything, you can claim that the concept of Linux scared the piss out of MS and forced them to develop solid, cheap (relatively speaking) solutions for the I&E and O&M components of IT. SCCM is a good example of that in practice.
The DoD, and particularly the intelligence community within the DoD, has more programmers than anyone else in the world. They can make modifications that are not open-sourced, for national security reasons. Even Linux developers can't argue with that.
Additionally, they were already using Linux desktops in many areas of the DoD 13 years ago.
What makes you think they're not now? Most of them were on RH back then...which I would imagine hasn't changed, or if they're bothering with a newer distro it's likely Fedora (for obvious reasons).
The government has to pay its coders/programmers/IT guys anyway...they're all salaried. So whether they work 40 or 100 hours per week makes no difference; the pay is the same. They have the manpower, and the man-hours, to do whatever they want with any Linux distro they want.
You're talking to someone who was in the intelligence community working on hardware...
Additionally, if you think the White House is still running Windows exclusively...you'd be wrong. The majority of desktops in the White House run Linux. The president's PC may be Windows, or something like that, but the staff have transitioned since the Bush administration. That was one of the more "progressive" things Bush did for the country (it also helped trim quite a bit of fat from the budget...but I digress).
IPv6 connectivity from anywhere. Only a few of the DREN sites planned to support IPv6, yet the IPv6 pilot wanted to offer IPv6 connectivity to the entire HPCMP user community, including users at sites that only supported IPv4. Providing connectivity was complicated by the variety of operating systems on the users' desktop computers, which included versions of Linux, UNIX, and a lesser number of PCs on Microsoft Windows and Apple OS X. Connectivity was provided by installing a pair of Hexago Gateway6 tunnel brokers at a total cost of less than $70,000, one for users at IPv4-only DREN sites, and one for users on the Internet.
The intelligence community is wed to Sybase, Solaris 10, and OWLS. I know this not from some far-gone age but from what I do on a day-to-day basis; they're my primary customer. The ops community is wed to Oracle, Solaris 10, and OWLS. Both groups use SPARC hardware, but due to cost are now exploring x86 alternatives; lots of recoding of their core apps needs to be done before that's possible. Even the logistics community prefers Solaris for their backend servers, with an NT client.
The DoD is most definitely not using Linux for desktop use (and won't anytime soon). It's MS for core services plus MS clients, with everything else being a PM-controlled product. I know for a fact that the intel and ops communities are using XP SP3 as their current baseline client OS due to software compatibility with GCCS & JOPES. The service branches themselves have mostly moved to Vista and are now transitioning to Windows 7. If you had a CAC and actually worked in that environment, you would be able to go download the AGM and pre-STIGed images. The security accreditation and requirements pretty much rule Linux out as a desktop product. Code openness has never been an issue for the DoD, as they can get access to any manufacturer's code (yes, even MS's) to determine FIPS 140 compliance, among other things (technically the NSA does this, not the DoD).
Anyhow, that's a discussion WAY outside this thread. Just know that you've done the equivalent of walking into MS HQ and proceeding to tell their developers how their code works.
Although the Xbox One is using a more efficient GPU than the PS4, the PS4 is going to have a stronger GPU, probably by around 10-15%. Not only that, but the PS4 uses GDDR5 memory versus the Xbox One's DDR3, which would cut the Xbox's memory bandwidth roughly in half. Lazy ports aside, the PS4 is going to be clearly more powerful.
The way Sony put it, the GPU in the PS4 is divided into two parts: 14 CUs for graphics workloads and 4 CUs for general compute. The PS4 will likely be much more powerful in general compute for games, but not much faster in terms of pure graphics. The ESRAM does give the Xbox slightly better efficiency when dealing with compute, but it's unlikely to offset 4 GCN CUs dedicated completely to compute.
The memory bandwidth is the least of the Xbox's problems. It has real-time texture compression and decompression, plus data move engines to manage data in main memory and ESRAM at the same time, so the bandwidth will be completely sufficient. It might even exceed the PS4's in some situations. The framebuffer and dependency data can sit in ESRAM while compressed textures are streamed from main memory. The combined bandwidth of the ESRAM and DDR3 is probably not far from the operational bandwidth needed for something like this.
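A quick back-of-envelope calculation shows how close the two machines land on paper, using the widely reported (not officially confirmed at the time) specs: both use a 256-bit bus, with the PS4's GDDR5 at 5500 MT/s, the Xbox One's DDR3 at 2133 MT/s, and ~102 GB/s often cited for the ESRAM:

```python
# Peak memory bandwidth in GB/s = (bus width in bytes) * transfer rate.
# Specs below are the commonly reported figures, not confirmed numbers.
def bus_gbps(bus_bits, mt_per_s):
    return bus_bits / 8 * mt_per_s / 1000  # bytes/transfer * MT/s -> GB/s

ps4_gddr5 = bus_gbps(256, 5500)   # 176.0 GB/s
xbox_ddr3 = bus_gbps(256, 2133)   # ~68.3 GB/s
xbox_esram = 102.0                # reported ESRAM bandwidth, GB/s

print(ps4_gddr5)                  # 176.0
print(xbox_ddr3 + xbox_esram)     # ~170.3 combined
```

So if the ESRAM can actually be kept busy alongside main memory, the combined figure (~170 GB/s) sits close to the PS4's 176 GB/s, which is the argument being made above; the DDR3 alone, though, is well under half the PS4's bandwidth.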
The Xbox One just lacks the extra GCN cores that are in the PS4, and this will probably prove to be what drives the PS4's first-party games.
We don't know the GPU clock speeds yet. They should be similar, but it could be that the PS4 runs at 800 MHz and the One at 1 GHz, offsetting some of the performance difference; even if the clocks do differ, though, it's unlikely to be by more than 50 MHz.
What is up with TH's questionable testing methodology? Anyway, my point is that TH has the 780 well below the Titan, while other sites, including TPU, have the GTX 780 on par or slightly quicker. That's what I was told last week, and it subsequently turns out the 780 is about on par with the Titan, which is going to piss a lot of people off. I did say at the Titan's launch to wait for the 780 and save $400. Nvidia is a mess.
Well, I like TechPowerUp because they test with a LOT of games. And it is behind the Titan, and only 16% faster than the 680. Move along, nothing to see here. Now I want to see my potential future card, the 9970.
We have seen cases where the Titan was only 5% faster than the 7970 GE, and others where it's around 20-25% faster at 60% more cost. The aggregate median puts the 780/Titan somewhere around 15% over the 7970 GE, excluding the expected gains from AMD's 13.5 drivers, which they say bring roughly 7% performance gains across the line. From AMD's own site, the pre-test of 13.5 has the 7790 now mixing it with the 650 Ti Boost and 7850 1GB, while at the top the 7970 GE is about 8% faster than the 680. I can't see AMD being in a rush for the next gen. They now know that GK110 was never that all-conquering GPU, despite packing double the resources of Tahiti; it's now about executing and delivering a complete knockout. In particular, this generation AMD's driver improvements far outpaced Nvidia's, and we have seen chief driver writers leave Nvidia, and we all know what drivers do for performance.
Anyone else extremely disappointed in Jaguar? Competing in laptops at $350-500 is a horrible idea. And to think this is what's going to be in the next consoles. Man.
that's unusual. jaguar, disappointing? iirc brazos was so good (for the form factor it was built for) that oems put it into ultraportable laptops instead of just netbooks. its igpu was way better than then-available atom socs. amd still made enough money (50 mil. brazos-powered devices) to have it become their most successful launch. if anything, it's fx that is disappointing (and just when you thought it stopped disappointing, it sinks to new lows).
the console wins are actually a success. their approach to designing jaguar paid off in a big way. they can put it in semicustom devices as well as their own platforms. can bulldozer/piledriver double-dip like that? :lol: