AMD CPU speculation... and expert conjecture

Status
Not open for further replies.

8350rocks

Distinguished


An i3 will not play Dragon Age: Inquisition. At all. Period. The system locks up with 100% load on both cores.

 

szatkus

Honorable
Jul 9, 2013


MSAA 4x has overhead comparable to playing on a 4K monitor without AA.



Incidentally, Far Cry 4 has the same issue. It's suspicious...
Maybe it will be fixed shortly, but not by BioWare :>
 

szatkus

Honorable
Jul 9, 2013


Actually, people with an i3 play DAI without issues...
 
See comments here:

http://www.pcper.com/news/Processors/Far-Cry-4-Does-Not-Support-Dual-Core-Processors-Budget-Landscape-Shifting

If you don't know what you're talking about and haven't tested it yourself, then stop defending Ubisoft for making everyone buy a so-called 64-bit game with ridiculous minimum requirements, for what is really a 32-bit single-core game.
Let me give you some hints here, in case you haven't noticed:

1st: The win_32 folder contains the actual data for the game itself, mostly the graphics. If it were supposed to be 64-bit, then why the win_32 reference folder name?

2nd: The game installs into the Program Files (x86) folder, a clear sign it is a 32-bit program (the exe file is 64-bit, but I'll come to that later). If it were an actual 64-bit game it would go into Program Files (without the x86).

3rd: While exploring and examining the 64-bit exe files, I noticed there was a second, forced PE reference in there. Judging from that code, it is a 32-bit PE reference used to continue loading the necessary files, which are all 32-bit.

4th: Everything this game does shows that it is a bloody 32-bit game; all the dependencies it loads or uses are 32-bit DLLs, and the additional data code is 32-bit.
Naming a file or DLL with a 64 in it does not mean it is actually 64-bit. I analyzed those files too, and they are all 32-bit.

5th: It uses the old Far Cry 2 engine, which is again 32-bit. I'm not judging the age of that engine; they have adjusted some things in there, but again it is all 32-bit code.

6th: Because this game is actually 32-bit, and they tried (and failed, big time) to optimize it for a 64-bit machine, they were forced to use the old single-core RDTSC timing function and pin the workload to a single core (in this case core 3), with the even bigger scam of demanding a minimum requirement of 4 cores to hide their mistakes.
If they did not use this single-core RDTSC operation, the game would have frame drops, lag, freezes, huge glitches and slow-motion effects, due to the lack of optimization on their part of their 32-bit game.

Don't believe me? Try using IDA Pro, PE Explorer or Dependency Walker to check what the program needs to run, plus some other program analyzers.

Ubisoft could not even tweak and optimize a 32-bit application, let alone write and optimize a 64-bit one, and they tried to get away with it by obscuring things with fake file naming that looks like 64-bit DLLs or add-ons on the outside but is 32-bit on the inside, and by hiding their 32-bit exe inside a 64-bit exe file.

Ubisoft, SHAME ON YOU. Do you really think we would not notice it? REALLY? I'm not the only one who saw this, and others are examining your game code too!!!!

Which highlights the artificial four-core requirement. If this checks out (I plan to confirm it myself; the programmer in me is appalled), then Ubi violated all good programming practices in this title.
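The bitness claims above can be spot-checked without a full disassembler: the CPU a Windows executable targets is recorded in the Machine field of its COFF header, right after the "PE\0\0" signature. A minimal sketch in Python, standard library only (the file path in the usage comment is illustrative, not the game's real install path):

```python
import struct

# Machine values from the PE/COFF spec (IMAGE_FILE_MACHINE_*)
MACHINES = {0x014C: "x86 (32-bit)", 0x8664: "x64 (64-bit)", 0xAA64: "ARM64"}

def pe_machine(data: bytes) -> str:
    """Return the target machine of a PE image, given its raw bytes."""
    if data[:2] != b"MZ":
        raise ValueError("not a PE file (missing MZ header)")
    # e_lfanew at offset 0x3C points to the "PE\0\0" signature
    (pe_off,) = struct.unpack_from("<I", data, 0x3C)
    if data[pe_off:pe_off + 4] != b"PE\0\0":
        raise ValueError("missing PE signature")
    # The COFF Machine field is the 16-bit word right after the signature
    (machine,) = struct.unpack_from("<H", data, pe_off + 4)
    return MACHINES.get(machine, hex(machine))

# Usage (path is illustrative):
# print(pe_machine(open(r"C:\Games\FarCry4\bin\FarCry4.exe", "rb").read()))
```

Running this over the exe and each DLL in the install directory would settle whether the binaries are genuinely x64 (0x8664) or x86 (0x014C), independent of folder names.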
 

8350rocks

Distinguished


Show me.
 
As an add-on to the above: if this user is right and Ubi is forcing their workload onto core 3, then a simple hack should allow FC4 to run on any CPU, by simply changing the constant to some other value (say, 0, the first core). In any case, FC4 would run like crap if ANY OTHER THREAD IN THE SYSTEM tried to use core 3, since it forces the workload onto that core. If this checks out, Ubi really should be thrown to the wolves for atrocious coding practices.

Seriously, I'm dumbfounded anyone would code this way. Even college grads should know better than this.
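The hack described above amounts to handing the OS a different CPU affinity bitmask. A sketch of how such a mask is built (this is the value that Windows APIs like SetProcessAffinityMask, or the `start /affinity` command, consume; the core numbers are examples):

```python
def affinity_mask(cores):
    """Build a CPU affinity bitmask: bit i set means the process
    may run on logical core i."""
    mask = 0
    for core in cores:
        mask |= 1 << core
    return mask

# A process pinned to core 3 alone has mask 0b1000 (= 8); moving it to
# core 0, per the "change the constant" hack above, means mask 0b0001.
```

On Windows the same value can be applied from the command line, e.g. `start /affinity 8 game.exe` pins to core 3, which is also how players commonly experiment with core-count behaviour.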
 


Probably an i3 with HTT (4 logical cores). See my post above on FC4.
 

szatkus

Honorable
Jul 9, 2013


http://cdn.overclock.net/5/51/500x1000px-LL-513b8db4_http--www.gamegpu.ru-images-stories-Test_GPU-RPG-dragon_age_inquisition-test-DragonAgeInquisition_proz_proz.jpeg

http://www.youtube.com/watch?v=D4QDXIXw3gA

Oh, and I met a guy with an X3; it doesn't work. So we can say that at least 4 VIRTUAL cores are required.
 

szatkus

Honorable
Jul 9, 2013


The question is: why?

Two different companies, two different games, same issue, very similar release date.

Waiting for conspiracy theories...
 

juanrga

Distinguished
BANNED
Mar 19, 2013


And a couple of days later... it's made official:

Samsung partners with GlobalFoundries to fabricate Apple chips

The collaboration with GloFo has not only increased volume production, but has also "lowered costs".

Moreover, I can add that AMD will do a custom ARM SoC, based on K12, for future Macs. Just saying it...
 

juanrga

Distinguished
BANNED
Mar 19, 2013


The DDR4 standard mentions DIMM modules up to 2466MHz. The A10-7850K officially supports up to DDR3-2133.

The future DDR4-2466 modules bring about 16% more bandwidth than the current DDR3-2133 modules. This would partially alleviate the memory bottleneck of the A10-7850K, but it is not enough for future APUs.
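The ~16% figure follows directly from the transfer rates; a quick check, assuming a standard 64-bit (8-byte) memory channel:

```python
ddr3, ddr4 = 2133, 2466          # MT/s, per the post above
bytes_per_transfer = 8           # one 64-bit DDR channel

bw_ddr3 = ddr3 * bytes_per_transfer / 1000   # ~17.1 GB/s peak per channel
bw_ddr4 = ddr4 * bytes_per_transfer / 1000   # ~19.7 GB/s peak per channel
gain_pct = (ddr4 / ddr3 - 1) * 100           # ~15.6%, i.e. "about 16%"
```

These are theoretical peak numbers per channel; a dual-channel APU setup doubles both, but the ratio (and hence the ~16% claim) is unchanged.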

You have to think that DDR4 (despite being faster and more efficient than DDR3) is really optimized for capacity, whereas the new HBM standard is optimized for speed. If you combine both types of memory, DDR4 and HBM, you get both capacity and speed. This is what AMD will do with future designs.


 

blackkstar

Honorable
Sep 30, 2012


I don't think anyone should be taking anything Ubisoft says seriously. They are running around saying the only reason Watch Dogs on Wii U isn't selling is that the Wii U doesn't sell M-rated games. The problem? Bayonetta 2, an M-rated game, released in North America a little over a month ago and sold very well; it topped Nintendo's online store charts instantly. And as a bonus, Watch Dogs released a long time ago, so the people who wanted to play it have already bought it.

Ubi doing this wouldn't surprise me at all. In fact, people were expecting Ubi to wait on the Wii U port until after everyone realized the game was terrible and had played it somewhere else, so they could run around blaming the Wii U entirely for poor sales. The Wii U is bad for multi-platform games, and it's no surprise that Watch Dogs is selling poorly on it, since it's an old game that had a lot of issues and is on one of the worst platforms for multi-platform titles.

Also, 14nm coming first half of 2015. GloFo ahead of Samsung.
http://forums.anandtech.com/showthread.php?t=2409843
I think 14nm GPUs from AMD are not a distant dream, and some of you are vastly overestimating how long it will be until we see one.

Basically:
TSMC only has 20nm and it's terrible for GPUs
GloSung will have 14nm in first half of 2015

Now I understand why we have Maxwell when we do.
 

szatkus

Honorable
Jul 9, 2013


We are talking about GPUs, around 200-300 mm² chips. That won't happen until Q4 2015 at best.
 


This weekend, I'm going to force it to run on two cores, and report the results. Should be an interesting exercise.
 

cemerian

Honorable
Jul 29, 2013


It has already been done: http://youtu.be/iAzBJnkqZDw and it runs just fine on two cores.
 

juanrga

Distinguished
BANNED
Mar 19, 2013


On the 32nm node there is not enough space to fit 4 Piledriver modules plus L3 cache plus a GPU on the same die. On the 10nm node there is enough space.

I am not going to give further technical details here of the HSA APU announced by AMD for the extreme-scale supercomputer. Enough is enough. I will only say that the iCPU is more powerful than an Opteron 6366 SE (an 8-module dCPU)...
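The die-space argument can be sanity-checked with ideal scaling: transistor area shrinks roughly with the square of the feature size. Real nodes deliver less than this ideal, so treat the numbers below as an upper bound:

```python
def ideal_area_scale(old_nm, new_nm):
    """Ideal density gain moving between nodes: area ~ (feature size)^2."""
    return (old_nm / new_nm) ** 2

# Going from 32nm to 10nm would ideally fit ~10x the transistors in the
# same die area, which is why the extra modules + L3 + GPU could fit.
```

For comparison, even a 32nm-to-14nm move only gives ~5x under the same ideal assumption, so the choice of target node matters a lot for this argument.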
 


Don't you LOVE artificial requirements?
 

cemerian

Honorable
Jul 29, 2013


Indeed, what could be better?
 

noob2222

Distinguished
Nov 19, 2007
So if Apple chips are going to be made at GF, where does that leave AMD to be fabricated? I'm guessing Samsung's primary fab in Korea. Looks like the deal was to trade Apple for AMD. Better for AMD, IMO.
 

szatkus

Honorable
Jul 9, 2013


iToys will be released at least 6 months before any 14nm AMD part. It's even better for AMD that Apple takes the first few iterations.
 

noob2222

Distinguished
Nov 19, 2007
Apple has always wanted an entire dedicated line from TSMC; I'm guessing they might get that from GF. It gives GF a dedicated income and allows AMD to move away from GF, as I said.

If this is the case, it's a genius move by Samsung to dedicate GF's fabs to full production so they can't take new customers; not that GF will complain about being at 100% capacity (at least until Apple sues them for some random reason).
 