Is the AMD FX 8350 good for gaming?



 
All this arguing about which processor is superior, or whether AMD can keep up with Intel, is a moot point. Current Sandy/Ivy Bridge chips destroy all FX chips when both processors are at the same clocks. Why the same clocks? Because both have identical overclocking capabilities and therefore they should be compared as such. Most people don't buy an unlocked CPU to leave it at stock.

An i5 at the SAME CLOCK as an FX 8350 will absolutely smoke it in anything using four cores or fewer, and will most likely come in slightly behind, or even mildly ahead, in some multithreaded apps.

An i7 at the SAME CLOCK as an FX 8350 will DEMOLISH the FX in virtually everything outside of a very few select programs. And it will do all of this while using HALF the power and a POS $25 heatsink from Newegg. Meanwhile, smoke billows from the H100 attempting to quench the power-hungry beast of an FX chip.

Let's quit acting like these chips are even in the same league here.

 


OK, OC either one of those to 5.2-5.6 GHz... then talk about OC'ing potential being equal.

 


1) Comparing a stock FX to an overclocked Intel chip is biased.

2) The FX has superior overclocking capabilities. It is easier to overclock, more stable, and holds the world record for overclocking.

3) The rest of your points were addressed before as well.
 




I have said that in some CPU-bound games, like Civilization 5 and others, the FX-8350 can be noticeably slower. Even in Hitman: Absolution, which is quite a recent game, it was slower. Nobody is saying that the FX-8350 is a bad processor or anything. Without a doubt the most important thing for gaming in general is the GPU, but in some CPU-heavy games the difference between an FX-8350 and an i5 3570K is noticeable. Even in games like Borderlands 2 the FX-8350 lagged behind: http://www.xbitlabs.com/images/cpu/fx-8350-8320-6300-4300/borderlands2.png. So if the GPU is common to both setups, then the i5 3570K definitely looks the better option for gaming overall as of now. For most games there won't be any noticeable difference, but for some, like Skyrim, the i5 3570K would be better. That's nothing to be selective about; the thing exists, so it exists. The FX-8350 doesn't perform well in some games, that's it. There are enough guys with 120Hz monitors as well. You are quoting Tom's; let me quote one:
"At the end of the day, AMD still has work to do in improving game performance. But Piledriver certainly does help rectify the slide backward we saw Bulldozer taking relative to some of AMD’s previous quad-core parts in processor-bound games."

About that thread, check it out here: http://forums.anandtech.com/showthread.php?t=2294486
Idontcare is pretty reliable. In rendering on 3DS Max, which pushes all cores to maximum, the i5 3570K was able to keep up with the FX-8350; that was the point, to show how an OCed i5 3570K can keep up with an OCed FX-8350 even on something that scales so well across cores. One just can't ignore the lightly threaded performance. It is not as if the i5 3570K is a lame processor just because it has only 4 cores. It performs quite well with what it has, and it consumes a lot less power when overclocked, though power consumption might not be a concern here.

World record!! Not practical at all. How many people use LN2 for daily stuff? Both Ivy i5/i7 K chips and the FX-8350 overclock fairly well. The problem with Ivy processors is the heat produced, which limits their OC potential; it is not as good as SB processors because Intel cheaped out on the TIM. Still, Ivy i5/i7 chips mostly do 4.7-4.8 GHz on a good board. By the way, though it is not something that would concern everyone, at higher clocks the FX-8350 eats much more power than an Ivy i5/i7 at the same clocks. http://www.tomshardware.com/reviews/build-a-pc-overclocking-do-it-yourself,3366-15.html does show that the i5 3570K, when overclocked, is very good, and that the FX-8350 and i5 3570K traded blows in some tests.

[image: Total.png (overall results chart from the Tom's overclocking article linked above)]

Quotes from that page on Tom's: "When it comes to overclocking though, Intel extends its lead with significantly lower power consumption and much better performance. If we were measuring efficiency, that'd be a home run."
"It'd be a great experiment, and we might even play around with it in the future, but it's clear that Intel's Core i5-3570K remains the better choice for overclockers in this price range."

For the things tested, the i5 3570K was overall better when both were overclocked. Intel chips are far more sensitive to frequency than the FX-8350 is, so simply talking about clocks won't do; IPC is a thing as well. The so-called "superior clocks" are just not the whole story; it also matters how much performance a chip gains from those clocks.
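
To see why raw clocks alone can mislead, here is a rough sketch of the usual performance ≈ IPC × frequency rule of thumb (a minimal illustration only; the IPC and clock numbers below are made-up placeholders, not measured values for either chip):

```python
# Rough single-thread model: throughput ~ IPC * frequency.
# The IPC figures are hypothetical, chosen only to illustrate the trade-off.
def relative_perf(ipc, ghz):
    return ipc * ghz

fx = relative_perf(ipc=1.0, ghz=4.8)  # assumed baseline IPC, higher OC ceiling
i5 = relative_perf(ipc=1.5, ghz=4.5)  # assumed ~50% higher IPC, lower clock

print(fx, i5)  # 4.8 vs 6.75: the lower-clocked chip can still come out ahead
```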


ericjohn004 has explained things quite well. I think you are just not ready to look at the other side. He and I were talking about both sides of things, but you just don't seem to acknowledge what is on the other side. I do not support, nor am I loyal to, any company. I would pick whatever is the best value available within what I can spend, best suited to the things I would do. We all want AMD to do well, simply because it would result in more competition and better-priced products from both sides. But that doesn't mean we should not look at the overall picture now. I have made my points in context, period.
 



1) What you really said is quoted above. Borderlands 2 is not using all the potential of the FX, and still it was only slightly slower than the expensive i7-3770K.

[image: borderlands-2-results.jpg (Borderlands 2 benchmark results)]


Hitman is not using all the potential of the FX either; although you present it as new, "new" does not mean "parallelized for eight cores".

The point here is that FX is good enough for gaming and you are not going to change that by posting here.

2) It is good that you quote Tom's now, but are you aware that the quote you gave is a bit old? In their CPU gaming hierarchy chart (updated to March 2013), Tom's puts the FX near the top and says, in the text, that upgrading from an FX-8350 to an expensive i5-3570K/i7-3770K does not offer an overall gaming benefit. You and another user pretend otherwise, but in doing so both of you are avoiding Tom's own advice.

3) As explained before, Idontcare's comparison was biased on every possible point. No, 3DS Max is not taking all FX cores to maximum. Where did you get that exaggeration? In any case, the same forum that you link says it clearly:

So in these 2 workloads 8350 is actually on par or faster than 3770K

[image: IMG0039194.png (3DS Max rendering benchmark chart)]

[image: IMG0039193.png (3DS Max rendering benchmark chart)]


You claim that "the i5 3570k was able to keep up with fx8350", yet the data shows otherwise. Moreover, the same 3DS Max review says that the new FX chip allowed AMD to beat the entire range of LGA 1155 Intel chips:

"Ceci permet à AMD de prendre les devants sur toute la gamme LGA 1155." (Translation: "This lets AMD take the lead over the entire LGA 1155 range.")

4) Who said that everyone would be setting a world record at home? The world record shows that the design of the FX chip is superior to that of the Intel chips. A more solid design is the reason it is so easy to overclock an AMD FX beyond Intel chips.

5) You quote Tom's overclocking review of the FX-8350, but once again you forget to quote the relevant info. That review was made using older software, favourable to Intel chips such as the i5. Tom's writes, on the same page that you quote:

This will likely change as we fold more heavily-threaded tests into the Marathon, starting this quarter.

Moreover, you also forgot to say that the default Intel builds used stock memory, while the default FX build used under-clocked memory, thus lowering its performance.

The point is: if you restrict yourself to older software, or to one application at a time, the i5 looks the better option. If you run several applications at once and/or modern optimized software, the FX looks better.
 


Until you start playing something like Skyrim and quite a few others. Then you'll notice a difference. But I get your point. Still, you can't assume the CPU doesn't matter at all. If that were the case, then Tom's wouldn't publish "Best Gaming CPUs for the Money". And the 3570K is the best one available.
 


You like to quote Tom's Hardware, right? Well, I've got quite a few things Tom's says about the 8350 RIGHT HERE.

Dude, seriously, just stop. I've never seen the word "biased" used so many times in my life. I GET YOUR POINT! MY point is that the FX is still kind of lacking as an OVERALL great chip because of its weak single-threaded performance. It's missing that.

To say that Tom's Hardware or any other site is "bottlenecking" an AMD CPU just because they used 1600 MHz memory is absolutely ludicrous. Just read the Tom's review of the FX-8350. It clearly states that they used 1600 MHz memory because they tested with BOTH 1600 MHz and 1866 MHz and got the same results with BOTH. And when they didn't get the same results with both, they said so. And the only benchmark where that happened was, guess what? MEMORY BANDWIDTH! So they DID use both, but only got different results in one test. And if you don't believe Tom's, then IDK what to tell you. Tom's even has an entire article dedicated to proving this myth false. You could believe the facts, but sometimes I guess you choose not to. You like to quote Tom's Hardware, but it doesn't seem like you even read the 8350 review they did. If you quote Tom's as if they are 100% accurate, then why don't you believe what they say about the 8350 in this review? Seems like you're cherry-picking your talking points, eh?

I've seen reviews that completely diss the AMD FX-8350. I've seen reviews that say the 8350 "beats" a 3570K. I've seen a whole lot more that say the 3570K "wins". The only review site I trust is Tom's Hardware, and they give a VERY fair review of the 8350. Clearly, its gaming/single-threaded performance is not on par with the 3570K's. And gaming/single-threaded performance is at least 65% of the whole picture.

Multithreaded performance is BETTER on an 8350 than on a 3570K, but a 3570K's multithreaded performance is still very good nonetheless, especially if you match them up clock for clock. So a 3570K doesn't lack at anything. So MY POINT is that a 3570K is a more well-rounded chip, because it's good at everything. And a 3770K is good at EVERYTHING, and is also better than the 8350 at multithreading, as per Tom's Hardware. And since you're such a fan of Tom's, and because you like to quote Tom's, then Tom's must be right about this, RIGHT? By your own logic, Tom's should be right, RIGHT?

Tom's has ALREADY dispelled the rumor that the 8150s and 8350s perform that much better with 1866 MHz memory. Why are you even arguing this point when it's clearly NOT true? The research has been done, the facts are out, you're NOT right. You're WRONG. Stop pushing all this 8350 propaganda when it's clearly not true as far as Tom's Hardware is concerned. I just went from 1600 to 2133 and I saw a 3% performance gain. 3%! Going to 1866 MHz would most likely be about 1.5%. So OK, I'll give the 8350 a 1.5% advantage next time I look at a benchmark.
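
(That 1.5% figure is just linear interpolation over the frequency step; a quick sketch of the arithmetic, assuming the gain scales with the MHz increase:)

```python
# Linear interpolation of the observed memory-speed gain.
# Observed in my own tests: 1600 -> 2133 MHz gave about a 3% gain.
gain_observed = 0.03
span_observed = 2133 - 1600   # 533 MHz
span_estimated = 1866 - 1600  # 266 MHz

estimate = gain_observed * span_estimated / span_observed
print(f"{estimate:.1%}")  # ~1.5%, the figure quoted above
```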

How can you possibly disagree with anything I've said? I know the facts; you probably know the facts too, but choose to only acknowledge the ones you like. Why hide from the truth? It'll set you free.
 


In the past you already commented on stuff you did not even read, which says a lot...

Now you choose to misinterpret me, to avoid the relevant parts of my above message, and to try an appeal-to-authority fallacy. Tom's analysis of memory scaling on the AMD FX was not correct (they even selected memory modules designed for a specific Intel chipset). Moreover, their test was of the Bulldozer architecture, not Piledriver. The facts are that people have noted increases in gaming performance from using faster memory with an FX-8350 chip.
 



This.
 


You should be banned for spreading misinformation and your biased thoughts and opinions. You should honestly be banned. How do you disagree with benchmarks?! How?!! It's evidence right in front of you and you still argue with it! WHY?! Why do you people not get it?

3570k>8350 for gaming

/thread
 


That's a gross overgeneralization...

In single-threaded games, 3570k > 8350.

In multi-threaded games, 8350 > 3570k in many instances...

What it boils down to is that the 8350 is about on par with a 3570K across a broad spectrum of games, and it is right on the heels of a 3770K in a lot of things outside of gaming.
 


Seriously? No, it's not. No legit benchmark exists that backs up your claims.

I've seen your posts, and all of them are filled with BS just like this one. You will never be able to prove your claim, because it's false. The 8350 is a good processor, but it doesn't touch the 3570k in gaming, and is destroyed by a 3770k. To say the 8350 is right on the heels of a 3770k proves how little you know and how biased you are.

Seriously, stop spreading misinformation and spewing out your biased thoughts.
 


Show me one Crysis 3, Far Cry 3, BF3, or Metro 2033 benchmark where the 3570K destroys the 8350. You can't produce one. A few FPS one way or another is less than the margin of error on the benchmark, so that is effectively a draw. You would need a greater than 10% difference in FPS to say that one is significantly better than the other, as the margin of error on ALL your benchmarks is 5-10%. So go ahead and spend the next few days hunting for a benchmark on any of those four new games that proves your "unbiased" point.
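
(To put that rule in concrete terms, here is a minimal sketch; the 10% threshold is the figure from this post, and the sample FPS numbers are invented for illustration:)

```python
# Treat an FPS gap as meaningful only if it exceeds the benchmark's
# margin of error (taken as 10% here, per the post above).
def significant(fps_a, fps_b, margin=0.10):
    return abs(fps_a - fps_b) / min(fps_a, fps_b) > margin

print(significant(62, 65))  # False: ~5% apart, within the error margin
print(significant(50, 60))  # True: 20% apart, a real difference
```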

Otherwise, go talk about the greatness of Intel elsewhere, where some fool who hasn't done the research might believe you.
 
To extend a bit on the memory issue.

As said above, Tom's tested the sensitivity of Bulldozer chips to memory speed, but they did not use AMD-optimized modules; they used four Intel-optimized modules. Moreover, at least two of the modules used in that test were explicitly designed for an Intel chipset (the X79) and failed at higher speeds.

Tom's only tested two games: DiRT 3 and Metro 2033. Both games are well suited to FX chips (the Piledriver 8350 runs them as fast as an i7-3770K), and thus they did not benefit from faster RAM. Going from 1600 to 1866 only generated about 1% more FPS.

However, there are games more sensitive to faster RAM, like the following,

[image: games.png (game FPS scaling with memory speed on the FX-8150)]


where going from 1600 to 1866 gives about 4% more performance on a Bulldozer FX-8150. This 4% increase is 2.7 times the fair estimate made by the well-informed poster ericjohn004.

Yes, 4% is little, but I have not said that it is the maximum increase possible. I have just shown how we can go from "no benefit" to 4% by selecting a different game. People report up to a 25% increase in frame rate from going from 1333 to 2133 in Civilization V in full-render mode. Assuming linear scaling, that gives about a 10% increase when going from 1600 to 1866 in Civ V. Faster RAM is also noticeable when running multiple apps at once, or in specific tasks such as transcoding.
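
(A quick back-of-the-envelope check of those numbers, under the same linear-scaling assumption stated above; the outputs are estimates, not measurements:)

```python
# Linear-scaling estimate: gain proportional to the MHz increase.
civ5_gain = 0.25  # reported for 1333 -> 2133 MHz in Civilization V
estimate = civ5_gain * (1866 - 1600) / (2133 - 1333)
print(f"Civ V, 1600 -> 1866: ~{estimate:.0%}")  # ~8%, in the ballpark of the ~10% above

# Ratio of the measured 4% FX-8150 gain to ericjohn004's 1.5% estimate.
print(f"4% vs 1.5%: {0.04 / 0.015:.1f}x")  # ~2.7x, as stated above
```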

It is not here, where overall I find good info, but in lots of reviews on the internet that insist on feeding the FX processors under-clocked memory, selecting applications not especially favourable to the FX side, using non-optimized memory modules, and skipping OS patches, plus a couple more tricks... and that is how you significantly change the conclusion of a 'review'.
 
8350rocks, I know you've said that the FX-8350 doesn't run as fast when it's not using 1866 MHz memory. And I trust what you say to be true. But you did say that while both of us were arguing, and sometimes we may have just said certain things to prove our point. And I'm not sure whether you said it to prove a point or whether you said it because it was really true.

From what Tom's says, an 8150 doesn't benefit from 1866 MHz memory. They say one benchmark gave 6% better results, but most showed nothing. They did an entire article on it. And if you read their 8350 article, they say they used both 1866 and 1600 for the review and only saw a difference in memory bandwidth, so they decided to stick with 1600 MHz to keep things equal. They ran a benchmark with 1066, 1333, 1600, and 1866, and performance got better with each step up in MHz, but it stopped after 1600 MHz: 1600 MHz and 1866 MHz showed the same results.

Now, me personally, I do know that higher-MHz memory gives additional performance. I just went up from 1600 to 2133 MHz, and not only did my RAM scores shoot up, I also got a 3% increase in most things I did with my CPU. That's like being clocked 0.1 GHz higher. Some benchmarks showed this increase and some didn't, so I'm happy with it. Therefore, I know that an AMD 8350 moving up from 1600 MHz to 1866 MHz will see a benefit in some benchmarks. But to me, that benefit should only be 1-2%, if that, and it should only show up in specific benchmarks that are memory intensive.
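
(That "0.1 GHz higher" comparison is easy to sanity-check; a tiny sketch, assuming the i5-3570K's stock 3.4 GHz base clock and performance scaling linearly with frequency:)

```python
# Express a 3% uplift as an equivalent clock bump at a 3.4 GHz base,
# assuming performance scales linearly with frequency.
base_ghz = 3.4
print(f"{base_ghz * 0.03:.2f} GHz")  # ~0.10 GHz, as estimated above
```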

So are the benchmark results really as drastically different as some people would like you to believe? Will faster RAM make the 8350 suddenly surpass the 3770K in a benchmark it would normally fall behind in? Or can this sometimes just be an 8350 fan's talking point? From what I've researched, it's basically just a talking point, because you should only see a 1-2% gain. And in gaming I see no difference with the different RAM. Not that I can physically see, anyway.

BTW, you won't see any BF3 or Crysis 3 or Metro 2033 benchmark where a 3570K destroys an 8350. The only games that do show a difference, as I'm sure you know, are games that use 4 cores or fewer and require a lot of CPU power. Like Skyrim.

Most games, especially at 1080p and higher resolutions, rely so much on the graphics card that the CPU doesn't matter as much. So basically, the higher the settings and the resolution you use, the less the CPU matters. If you're gaming at, say, 1200x900 on low settings in any game that's not brand new, you'll want a 3570K. Otherwise, it won't matter much unless you're playing something like Skyrim. When games can use 8 cores, the 8350 is actually really good, like in the 3DMark11 physics test.
 
FANBOYS!!! I'm sick and tired of your bickering... The FX 8350 is enough to maintain 60 FPS (even in Skyrim), and if you have a 60Hz monitor the difference between the i7 and the FX 8350 is negligible, because you have to turn on vsync anyway or you're going to get screen tearing. And the 8 cores make for a better multitasking experience. Even a stock FX 8350 with CrossFire 7970s on a 2560x1600 monitor is going to be pretty close to Intel in gaming performance. Also, clock for clock, INTEL IS FASTER. But performance isn't measured that way... If you were to measure performance 500 MHz past stock clocks on each system, that would be a fair benchmark. You guys have no idea what you're talking about if you're saying an i5 at 4 GHz pwns an FX 8350 at 4 GHz. DUH! No shiznit, Sherlock, that's the FX 8350's stock clock. And, if you didn't know, most benchmarks run the AMD processors at stock clocks, so you don't see that extra 200 MHz. I've seen multiple benchmarks done, and I've caught this.

The only time I would recommend an i5 over the FX 8350 is if you're going to be using a 120Hz monitor. And if you check out benchmarks beyond Tom's and Anand's, multiple sources show that the FX 8350 beats the i7 3770K in Metro 2033.

In case you guys didn't know... the horse is dead (and so was the thread). Stop trying to kill it.
 


Yeah, I agree with you there. Some people are totally unfair to the FX. I read this review yesterday, and for everything the FX did great at, they were like, "well, it seems to be OK at this, but look at this single-threaded benchmark!". So I see why, if you own an FX, you'd be completely pissed at some of the reviews out there.

When I actually read Tom's review of the 8350, it surprised the hell out of me. Not only was it literally nipping at the heels of the 3770K, like you say, in some of the benchmarks, but it wasn't as bad at single-threaded tasks as I had once thought. I reread the review yesterday since we were chatting about this, and I was surprised, to say the least. And Tom's does a very fair review. They don't even use SuperPi and some of the other stuff that's not optimized for AMD.

And like I said in my last post, I'm sure 1866 MHz memory does give the 8350 an improvement. I just don't think it's quite as much as some people would like to believe. Maybe it is, IDK, I haven't read as much into it as you have.

But yeah, the 8350 is really super impressive as far as multithreading goes. And in theoretical benchmarks like SiSoft Sandra (which I hate, it gives my 3570K low scores), the 8350 does really well. And for only $179.99, I can definitely see why people get this processor. I'm going to be doing a build for my little nephew. He wants a cheaper gaming computer, and I think I'm going to stick an FX-6300 in there. You absolutely can't beat the FX-6300 for only $129.99. Can't beat it.

 


That is a discrepancy that, as a hardware guy, I cannot explain... there should be an increase in performance there... unless the motherboard did not support memory speeds higher than 1600 MHz.

Now, me personally, I do know that higher-MHz memory gives additional performance. I just went up from 1600 to 2133 MHz, and not only did my RAM scores shoot up, I also got a 3% increase in most things I did with my CPU. That's like being clocked 0.1 GHz higher. Some benchmarks showed this increase and some didn't, so I'm happy with it. Therefore, I know that an AMD 8350 moving up from 1600 MHz to 1866 MHz will see a benefit in some benchmarks. But to me, that benefit should only be 1-2%, if that, and it should only show up in specific benchmarks that are memory intensive.

So are the benchmark results really as drastically different as some people would like you to believe? Will faster RAM make the 8350 suddenly surpass the 3770K in a benchmark it would normally fall behind in? Or can this sometimes just be an 8350 fan's talking point? From what I've researched, it's basically just a talking point, because you should only see a 1-2% gain. And in gaming I see no difference with the different RAM. Not that I can physically see, anyway.

In serial applications it makes less difference; in highly threaded applications it makes more difference... though your number is close enough to accurate. You should typically see no more than a 5% performance gain in highly threaded applications.

BTW, you won't see any BF3 or Crysis 3 or Metro 2033 benchmark where a 3570K destroys an 8350. The only games that do show a difference, as I'm sure you know, are games that use 4 cores or fewer and require a lot of CPU power. Like Skyrim.

I know, that's why I said what I did to trogdar or whatever his name was... his assertions were foolish and uninformed... so by inviting him to prove me wrong when I know he cannot, he will have to do his own research and LEARN the facts about what he is talking about rather than spewing trash like "i5>8350 /thread". Those comments were not addressed to you... they were addressed to him.

Most games, especially at 1080p and higher resolutions, rely so much on the graphics card that the CPU doesn't matter as much. So basically, the higher the settings and the resolution you use, the less the CPU matters. If you're gaming at, say, 1200x900 on low settings in any game that's not brand new, you'll want a 3570K. Otherwise, it won't matter much unless you're playing something like Skyrim. When games can use 8 cores, the 8350 is actually really good, like in the 3DMark11 physics test.

+1. THIS has been my point EXACTLY all along: to get people to do enough research to see reality. The difference between the two slightly favors one over the other in certain areas, but it's not enough to sweat the weak points of either side. In a perfect world, I would love to see more support for AMD, to keep the competition between the two going, but it seems some Intel fanboys need a dose of reality every now and again when they pop into a thread and spew BS about Intel's dominant supremacy... The facts are, the performance gains are mostly trivial, with some exceptions. I will grant you Skyrim as an exception, and I am sure there are a few others, but in general, buy what you can afford and try to buy the best VALUE for the money. You always get the most for your $ that way.
 


Most benchmark sites purposely try to isolate the CPU just to show you which CPU is better for gaming. But what they don't realize is that no gamer with a decent card is going to game at 1200x900 on low settings. And these benchmarks are the exact reason why most people, myself included, always thought the 3570K was dominant in gaming. But if you are a high-end gamer on a budget and don't care too much about games like Skyrim that are just not optimized for AMD, then I really don't think you'll ever notice that you have an 8350 and not a 3570K. The 3570K is technically "better", but it's not like you're going to notice while gaming at 1920x1080 on Ultra settings.

I used to think totally differently about this because of all the benchmarks I had read and how far ahead they all put the 3570K. But the more research I do, the more I find information like what I've said above.

 


You have realized that which the masses have not...

Benchmarks are a useful yet skewed interpretation of possibilities...

As I said to you once before, you don't spend a chunk of money on a gaming rig to run every game on super-low settings. You want to run them on max settings if you can... and in that realm, the difference is nitpicking in 90% of the cases out there...

The FX-6300 is a great buy... however, there is an FX-6350 coming in June that will increase the clock speed and tweak a few things on the FX-6300. If you can wait a few months on the CPU, it will come in at a similar price point, as the FX-series CPUs except for the 8350 will see a price decrease across the board.
 
Lol, my earlier post was directed at the earlier posts; I was a bit heated... Anywho, the memory controller on AMD CPUs has been terrible for quite some time... however, they're getting closer to having a good one like Intel's (hence the difference you see between the FX 8350 and the FX 8150). And like 8350rocks was saying, if you can afford a full upgrade in 12 months, the i5 or i7 would be worth it. But the difference isn't enough at this price point, IMO.
 


Check Borderlands 2 here: http://www.xbitlabs.com/images/cpu/fx-8350-8320-6300-4300/borderlands2.png
The GPU used was a GTX 680, so it's quite clear these are not bogus links.

That's a pretty lame comment you made by saying Hitman: Absolution is not using the FX-8350's full potential.
You expect every game out there to scale to eight cores at this point, really? Not every game engine is the same, and not every game is developed the same way. How many games from 2008-2012 are threaded well enough to scale across 8 cores?

And by quoting those 3DS Max benchmarks you tried to twist things in your favor again.
Let me correct you: Idontcare tested with 3DS Max 2013, and there the i7 3770K was the best of the three, while the OCed i5 3570K was as good as or better than the OCed FX-8350. And for your information, 3D rendering software can easily task all the cores to full. Just check out what Idontcare had to say about rendering in 3DS Max 2013. There is no exaggeration here.
The two pics you showed from that forum are of 3DS Max 2011.
Previewing things in the viewport, and many other things in 3DS Max, are lightly threaded, which benchmarks never show, so again the i5 3570K, with its all-round performance, is a pretty good choice. It would have been better if you had read that whole thread rather than just picking one thing, which was a benchmark on an older version of the software.
And that was only quoted because you implied that an OCed FX-8350 would be a lot better than an OCed i5 3570K.
How many people care about which achieves the highest clock? The i5 3570K can run at the 4.7-4.8 GHz level and is more responsive to frequency by design. Most people don't buy an unlocked processor to run it at stock.
In the software used in the 2013 System Builder Marathon, the OCed performance of the i5 3570K was a good 20-25% better than the FX-8350's. The gap would shrink if more multithreaded software were used, even though Tom's did a pretty good job by running many kinds of apps that many of us use on a daily basis.

Nobody ever said that the FX-8350 was a bad chip for gaming. It is just that in some cases/games it is not able to keep up with the i5 3570K (already-quoted cases like Civilization 5, Skyrim, etc.). In my country the FX-8350 setup costs much less than an i5 3570K setup, so if I were money-limited and my sole purpose were gaming, I would take the FX-8350 and invest the rest in a better GPU. But that is not the context here. If I were not money-bound and gaming were not important, then I would choose whichever of the FX-8350 and i5 3570K best suits the tasks I do.
The FX-8350 is a pretty solid value for the price, especially for video encoding and some other heavily threaded tasks. It has full instruction set support; it doesn't cut corners the way Intel did with their K-series processors, for example by omitting VT-d.
Per-core performance is not on par with the i5 3570K's. What the i5 3570K has is good all-round performance, i.e. good single-threaded as well as multithreaded performance. I don't think either of these chips can be called overall bad or anything.
ericjohn004's point about gaming and how it is tested on some sites is right. Many sites test at 1280x800 to show the CPU side of things, but that is not relevant to most people.



 
Wow, I just realized this thread has 4 freaking pages of comments and has been read almost 10,000 times. I hope some people out there can use all of our bickering and rage, but also some really quite useful information, to learn about the 3570K and the FX-8350.

This thread's going to be in the top 10 search results for "3570k vs. 8350".
 