This topic contains 28 replies, has 0 voices, and was last updated by PeanutsRevenge 9 years, 8 months ago.

  • Author
    Posts
  • #50132

    katjapurrs
    Participant

Finally! A thread that delivered exactly what it promised 😕

Besides, if my Athlon 1500 with MX4000/256 gfx and 7200RPM drives can push Scorched, then the numbers don’t mean much, do they?

    #50133

    PeanutsRevenge
    Participant

Just can’t ignore this any longer. I’ve tried, I really have, and I believe I have done well.

So what you’re saying then, cat, is that game developers do not develop games for 1680×1050 or higher resolutions, and that people using them are weird, messed-up individuals?

That people getting a high-end graphics card with a system to go with it, so £1500+ ($3000+), should run their games at 1024×768 on 17″ monitors, as they’d be odd otherwise?

Damn those devs for putting these high resolutions in the options menu and not putting a warning on them to say that you must be odd to use this… in fact, ‘hey, Americans, sue them!!!!!!’

    This coming from a guy who apparently has an SLI setup!

EDIT: I do agree that there are components out there which manufacturers release just to grab headlines with what they can produce, and that almost no benefit would come from using them.
These are sometimes just to get more money from idiots who believe the branding (come on, a 5200FX still carries descriptions of ‘blistering gaming performance in the latest games’ when, even in its day, it couldn’t achieve even adequate performance). However, these companies produce these products so they can refine the design for production in the mainstream at reasonable prices, having street-tested them already.

    #50134

    Laptops Daddy
    Participant

    @katjapurrs wrote:

Finally! A thread that delivered exactly what it promised

: ) does exactly what it says on the tin.

    @peanutsrevenge wrote:

Just can’t ignore this any longer. I’ve tried, I really have, and I believe I have done well.

So what you’re saying then, cat, is that game developers do not develop games for 1680×1050 or higher resolutions, and that people using them are weird, messed-up individuals?

    I think you did very well, peanut.

I’m pretty sure we’re each weird, messed-up individuals. : ) I can understand people wanting to run stuff in native resolution. What I don’t get is why you’d apply loads of antialiasing to a high-res picture. I mean, if such a high res is appropriate for the game (one that isn’t Scorched), AA is only going to detract from it and blur away your sharpness.

I don’t deny that if money’s no object, and you have a monitor/system that can make use of it, 512MB(+) is cool. (Sorry if I’m repeating.)

Anyhow, what grated on me was you suggesting that Scorched needs 512MB, ’cause it’s not true. In fact, you’d be cool with 64MB from what I know.

    (Plus you were pissing on my lawn).

    Ps:
    I’m sorry for jumping on your test results. You put the effort in, and I should have been more tactful.

    Have a look at developer.nvidia.com/. There are some cool tools for analysing GPU usage and stuff.
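if you just want to watch vram numbers yourself, something like this would do it. a minimal python sketch, on the assumption you’ve an nvidia card with a driver recent enough to ship the nvidia-smi tool (the query flags are standard nvidia-smi options; it’s not one of the tools on that site):

```python
import subprocess
import time

# poll vram usage once a second via nvidia-smi (assumed to be on
# the PATH; it ships with reasonably recent nvidia drivers).
# assumes a single gpu - multi-gpu setups print one line per card.
while True:
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used,memory.total",
        "--format=csv,noheader,nounits",
    ]).decode()
    used, total = (int(v) for v in out.strip().split(", "))
    print(f"vram: {used} / {total} MiB")
    time.sleep(1)
```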

    #50135

    PeanutsRevenge
    Participant

I don’t believe I said ‘needs’, but if I did, then I was wrong.
I meant, and am sure I put, ‘uses’. To me, if something wants it, let it have it. Especially given the subject matter (Viking’s rather cool-looking system), knowing what your applications want is a damn good start to designing the ideal system.

It didn’t surprise me at all that you stomped on the results I found and posted; you always stomp on something you disagree with!

Here’s something else you can stomp on…

http://techreport.com/discussions.x/14147 (This is not the page I read, which I cannot find atm, but it’s the same press release.)

Physics processing to be done by the 8 series, SWEET… won’t be accelerated, of course, but hey, it should be an improvement.

Now you can say that it makes no difference and that graphics cards can do the job anyway, blah blah blah…

    EDIT:

Was playing during my last post, so I forgot to say thanks, and good call with the link. I’m always wanting more ways to monitor and check what is going on with my system, and actual GPU usage is one I am currently missing and really want. This is because my system is WAY out of balance: the CPU and memory are far slower and (for want of a better word) less advanced than my GPU and motherboard, so I want to know HOW out of balance everything is, to assist in my next performance upgrade. (My next upgrade is cosmetic and acoustic.)

Thanks for the apology as well. I guess we just grate on each other; you always find people like that.

    #50136

    Deathstryker
    Participant

    Just to throw something out there. Not really replying to anyone’s comment or anything…

Here’s a link to a review with benchmarks of the stock 8800GT: 1GB vs 256MB vs 512MB.

    http://www.firingsquad.com/hardware/palit_geforce_8800_gt_super+_1gb_review/page4.asp

    I have provided a link straight to the first game benchmark. You can browse through the other pages.

As you can see, the 1GB card provided no gain (or even slightly less performance), but the 512MB card was a bit better than the 256MB card.

This of course does not prove that 512MB is a requirement, but it does show that it helps considerably.

Now, I’m not able to speak from experience on this matter, since I have not personally been able to test the same type of card (just with different amounts of RAM), but I believe the article.

    #50137

    Laptops Daddy
    Participant

    i’m reluctant to post on this again. : ) i’ve already used up my 1000 words and am paying extra per letter.

    but that’s interesting.

something that may affect the conclusions you draw from those results: anything above 40fps or so is superfluous to most people’s needs.

what i’m saying is, if the game were restricting the gpu to a moderate (but still taxing) frame rate, the results would likely be different (e.g. the video ram wouldn’t need to supply 100fps worth of textures, so you wouldn’t need so much). plus antialiasing is silly at 1600×1200 or higher.
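to be clear about what i mean by restricting the frame rate: a cap is just a loop like this. a minimal python sketch of the idea; render_frame is a hypothetical stand-in for the game’s real work:

```python
import time

TARGET_FPS = 40  # the 'superfluous beyond 40fps' figure from above
FRAME_BUDGET = 1.0 / TARGET_FPS

def render_frame():
    # hypothetical stand-in for the game's actual rendering work
    pass

while True:
    start = time.perf_counter()
    render_frame()
    # sleep off whatever is left of the frame budget, so the gpu is
    # never asked for more than TARGET_FPS frames per second (and no
    # more texture bandwidth than that implies)
    spare = FRAME_BUDGET - (time.perf_counter() - start)
    if spare > 0:
        time.sleep(spare)
```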

    i don’t know. maybe i’m wrong.

    i ran a few tests before on a demo of crysis, and i couldn’t get video memory usage to go above 90mb or so. it’s possible i was misinterpreting the results from ‘meminfo’ (in game).

i’d be interested to see a screenshot of the crysis cpu benchmark for comparison, with meminfo enabled (open the console, type: con_restricted 0, then: meminfo 1). peanut?

    #50138

    PeanutsRevenge
    Participant

Firstly, that’s a CPU benchmark; it’s been designed to test the CPU, not the GPU.

I’ll run the test over the weekend; I need to sleep right now and am just checking in quickly, but when I ran the GPU benchmark, VRAM was full and overflowing.

As for the benchmarks, you KEEP missing the point, cat: if someone has a high-end gfx card on a 17″ screen at 1024×768, then they are just stupid; it IS a complete waste.

Look at the high resolutions; that’s where ‘high end’ is. People with £200-300 GPUs should have at least a £1000 system, but more like £1500-£2000 ($400-600, $2000 and $3000-$4000 respectively), which would mean at least a 20″ screen, and at those resolutions most of those benchmarks showed at least a 20% increase in performance between the 256MB and 512MB cards.

As for AA filtering, you say yourself you’re not a gamer, and you clearly don’t even use your SLI configuration properly for Scorched. Have you not seen the difference between no AA, 4x and 16x in the game? No AA looks pretty bad, although mostly for the targeting cone.

I have believed some of the things you’ve said ever since you suggested applying AA filtering to Scorched to get a better frame rate, which worked, and works on other games too.

BTW, if you want a REALLY testing benchmark, try Supreme Commander: Forged Alliance with two screens, one of them split; that would probably stress even the highest-spec machine atm. It also shows the problem with low gfx memory: if the game is spilling into system memory, there is less left for tracking the thousands of units milling around, which also need to be rendered by the gfx card, etc…

Oh, and BTW, benchmarks showing 60fps in current games will end up at 30 in the not-too-distant future. Then again, I have seen you say you would rather buy mid-range cards often than high-end cards now and then.

    Anyway, bed.

P.S. Lappy, you do realise that, if memory serves, this discussion is between yourself and two 8800GT 512MB users…

    #50139

    Laptops Daddy
    Participant

whichever. it doesn’t seem to be cpu limited either way for me, and the memory usage shown is similar.

i’m not too interested in benchmarks. i’m just interested to see whether extra video ram makes a difference when the gpu is taxed at a realistic frame rate.

as far as your argument for preserving system ram by having more video ram goes, i don’t think it works like that: the more video ram in use, the more main ram is used supplying the video ram with data.

    ps:
    no, i would never suggest a mid-range card. in terms of e.g. 8600 vs 8800 or whatever, the high-end architecture is always the way to go. (a £70 x1950 is high-end).

    #50140

    PeanutsRevenge
    Participant

    I’ll run those tests later tonight, unless I manage to get the cash to go out.

In the meantime, I thought I’d mention something that’s been bugging me over the past few years, which I still can’t come to a definite conclusion on, and which was brought back to mind by that 1GB card review.

It used to be that you had to upgrade hardware to do something: the software was being written first, and hardware was being developed to run the software, often lagging behind.
Yet over the last few years it’s been the other way around. Hardware has been released, and the software companies have then been taking advantage of it.
At the moment the 1GB card is very nearly pointless, unless you are very rich or spend every waking moment on your machine (with exceptions, of course); yet give it a couple of months or so, and I’m sure the additional buffer will start to pay dividends far more often.

Which way do people prefer the industry to go?
Should people buy software in the hope they can run it at a satisfactory level, and ‘probably’ be able to go and buy the required hardware upgrades? Or should they buy a nice sparkly hardware upgrade (it was time to upgrade anyway) and then wait a couple of months for software to be released that really pushes that upgrade, so they can feel good about the purchase; in the meantime enjoying loading any software and having it run so quickly that they can lower the level their hardware runs at, saving power and the associated heat and noise?

Of course, in a perfect world everything would run perfectly on everything, and in an ideal world hardware and software would be released together. But we live in a world where a girl can be walking to school as a house blows up and kills her… what a world we live in!!!

P.S. A £70 X1950 is not high-end IMO; that would be a Pro. The XT, XTX, xtxx, xxtxxx, and xxxtxxttxxtxxxtxxtxxxx cards would be high-end (what IS it with all the X’s ATI like to add?)

    #50141

    Deathstryker
    Participant

The one thing about that 1GB card, though, is that by the time games actually utilize that much memory, the card itself will be outdated. At least, that is my opinion on it. I don’t think you’ll see a satisfactory performance gain from the 1GB card for at least a year.

My opinion on the upgrade thing is that hardware manufacturers don’t really think about how much a person has to upgrade their computer to be able to run something, or how much it will cost. Because of this, we are always having to spend money on (sometimes useless) features that later become required by software down the road, forcing everyone to upgrade, not just enthusiasts. Added to this problem, every company wants to make its own format, so we end up spending money on something that includes as many formats as possible.

    Sure, some of these features are nice, but does the “niceness” justify the cost?

    @PR wrote:

Of course, in a perfect world everything would run perfectly on everything, and in an ideal world hardware and software would be released together. But we live in a world where a girl can be walking to school as a house blows up and kills her… what a world we live in!!!

    This quote reminds me of my area. We get lots of exploding houses around here due to the rampant meth use.

    #50142

    PeanutsRevenge
    Participant

DX9 CPU benchmark; note that I have RivaTuner on the second screen showing VRAM usage, and the sidebar showing system memory usage (I have 2GB).

DX9 settings are set to high on everything (from memory).

DX9 GPU benchmark.

DX10, set to very high if I remember right; the RivaTuner vmem plugin does not work with Vista, so I’m not using the second screen.
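If anyone wants to capture both numbers in one log instead of eyeballing two screens, here’s a rough Python sketch. It assumes an NVIDIA card whose driver ships the nvidia-smi tool, plus the third-party psutil package, so treat it as a sketch rather than gospel:

```python
import csv
import subprocess
import time

import psutil  # third-party: pip install psutil

# Log system RAM and VRAM side by side once a second, so VRAM
# overflowing into main memory shows up as VRAM flatlining while
# system RAM keeps climbing. Assumes a single NVIDIA GPU.
with open("mem_log.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["time_s", "system_ram_mb", "vram_used_mb"])
    start = time.time()
    while True:
        vram = subprocess.check_output([
            "nvidia-smi", "--query-gpu=memory.used",
            "--format=csv,noheader,nounits",
        ]).decode().strip()
        ram_mb = psutil.virtual_memory().used // (1024 * 1024)
        w.writerow([round(time.time() - start, 1), ram_mb, int(vram)])
        f.flush()  # keep the log current even if the run is killed
        time.sleep(1)
```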

    #50143

    Deathstryker
    Participant

    It’s nice to know that there’s someone else out there who can’t run the game at 30 fps avg…

    My FPS is pretty close to yours in my benchmarks for Crysis.

    #50144

    Laptops Daddy
    Participant

    Very interesting. Thanks for that, Peanut.

    I’m not sure I’m any less confused by the results, but your numbers look similar to mine.

I did some tests myself. (Apologies for the size of the images, and you’re gonna hate me for the resolution I’m using ; )

[Screenshots: Crysis on a 128MB ATI (‘FVM’ shown top-right is ‘free video memory’) at low and medium settings, and on a 256MB Nvidia at low, medium and high.]

    Seems high detail in Crysis goes way beyond the limits of a lowly 6k 3dmark 06 score. I did test at 1600×1200 (0.5fps in high), for a max usage of 237MB.

Matter of interest, Peanut: what are your max settings for, say, 30fps? And are you CPU limited?

    @peanutsrevenge wrote:

In the meantime, I thought I’d mention something that’s been bugging me over the past few years etc

…a £70 X1950 is not high-end IMO

You know there’s a definite dividing line between targeted mid-range and high-end architecture, though, e.g. hd2600 vs hd2900 etc. I don’t think you should consider high-end DirectX 9 cards mid-range; it’s confusing enough already : )

I think ATI added the ‘HD’ to limit the chances of the old “XT is the budget model”, “no, XT is the top-end model” game recurring. Don’t know about the X on DX9 cards.

I’d have said hardware graphics power stopped being a limiting factor for game prettiness 4 or 5 years ago. It’s down to the artists and programmers.

    May or may not be worth considering, game developers stand to make a lot of money endorsing hardware. (e.g. “plays best on nvidia”).

pretty, high-res graphics are all very well, but a shit game is still shit; doesn’t matter how they dress it up.

    #50145

    PeanutsRevenge
    Participant

I am definitely CPU limited; the GPU/CPU balance in my system is shameful: high-end graphics (although the new 9600GT is quicker and cheaper :@ :@ :@ :@; damn the constant evolution of computer components).

DeathStryker, what CPU and main components are you using? As we have the same graphics card, it would be interesting to compare.

30fps is, I believe, medium settings at 1680×1050. As you can imagine, I don’t like going below that res, but the game still looks amazing at medium; but OMG, very high is just, so, O, M, G, incredible. It’s the little things; interaction with foliage and the like is just better.

BTW, very high settings can be achieved in DX9. Obviously the DX10 features are not present, but they are so small atm that it makes little difference, apparently. I say apparently as I haven’t really played the game, so I haven’t bothered to tweak the files to allow very high.

So, what do you get on your SLI’d 6-series system then, Cat?

A point I would like to raise is the COD4 vs Crysis issue.
Is Crysis REALLY that much better than COD4 graphically? I can run COD4 at the highest level with a stable 30fps, and it looks damn fine and, to me, seems to be a much better game; hence why I shall be buying COD4 and not Crysis.

I think this is a good point for the 256-512MB difference as well: those extra few £££/$$$ give a small amount, but if you really want that small amount, you have to pay for it. As a value-for-money discussion, well, it’s still all relative (someone with millions in the bank can think a £6000 PC is reasonable), but if you’re a hardcore gamer who plays bang-up-to-date, graphically intensive games, then those few fps to go from 20 to 25 might well be worth the extra £200 (£40 per extra frame, if you like).

