Seriously, man? Are you saying that because you actually work with this stuff, or is it just the all-knowing general culture telling you what to write?
Tonight I am very high and therefore nice, so I will actually tell you why what you said makes no bloody sense at all.
Drives often contain compressed data. The CPU gets handed a file system, which queries the actual drive -> the data gets fetched in chunks, cached somewhere in the slow cache levels, and decompressed. The result is sent to RAM.
The application's job is to determine which data from the game needs to get sent to RAM.
You don't want to load act 2 into memory while you're playing act 1, right?
Having an SSD obviously speeds up the first step, meaning it supplies the CPU with more data to decompress, and since CPUs can handle an awful lot, we can exclude the CPU as a bottleneck. The result must then be written to memory, so memory write speed and latencies play an important part in the speed of this process.
However!! Eventually you're going to write all your necessary Rakki's Crossing data into memory (THAT IS ASSUMING YOU HAVE ENOUGH MEMORY! You do, don't you?!), and then you can pretty much turn off your bloody hard drive (joking, do not try) because you have everything cached.
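To make the whole drive -> CPU -> RAM pipeline concrete, here's a tiny standalone C++ sketch of that load path. To be clear, this is NOT how the D3 client actually packs its data - I'm assuming a made-up pack file where every 64 KB chunk is an independent zlib stream, and the file name and the 4x inflation ratio are invented for illustration.

// Sketch of the load path described above: pull compressed chunks off the
// drive, decompress on the CPU, keep the result resident in RAM.
// ASSUMPTION: each 64 KB chunk is an independent zlib stream; the pack
// file name and the 4x ratio are hypothetical, not D3's real format.
#include <cstdio>
#include <vector>
#include <zlib.h>

static std::vector<unsigned char> decompressChunk(
        const std::vector<unsigned char>& compressed, size_t maxRawSize) {
    std::vector<unsigned char> out(maxRawSize);
    uLongf outLen = static_cast<uLongf>(maxRawSize);
    // zlib does the CPU-side decompression step from the post
    if (uncompress(out.data(), &outLen, compressed.data(),
                   static_cast<uLong>(compressed.size())) != Z_OK)
        out.clear();
    out.resize(outLen);
    return out;
}

int main() {
    FILE* f = std::fopen("act1_assets.pak", "rb");   // hypothetical pack file
    if (!f) return 1;
    std::vector<unsigned char> ramCache;             // the "everything cached" end state
    std::vector<unsigned char> chunk(64 * 1024);     // drive is read in chunks
    size_t n;
    while ((n = std::fread(chunk.data(), 1, chunk.size(), f)) > 0) {
        chunk.resize(n);
        // assume each chunk inflates to at most 4x its size (made-up ratio)
        std::vector<unsigned char> raw = decompressChunk(chunk, n * 4);
        ramCache.insert(ramCache.end(), raw.begin(), raw.end());
        chunk.resize(64 * 1024);
    }
    std::fclose(f);
    std::printf("cached %zu bytes in RAM\n", ramCache.size());
}

Note where the drive sits in that loop: it only matters until the while loop finishes. After that, everything is served from ramCache.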
To summarize - do NOT attribute low FRAMERATE to hard drive speed. Look out for a different kind of stutter, known as memory lag, which is what you experience in any demanding game when you just start a level (unless you have a space PC :D )
I actually thought a bit about what could potentially be the problem; of course I made some assumptions about how the render pipeline operates.
Now I just want to say that I have NOT experienced bad framerate there, or anywhere really, with my overclocked 7970. And I'm running a six-year-old 7200 RPM HDD.
Meaning I can't know whether there really is a potential 'sink', or whether D3's graphics developers have screwed up some particle emitter and some D3 level designers have decided to use that emitter in a billion places in the level.
I believe the problem is of a rendering nature, since otherwise we would all have it.
I believe some shader on the client side is not getting compiled with the right directives. That is, after the processed shader is uploaded to the GPU, it does not meet a certain capability requirement and fails to execute a particular render pass. Now, this should immediately fail with an 'unsupported shader' exception, but I believe the shader falls back to some other, highly unoptimized implementation.
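Here's roughly what I mean by a silent fallback, as a standalone C++ sketch using the real D3DCompile API from d3dcompiler.h. The HLSL source and the profile chain are made up by me - I obviously have no idea what Blizzard's actual shaders or fallback logic look like.

// Sketch of the fallback idea: try the "good" shader profile first and, if
// compilation fails, silently drop to a cheaper one. The HLSL source and
// the profile list are hypothetical, NOT Blizzard's code.
#include <d3dcompiler.h>
#include <cstdio>
#include <cstring>
#pragma comment(lib, "d3dcompiler.lib")

static const char* kPixelShader =
    "float4 main(float2 uv : TEXCOORD0) : SV_Target {"
    "    return float4(uv, 0.0, 1.0);"   // stand-in for a real effect
    "}";

int main() {
    const char* profiles[] = { "ps_5_0", "ps_3_0", "ps_2_0" }; // fallback chain
    for (const char* profile : profiles) {
        ID3DBlob *code = nullptr, *errors = nullptr;
        HRESULT hr = D3DCompile(kPixelShader, std::strlen(kPixelShader),
                                nullptr, nullptr, nullptr, "main", profile,
                                0, 0, &code, &errors);
        if (SUCCEEDED(hr)) {
            // this is the silent part I'm complaining about: the player
            // never learns which profile actually won
            std::printf("compiled with %s\n", profile);
            code->Release();
            return 0;
        }
        if (errors) errors->Release();
    }
    std::printf("unsupported shader\n");  // the error we SHOULD be seeing
    return 1;
}

If something like that loop exists and the slow path wins on certain hardware/driver combos, you'd get exactly this: no error, just a horribly expensive render pass.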
This problem could still be coupled with a leak; it's possible that it's connected to the level generation and (more importantly) instantiation processes, since before I get sent the metadata to generate the map from, all I have is assets in memory. It doesn't make sense that those are wrong.
I'd go into my GPU's driver control panel and first and foremost play around with antialiasing, since it is a shader-based post-processing effect. I'd actually play around with all the settings, tbh. I'd also try changing drivers, perhaps to something older: check what GPUs were out, say, six months to a year before the release date (probably when act 3 was made), see which drivers for your GPU were current back then, and try those. There's a good chance the testing department tested with some of them. If your GPU is new and you don't have support for it in older drivers... I don't know :) The rendering API (DirectX) is a step in the shader compilation process before the bytecode is uploaded to the GPU, so theoretically it can make a difference, though I really doubt it.
BLIZZARD, WE NEED A RENDER DEBUG STATS SCREEN!! :D I know they have one built in, but it's surely been disabled for the release branch ;(
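For the curious, here's the kind of thing I mean - a minimal C++ sketch of a frame-time stats counter, which is exactly what you'd want for telling "memory lag" spikes apart from a plain low framerate. The frame loop is simulated with sleeps; a real overlay would wrap the game's present call instead (and I'm only guessing that's how theirs works).

// Rough sketch of the numbers a render debug screen could show:
// average frame time plus the worst spike. Standalone C++11; the
// frame loop below is simulated, not hooked into D3.
#include <chrono>
#include <cstdio>
#include <thread>

struct FrameStats {
    double totalMs = 0.0, worstMs = 0.0;
    int frames = 0;
    void record(double ms) {
        totalMs += ms;
        if (ms > worstMs) worstMs = ms;
        ++frames;
    }
    void print() const {
        std::printf("avg %.2f ms (%.1f fps), worst %.2f ms over %d frames\n",
                    totalMs / frames, 1000.0 * frames / totalMs,
                    worstMs, frames);
    }
};

int main() {
    using Clock = std::chrono::steady_clock;
    FrameStats stats;
    for (int i = 0; i < 60; ++i) {
        auto t0 = Clock::now();
        // stand-in for rendering one frame; every 10th frame "stutters"
        std::this_thread::sleep_for(
            std::chrono::milliseconds(i % 10 == 0 ? 30 : 16));
        std::chrono::duration<double, std::milli> dt = Clock::now() - t0;
        stats.record(dt.count());
    }
    stats.print();
}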
Sorry for the long post.