OK, this is becoming a very annoying issue. I run Fiddler 24/7 on my dev machine for web development. I have a custom rule set up to bypass the gateway for certain requests from a website where I download memory dumps for troubleshooting. Fiddler is also set to stream all requests, and this is a 64-bit machine running Fiddler 4. When I download a 4GB+ memory dump, it runs perfectly fine right up until it hits the 2GB mark. Fiddler's memory usage is reasonable until then, but as the file reaches 2GB, Fiddler suddenly consumes all the memory on the machine (24GB+) and grinds to an utter halt. No request is captured in the Fiddler window because it has been removed by the custom rule. What is going on here, and can it be fixed?
Fiddler Classic has a 2GB limitation, as discussed in the following threads:
Currently, Fiddler Everywhere has similar size limitations, but you should still be able to use the app (you just won't see the large response body).
I am marking this one as a feature request so that the team can research whether it is possible to support larger sizes.
In the typical case, Fiddler Classic and Fiddler Everywhere should drop the large response body, and you should see a warning message that the body was dropped (the session size itself won't be large).
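If the body is still being buffered in your setup, one workaround on the Fiddler Classic side is a FiddlerScript rule in CustomRules.js that disables response buffering and explicitly drops the response body for the download host. The sketch below assumes a hypothetical hostname (`dumps.example.com`) standing in for your actual site; `bBufferResponse` and the `log-drop-response-body` session flag are standard FiddlerScript features:

```javascript
// CustomRules.js (FiddlerScript / JScript.NET), inside the Handlers class
static function OnBeforeRequest(oSession: Session) {
    // Hypothetical host serving the large memory dumps.
    if (oSession.HostnameIs("dumps.example.com")) {
        // Stream the response to the client instead of
        // buffering the whole body in Fiddler's memory.
        oSession.bBufferResponse = false;
        // Discard the response body after streaming, so the
        // session never retains the multi-GB payload.
        oSession["log-drop-response-body"] = "yes";
    }
}
```

With both settings applied, Fiddler passes the bytes through to the browser and keeps only the session headers, which should keep its memory usage flat regardless of download size.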
[Screenshot: example of the dropped-body warning in Fiddler Classic]
Can you let us know the format of the memory dump files and whether there is anything specific about how the session is executed?