I read a great article by Brent Ozar and came up with some questions about memory grants. I was unable to ask them in the comment section of his article, so I thought I'd ask for help here.
- Question: How much data is spilled to disk? 400 MB, or ~60 MB (7,643 pages × 8 KB)?
In the article he states:

> And no matter how many times I update statistics, I'll still get a ~400MB spill to disk.
I am a bit confused here.
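For context, here is the arithmetic behind my ~60 MB reading. I am assuming the 7,643 figure is a page count reported by the execution plan, and that SQL Server pages are 8 KB each; a quick sketch:

```python
# Sanity check: convert a spill reported in 8 KB pages to MB.
# Assumption: 7,643 is a count of pages written to tempdb.
spilled_pages = 7643
spilled_kb = spilled_pages * 8    # SQL Server pages are 8 KB each
spilled_mb = spilled_kb / 1024
print(f"{spilled_kb} KB ≈ {spilled_mb:.1f} MB")  # 61144 KB ≈ 59.7 MB
```

That is roughly 60 MB, which is why the ~400 MB figure in the article surprised me.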
- Question: If everything is okay with the estimates, the stats are up to date, the box has sufficient memory, and no other queries were running at the time, then why does a spill to disk happen?
He also writes:

> Look at the estimated number of rows versus the actual number of rows. They're identical. The stats are fine. I'm not using a small server, either: my virtual machine has 32GB RAM, and I've allocated 28GB of that to SQL Server. There are no other queries running at the same time – it's just one lonely query, spilling to disk…