ECC is only explicitly called out on the 3520; I wonder if it's in the others?
Does anyone know the power consumption hit of ECC in mobile devices? On my Dell Precision 5510, the DDR self-refresh power hit of 32GB compared to 8GB was about 1.5W.
Interestingly enough, the new product page for the Precision 5520[0] shows the same Xeon E3-1505M as in my 5510. I guess I'd be shocked to see ECC added, as it wasn't present in the 5510.
The E3-1505M does support ECC, so it's not particularly hard to believe that it's a supported feature now.
I believe the ECC overhead is around 12.5% (9 bits instead of 8) for the memory system. However, I don't know what percent of power the memory system usually takes.
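Back-of-envelope sketch in Python (the 3W memory figure is purely an assumption, just to show the shape of the math):

    # A standard ECC DIMM is 72 bits wide vs. 64 bits non-ECC:
    # 8 check bits per 64 data bits, i.e. 1 extra bit per byte.
    data_bits, check_bits = 64, 8
    overhead = check_bits / data_bits            # 0.125 -> 12.5%

    # If memory draws, say, 3W out of a 30W system (assumed numbers),
    # the extra ECC chips add roughly:
    mem_power_w = 3.0
    extra_w = mem_power_w * overhead             # ~0.38W
    print(f"overhead {overhead:.1%}, extra draw ~{extra_w:.2f}W")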
Seems like about half the power is required for a 3200x1800 display + touchscreen. The XPS 13 goes from 22 hours with the 1080p non-touch panel down to 12 hours with the 3200x1800 touch panel. Of course 1080p screens still draw power, just less. Of the rest, the CPU takes a fair amount. As a guess, I'd say a 15% increase in memory power might decrease the runtime by 20 minutes.
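Rough sanity check of that guess (battery size, baseline runtime, and memory draw are all assumed numbers here, not measurements):

    # All inputs are assumptions, just to sanity-check the guess.
    battery_wh = 60.0                  # assumed battery capacity
    runtime_h = 10.0                   # assumed baseline runtime
    avg_draw_w = battery_wh / runtime_h          # 6W average draw

    mem_w = 1.5                        # assumed memory share of that
    ecc_extra_w = mem_w * 0.15         # ~15% more memory power
    new_runtime_h = battery_wh / (avg_draw_w + ecc_extra_w)
    lost_min = (runtime_h - new_runtime_h) * 60
    print(f"~{lost_min:.0f} minutes shorter")    # ~22 minutes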
ECC is overrated. It's almost never necessary (outside of servers) and other types of errors are orders of magnitude more likely to occur on a developer laptop.
I just built an ECC desktop; the math I saw implied I'd see 1 or 2 random crashes a year without it. Beyond true random bitflips, it seems MUCH better at identifying bad DIMMs or bad DIMM slots. I'd much rather have an ECC error than start randomly replacing CPUs, DIMMs, and motherboards.
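For anyone curious where a number like that comes from, a sketch (the per-GB rate is an illustrative assumption; published soft-error rates vary by orders of magnitude between studies):

    # The per-GB soft-error rate below is an illustrative assumption;
    # published figures differ wildly depending on the study.
    errors_per_gb_year = 0.05
    ram_gb = 32
    expected = errors_per_gb_year * ram_gb       # 1.6 flips/year
    print(f"~{expected:.1f} uncorrectable flips/year without ECC")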
Seemed pretty cheap to me. How much are a few fewer crashes a year worth to you?
Keep in mind the error rate is per GB, so a 32GB machine crashes 2-4x as often as the more common consumer sizes of 8-16GB.
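Under a constant per-GB rate (the rate itself is a made-up illustrative value), the scaling is just linear:

    # Expected errors scale linearly with capacity; the rate itself
    # is an illustrative assumption, not a measured figure.
    rate = 0.05                        # errors per GB per year
    for gb in (8, 16, 32):
        print(f"{gb:2d}GB: ~{rate * gb:.1f} errors/year")
    # 32GB sees 4x the errors of 8GB and 2x those of 16GB,
    # hence "2-4x".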