I don't agree. A MacBook Retina shows everything perfectly. Nothing should be small; the OS should handle scaling by taking the screen size into consideration, which seems not to be the case in some Windows versions.
You are not quite correct to blame Windows here and in other topics (like this one:
viewtopic.php?t=85365).
Windows' so-called 100% scaling means 96 DPI. That "standard" (in reality a default, not a standard) DPI has done a lot of harm, because many programmers got used to treating it as the one and only DPI. 150% scaling in Windows doesn't mean something bad or non-standard; it just means the monitor has not the default 96 DPI but 144 DPI (or close to that, mine is 142 DPI).
Windows has been able to tell an application the current DPI since the old days; that's nothing new, the application just has to ask for it. OS X's Retina mode introduces logical pixels instead of physical ones precisely for those badly written applications. That's one way to do it, but the downside of that approach is that a pixel is not really a pixel anymore. And the current Windows 8/10 scaling is just a workaround for applications that don't behave well with DPI; it makes them blurry and merely readable, not well readable, anyway.
So it's not a Windows problem, it's an application problem: the application is not DPI-aware (in Windows terms:
https://msdn.microsoft.com/en-us/librar ... s.85).aspx).
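For comparison, here is roughly how an application opts in on Vista and later (a sketch only; the manifest entry is the preferred route, SetProcessDPIAware() is the quick programmatic equivalent):
[code]
#include <windows.h>

int WINAPI WinMain(HINSTANCE hInst, HINSTANCE hPrev, LPSTR cmd, int nShow)
{
    /* Tell Windows not to bitmap-stretch us; we will size our own UI
       from the real DPI instead. */
    SetProcessDPIAware();
    /* ... create windows, scaling all metrics by the actual DPI ... */
    return 0;
}
[/code]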
If you are using WinAPI, you can rely on the good old GetDeviceCaps(hDC, LOGPIXELSX) and GetDeviceCaps(hDC, LOGPIXELSY):
https://msdn.microsoft.com/en-us/librar ... s.85).aspx. And yes, even though modern displays have square pixels, that hasn't always been true; pixels could be rectangular (hence the two separate values), but that's hardly relevant nowadays.
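A minimal sketch of asking for the real DPI that way and scaling a metric with it (MulDiv keeps the integer math exact):
[code]
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HDC hdc  = GetDC(NULL);                      /* screen DC */
    int dpiX = GetDeviceCaps(hdc, LOGPIXELSX);   /* 96 at 100%, 144 at 150% */
    int dpiY = GetDeviceCaps(hdc, LOGPIXELSY);   /* equal to dpiX on square pixels */
    ReleaseDC(NULL, hdc);

    /* Scale a 16 px (at 96 DPI) element to the monitor's real DPI. */
    int scaled = MulDiv(16, dpiX, 96);
    printf("DPI: %dx%d, 16 px becomes %d px\n", dpiX, dpiY, scaled);
    return 0;
}
[/code]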