
A little 64 bit follow-up

I recently wrote a pair of at least slightly controversial articles about 64-bit vs. 32-bit applications and the performance costs associated with going to 64 bits.

They are here and here.

Some people thought I was nuts.

Some people provided benchmarks "proving" that 64 bit code is in fact faster than 32 bit code.

I guess the lesson for me here is that I can't say often enough that I only ever write approximately correct blog entries and YMMV should be stamped on top of everything.

But, just so you don't think I've lost my mind, here is some data.

These are recent builds of IE and Edge in 32-bit and 64-bit flavors.  It's the same code, or as close to the same code as possible.  This is just the main engine of each, plus the old script engine and the new script engine.

| DLL          | 32 bits (bytes) | 64 bits (bytes) | Growth |
|--------------|-----------------|-----------------|--------|
| edgehtml.dll | 18,652,160      | 22,823,936      | 22.4%  |
| mshtml.dll   | 19,322,880      | 24,577,536      | 27.2%  |
| chakra.dll   | 5,664,256       | 7,830,016       | 38.2%  |
| jscript9.dll | 3,667,968       | 4,895,232       | 33.5%  |

This result is completely typical -- growth of 1.2x to 1.4x is very normal for mainstream code.  It's not universal, but it's in line with what I've seen elsewhere.
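If you want to check the growth column, it's just the ratio of the two image sizes.  Here's a minimal sketch of the arithmetic (mine, not from the original post; the paths are hypothetical):

```cpp
#include <cstdio>
#include <filesystem>

int main() {
    namespace fs = std::filesystem;
    // Hypothetical locations of the two builds of the same DLL.
    const fs::path x86 = "x86/edgehtml.dll";
    const fs::path x64 = "x64/edgehtml.dll";

    const auto s32 = fs::file_size(x86);  // e.g. 18,652,160 bytes
    const auto s64 = fs::file_size(x64);  // e.g. 22,823,936 bytes

    // Growth = (64-bit size / 32-bit size - 1) * 100; ~22.4% for the sizes above.
    std::printf("growth: %.1f%%\n",
                (double)s64 / (double)s32 * 100.0 - 100.0);
}
```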

Yes, there are exceptions.  Yes, it totally depends on the codebase.

Now is this going to be slower?  Almost certainly, because bigger is slower.

Why almost?

Well, if the code has really good locality (e.g., you only ever use a tiny slice of it, or you move from one well-used slice to another well-used slice), then the parts of the code that are hot at any given moment might still fit well into the cache.  But that's sort of a lie as well.  You see, in a real system there is pollution: device drivers are running, background processes are running, and so it's never really the case that there is surplus cache capacity.  In real systems, extra cache capacity basically just creates the opportunity for your code to remain efficient in the presence of other normal workloads.  Micro-benchmarks don't show these effects.
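You can see the cliff for yourself with a dependent pointer chase over working sets of different sizes.  This is a minimal sketch of mine, not from the post; on most machines the cost per access jumps each time the working set falls out of a cache level, which a small micro-benchmark never observes:

```cpp
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

// Build a single-cycle permutation (Sattolo's algorithm) so the chase
// visits every entry before repeating.
static std::vector<size_t> make_cycle(size_t n) {
    std::vector<size_t> next(n);
    std::iota(next.begin(), next.end(), size_t{0});
    std::mt19937_64 rng{42};
    for (size_t i = n - 1; i > 0; --i) {
        std::uniform_int_distribution<size_t> pick(0, i - 1);
        std::swap(next[i], next[pick(rng)]);
    }
    return next;
}

static double ns_per_access(size_t n) {
    auto next = make_cycle(n);
    size_t i = 0;
    const size_t steps = 20'000'000;
    auto t0 = std::chrono::steady_clock::now();
    for (size_t s = 0; s < steps; ++s)
        i = next[i];                 // each load depends on the previous one
    auto t1 = std::chrono::steady_clock::now();
    volatile size_t sink = i;        // keep the loop from being optimized away
    (void)sink;
    return std::chrono::duration<double, std::nano>(t1 - t0).count() / steps;
}

int main() {
    // Working sets from ~32 KB (cache resident) to ~256 MB (DRAM bound).
    for (size_t n = 4 * 1024; n <= 32 * 1024 * 1024; n *= 8)
        std::printf("%10zu entries: %6.2f ns/access\n", n, ns_per_access(n));
}
```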

I also said that the data is always bigger.  People disagree, but there really is no room for disagreement here.  I think we can all agree that going to 64 bits won't make your data smaller, so the best you can hope for is a tie.  I also think you are going to have at least one pointer on the stack somewhere, so a tie isn't possible.  Pedantic, yes, but on point.  The truth is that, depending on your workload, you will see varying amounts of growth and no shrinkage.  If your growth is sufficiently small, or the locality of the data is sufficiently good, then the extra cache misses from size growth will not be so bad.  Micro-benchmarks will often still fit in the cache on a 64-bit system.
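To make the pedantry concrete, here's a minimal sketch of mine: compile the same declarations for both targets and compare.  Pointers double; plain data, at best, ties:

```cpp
#include <cstdio>

// The same struct on both targets:    32-bit     64-bit
struct Node {
    Node* left;                    //  4 bytes    8 bytes
    Node* right;                   //  4 bytes    8 bytes
    int   value;                   //  4 bytes    4 bytes (+4 padding)
};

int main() {
    std::printf("sizeof(void*) = %zu\n", sizeof(void*)); // 4 vs 8
    std::printf("sizeof(Node)  = %zu\n", sizeof(Node));  // 12 vs 24 -- 2x growth
    std::printf("sizeof(int)   = %zu\n", sizeof(int));   // 4 on both -- a tie at best
}
```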

Visual Studio is in a poor position vis-à-vis data growth because it has a lot of pointer-rich data structures.  Microsoft Edge, Chrome, and Firefox are in relatively better positions because much of their data is not rich in pointers -- bitmaps, for instance, are essentially identical (minus some metadata), strings are identical, and styles can be stored densely.  As a consequence, browsers will suffer less than VS would.
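A hedged illustration of that difference (the shapes below are hypothetical, not actual code from either product):

```cpp
#include <cstdint>
#include <cstdio>

// Pointer-rich, VS-style: four pointers per node, so the 64-bit build
// doubles it (20 bytes -> 40 bytes).
struct SymbolNode {
    SymbolNode* parent;
    SymbolNode* firstChild;
    SymbolNode* nextSibling;
    const char* name;
    uint32_t    flags;
};

// Dense, browser-style: one pointer in the header; the pixel payload
// itself is byte-for-byte identical on both targets.
struct Bitmap {
    uint32_t width, height, stride;
    uint8_t* pixels;
};

int main() {
    std::printf("SymbolNode: %zu bytes\n", sizeof(SymbolNode)); // 20 vs 40
    std::printf("Bitmap header: %zu bytes\n", sizeof(Bitmap));  // 16 vs 24
}
```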

The bottom line is that this will vary a great deal based on your workload.  But the price of cache misses should not be underestimated: a modern processor might retire over 200 instructions in the time it takes to service one cache miss, and more in extreme situations.  That's a lot to recover from; extra registers won't do it universally.
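The 200-instruction figure is a back-of-envelope calculation.  Here is one version of it with my own assumed numbers -- the clock speed, memory latency, and sustained IPC are all assumptions and vary widely by machine:

```cpp
#include <cstdio>

int main() {
    const double cycles_per_ns = 3.0;  // assumed 3 GHz clock
    const double miss_ns       = 70.0; // assumed main-memory latency
    const double ipc           = 1.0;  // assumed sustained instructions per cycle
    // Instructions the core could have retired while one miss was outstanding:
    std::printf("~%.0f instructions per miss\n",
                miss_ns * cycles_per_ns * ipc);   // ~210 with these inputs
}
```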

I wrote about these phenomena in greater detail in this article from 2014.  That article includes a quantitative examination of data growth and locality effects.

I had some top-level observations for anyone contemplating a port:

* if you think you're going to get free speed, you have it backwards -- you start in the hole, and you have to win back your losses with processor features; definitely possible, but not a given

* if you're going to 64 bits because you're out of memory, but you could fix your memory problems with "let's not do this stupid thing anymore", then you should do that

* if you're avoiding 64 bits because you keep saying "I can stay in 32 bits if I do this new stupid thing", then you're kidding yourself; crazy workarounds are doomed

Visual Studio in 2009 was definitely in the situation where, by avoiding some dumb stuff, we could make it fit.  I don't know where it stands in 2016.

This also excludes the collateral benefits you get from going to 64 bits, such as avoiding the WOW64 subsystem, cleaning up an otherwise potentially clumsy architecture, and gaining security from increased exploit difficulty.  Some of these benefits may be very important to your workload and customers.
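On the first of those: a 32-bit process on 64-bit Windows runs through the WOW64 layer, while a native 64-bit build doesn't.  A quick sketch to check which one you are (IsWow64Process is the real kernel32 API; error handling is omitted):

```cpp
#include <windows.h>
#include <cstdio>

int main() {
    BOOL wow64 = FALSE;
    if (IsWow64Process(GetCurrentProcess(), &wow64)) {
        std::printf(wow64 ? "32-bit process on 64-bit Windows (under WOW64)\n"
                          : "native process (no WOW64 layer)\n");
    }
    return 0;
}
```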
