I'll be honest: from what I've heard about many history classes, most of them merely mention WWI, though several do go in depth. Even then, I've yet to have one that covered the Pacific portion of the war (even though it had relatively little impact on the outcome), or that tried to explain what Japan gained from the war (or rather, what it didn't gain), assuming the class mentioned Japan at all.
I just feel it doesn't get enough attention. Thoughts?