Crispy Pixels and the Golden Ratio 2014-10-13

Not too long ago I heard an offhand remark about monitor aspect ratios from someone who will not be named, though I totally just named them.

Anyway, this nameless person suggested that 16:9 monitors were best and that it was time for everyone to settle on it as the eternal standard. (I’m paraphrasing, because it’s been more than ‘not too long ago’ and I can’t remember the exact words.)

There’s a big problem with this: 16:9 turns out not to be a very good aspect ratio when you factor in the need to deal with lots of text (like, the internet) and legacy 4:3 media.


First, let’s go over some of the many aspect ratios you’re likely to find on various display devices and media, listed in order of increasing relative width.

  • 4:3 - Legacy TV shows, old movies, iPads, most e-readers, most video games prior to ~2005

  • 3:2 - 35mm still photography, the Game Boy Advance, iPhones 1-4, some modern tablets and laptops (Surface Pro 3, Chromebook Pixel, others)

  • 16:10 - Older widescreen computer monitors, many modern tablets

  • 16:9 - HDTV, newer widescreen computer monitors, recent video game consoles, many movies (not exact)

  • 21:9 - CinemaScope / modern anamorphic widescreen movies (none of these are exact)


Most printed media (books, letters, comic books, magazines, etc.) have aspect ratios between 5:4 and 16:10. I'm using a very rough average of ~√2:1 for visualization purposes; this aligns with the aspect ratio of ISO 216 A-series paper (e.g. A4). I'm also always going to refer to aspect ratios with their longest side first, even when talking about them in a vertical orientation.

Also of interest: the typical human field of view is about 180 degrees by 135 degrees, or 4:3.

There’s actually a lot of science and tradition behind how to arrange text so it can be read quickly and comfortably. We tend to read best when text is in well-margined, delineated blocks. (This blog’s text blocks are actually too wide, as I’m sure you’ve noticed by now. Also, the font is blurry with Chrome’s new renderer. Sigh.)

With all that background out of the way, here’s each of the ratios we're examining with semi-transparent overlays indicating an approximate average printed page’s ratio (~√2:1). The dark blue indicates where an average book page would sit if you fit-to-page. The semi-transparent parts below the blue indicate how far the page would extend if you fit-to-width instead: the larger that area, the more you’ll have to scroll to read width-aligned content.

Page-to-Screen Ratio 1


Here’s the same fit-to-page, only with the screen oriented vertically. This also approximately reflects the screen usage when viewing legacy 4:3 video when oriented horizontally.

Page-to-Screen Ratio 2


Here’s each ratio with 16:9 media content overlaid.

Page-to-Screen Ratio 3


It’s clear why most e-readers and reading-oriented tablets use 4:3 resolutions: it lines up very well with the average book ratio while in portrait mode. The trade-off is that it’s only able to use 75% of its screen space for 16:9 media.

3:2 is actually a decent compromise ratio. It’s able to put 84% of its pixels to work for 16:9 media, and ~92% for vertical text and legacy video. Horizontal-mode text should usually need only two page scrolls to read. It’s a shame Apple switched to 16:9 with the iPhone 5 and later, as I suspect people spend more hours reading text on their phones than watching videos.

16:10 wastes the least space in its worst case when used in the optimal orientation (~86% usable with vertical text/legacy video, and 90% with 16:9 media). Some pages will take three page scrolls to read, but the majority will take two.

16:9 is only able to utilize ~77% of the screen when displaying book-sized text vertically or legacy video. Many pages will take three scrolls to read where they'd take two on the other ratios.

21:9 is not well-suited for text at all, and is only able to use 76% of the screen when displaying 16:9 media.
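If you want to check these numbers yourself, the usable fraction is just the ratio of the narrower aspect to the wider one. Here's a quick sketch in Python (the function name is mine, not from any library):

```python
from fractions import Fraction

def usable_fraction(screen, content):
    """Fraction of the screen's area used when content is letterboxed or
    pillarboxed to fit entirely on screen. Both ratios are width/height."""
    return min(screen, content) / max(screen, content)

# 16:9 media on a 4:3 screen -> 75% of the pixels get used
print(usable_fraction(Fraction(4, 3), Fraction(16, 9)))    # 3/4
# 16:9 media on a 3:2 screen -> 84.375%
print(usable_fraction(Fraction(3, 2), Fraction(16, 9)))    # 27/32
# 16:9 media on a 16:10 screen -> 90%
print(usable_fraction(Fraction(16, 10), Fraction(16, 9)))  # 9/10
```

The blog's rounder figures (~86%, ~92%) come from the approximate √2:1 page ratio, so those won't come out as tidy fractions.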

However, the ratio is only part of the whole problem. Resolution plays a huge part as well.

LCD panels have what is called a ‘native’ resolution: essentially, it's the only resolution the panel can actually display, and supporting other resolutions is done through scaling.

Let’s say you have a computer monitor with a resolution of 2560x1440. You fire up a 720p video and fullscreen it. As the incoming content is not the panel’s native resolution, it has to be scaled up; in this case every incoming pixel becomes 4 on screen (2 in each direction), since 1280 × 2 by 720 × 2 is 2560x1440.

Scaling 720p to 1440p

The content scales well and looks as good as if it were displayed on a native 720p panel.

Now let’s try a more complicated example. You have the same 720p video as above, but this time your monitor has a resolution of 1920x1080. Now we have a problem: 1080 is 1.5x 720, so we can’t perfectly match the panel’s native pixel boundaries.
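You can tell the two cases apart with a tiny bit of arithmetic: when the fit-to-screen scale factor comes out as a whole number, every source pixel maps cleanly onto the panel's grid. A sketch (not tied to any particular API):

```python
def scale_factor(src, dst):
    """Uniform scale factor when fitting a src resolution onto a dst panel."""
    return min(dst[0] / src[0], dst[1] / src[1])

# 720p on a 2560x1440 panel: every pixel becomes a clean 2x2 block.
print(scale_factor((1280, 720), (2560, 1440)))  # 2.0
# 720p on a 1920x1080 panel: 1.5x, so pixels can't align to the grid.
print(scale_factor((1280, 720), (1920, 1080)))  # 1.5
```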

Scaling 720p to 1080p

This introduces numerous scaling artifacts, all of which compromise the resulting image.

It can also get far worse. Let’s say you have a 1080p TV with a previous-gen console attached, running a game that natively renders at 640p (Halo 3 does this). The console’s internal scaler upsamples this to 720p, which is then fed into the TV.

Scaling 640p to 720p, then to 1080p

The single-pixel checkerboard pattern is pretty much the worst-case scenario for scaling, so let’s look at a real-life example using the wonderful indie game Shovel Knight. The game is internally rendered at 400x240 before being scaled to whatever resolution you choose. 400x240 isn’t a 16:9 aspect ratio; it’s 5:3 (the same as the 3DS’s upper screen, which is likely why it was chosen). By default, the game only offers 16:9 resolutions unless you manually drag the window to size.

Here we have the game displayed in a true multiple of the game’s resolution, 1600x960, vs 1600x900, which is the nearest option offered by the game’s menu:

Shield Knight Comparison

The top image offers perfect crispy sharp pixels with no introduced scaling errors while the lower image is blurry.

You might ask why being a bit blurry is so bad. Other than being inaccurate to the pre-scaled image, your eyes will actively try to focus the blur away, constantly trying to bring blurry images and text into focus, leading to eyestrain.


Now, for gamers like me who often play retro games, there are some ‘magic’ screen resolutions that allow clean, integer-perfect scaling for a large amount of content while still giving you a nice high-resolution display.

There are a number of common resolutions that cover the majority of PC gaming history (and a reasonable chunk of console gaming's history as well):

  • 320x200 - Most DOS games, plus games from many other early computers
  • 320x240 - Some mid/late-era DOS games
  • 640x480 - Some late-era DOS games (Warcraft II, SimCity 2000), most early-to-mid Windows games (StarCraft), and several video game consoles
  • 800x600 - Commonly the top supported resolution for early 3D games, and the top supported resolution for Diablo 2
  • 1024x768 - The maximum resolution for a few late-era DOS games and some Windows games
  • 1600x1200 - The usual top-end supported resolution for games in the late 4:3 era

We need to talk a bit about the 320x200 mode first, as it's somewhat odd. 320x200 isn’t a 4:3 resolution; it’s 16:10, so once displayed on a standard 4:3 monitor of the era, the pixels are no longer square. However, many games made their art assets assuming square pixels anyway.

Here’s an example with the original Civilization game for DOS.

Civilization 1, intro sequence

In the intro, the planet is only properly spherical when viewed at 16:10.

In actual gameplay, the square tiles are 16x16 pixels, not the ~16x13 necessary to make them appear square when displayed at 4:3.

Civilization 1, gameplay

Here’s a counter-example from X-Wing, the radar is clearly designed to only be a circle when displayed at 4:3.

X-Wing radar

What this means is that each 320x200 pixel displayed at 4:3 is 1.2 times taller than it is wide. To end up with accurate pixel-aligned edges when scaling, you need to map each source pixel to 5 pixels horizontally and 6 pixels vertically; doing so yields a resolution of 1600x1200, the minimum resolution necessary to aspect-correct this content without scaling artifacts.

That is, 320 × 5 → 1600 and 200 × 6 → 1200.
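The 5-by-6 mapping falls straight out of the pixel aspect ratio, which you get by dividing the display's aspect by the resolution's aspect. A quick sanity check:

```python
from fractions import Fraction

# Pixel aspect (width/height) when 320x200 is stretched to fill 4:3:
par = Fraction(4, 3) / Fraction(320, 200)
print(par)  # 5/6, i.e. each pixel is 1.2x taller than it is wide

# The smallest integer block per source pixel is therefore 5 wide by 6 tall.
width = 320 * par.numerator     # 5 panel pixels per source pixel across
height = 200 * par.denominator  # 6 panel pixels per source pixel down
print(width, height)  # 1600 1200
```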

From about 2005 to 2012, acquiring a large 16:10 1920x1200 monitor for your desktop was simple; however, for various market reasons, 16:9 1920x1080 screens have come to dominate the retail space. If you're a retro gamer who wants accurate crispy pixels, the display market can seem pretty dire.

Here's a comparison of the above resolutions displayed on a 1920x1080 vs 1920x1200 panel while maintaining integer-scaling.

1200p vs 1080p retro scaling chart

A 1080p panel can't do 1600x1200 at all, you can't have 320x200 aspect-corrected, and none of the resolutions will fit to the screen's full height.

On a 1920x1200 panel all but 640x480 and 1024x768 can achieve full-height integer scaling, including pixel-perfect aspect-correction for 320x200 4:3 content.
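The chart's numbers come from a simple computation: the biggest integer multiple of each source resolution that still fits inside the panel. Sketched out (the function name is mine):

```python
def max_int_scale(src, panel):
    """Largest integer multiple of src that fits inside panel (0 if none)."""
    return min(panel[0] // src[0], panel[1] // src[1])

for src in [(320, 240), (640, 480), (800, 600), (1024, 768), (1600, 1200)]:
    s1080 = max_int_scale(src, (1920, 1080))
    s1200 = max_int_scale(src, (1920, 1200))
    note = "<- full height on 1200p" if src[1] * s1200 == 1200 else ""
    print(src, "1080p x" + str(s1080), "1200p x" + str(s1200), note)
```

Running it shows 1600x1200 getting a scale of 0 on the 1080p panel (it simply doesn't fit), and only 640x480 and 1024x768 falling short of full height on the 1200p panel.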


Now let’s take a moment to talk about 4k displays. Somewhere along the line, marketing rounded 3840 up to mean ‘4k’, so you end up with the following resolutions:

  • 4k 16:9 -> 3840x2160
  • 4k 16:10 -> 3840x2400

The best thing about 4k is that 720p and 1080p video scale perfectly into it (3x and 2x, respectively), meaning virtually all of your modern video content will look clean and crisp.

No one has announced one yet, but my dream monitor is a 16:10 4k panel. It would have the same full-height scaling characteristics as the 1920x1200 display, but would also allow 640x480 to hit the monitor's full height while integer scaling, and could fit 720p and 1080p content to the screen's full width while integer scaling.
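The same integer-fit arithmetic backs up the dream-monitor claims; here's a quick check against the hypothetical 3840x2400 panel (no such product exists, as noted above):

```python
def max_int_scale(src, panel):
    """Largest integer multiple of src that fits inside panel."""
    return min(panel[0] // src[0], panel[1] // src[1])

panel = (3840, 2400)  # the hypothetical 16:10 '4k' panel

# 640x480 hits the panel's full height: 480 * 5 == 2400.
print(max_int_scale((640, 480), panel))    # 5
# 720p and 1080p hit the panel's full width while integer scaling:
print(max_int_scale((1280, 720), panel))   # 3 -> 1280 * 3 == 3840
print(max_int_scale((1920, 1080), panel))  # 2 -> 1920 * 2 == 3840
```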


A bit of an after-discussion, but a lot of display panels have crappy scaling; even when they could scale cleanly by 2x or 3x, they still introduce blurring into the resulting picture. You can usually tell your video card to scale content instead of your display, though AMD's scaling introduces blurriness when it shouldn't; when I last checked, NVIDIA's did not.

It's also a problem with a lot of streaming video sites: they seem to pick a really small size for their player's window and don't offer any controls to set the window to a size that exactly matches the content, leading to decreased video quality for the user.


Also; sorry for the long delay in posting; I thought I'd be dead by now.

May your pixels be sharp as your skills,

Sarah