Crispy Pixels and the Golden Ratio 2014-10-13

Not too long ago I heard an offhand remark about monitor aspect ratios from someone who will not be named, though I totally just named them.

Anyway, this nameless person suggested that 16:9 monitors were best and that it was time for everyone to settle on it as the eternal standard. (I’m paraphrasing because it’s been more than ‘not too long ago’ and I can’t remember the exact words.)

There’s a big problem with this: 16:9 turns out not to be a very good aspect ratio when you factor in the need to deal with lots of text (like, the internet) and legacy 4:3 media.


First let’s go over some of the many aspect ratios you’re likely to find on various display devices and media; they are in order of increasing relative width.

  • 4:3 - Found on legacy TV shows, old movies, iPads, most e-readers, and most video games prior to ~2005

  • 3:2 - 35mm still photography, the Game Boy Advance, iPhones 1-4, and some modern tablets and laptops (Surface Pro 3, Chromebook Pixel, others)

  • 16:10 - Older widescreen computer monitors and many modern tablets.

  • 16:9 - HDTV, newer widescreen computer monitors, recent video game consoles, and many movies (not exact).

  • 21:9 - CinemaScope / modern anamorphic widescreen movies (none of these are exact)


Most printed media (books, letters, comic books, magazines, etc.) have aspect ratios between 5:4 and 16:10. I'm using a very rough average of ~√2:1 for visualization purposes, which aligns with the aspect ratio of ISO 216 A-series paper (e.g. A4). I'm also always going to refer to aspect ratios with their longest side first, even when I'm talking about them in a vertical orientation.

Also of interest: the typical human field of view is about 180 degrees by 135 degrees, or 4:3.

There’s actually a lot of science and tradition on how to arrange text so it can be read quickly and comfortably. We tend to read best when text is in well-margined, delineated blocks. (This blog’s text blocks are actually too wide, as I’m sure you’ve noticed by now. Also, the font is blurry with Chrome’s new renderer. Sigh.)

With all that background out of the way, here’s each of the ratios we're examining with semi-transparent overlays indicating the approximate average book/printed-page ratio (~√2:1). The dark blue indicates where an average book page would land if you fit-to-page. The semi-transparent parts below the blue indicate how far the page would extend if you fit-to-width instead; the farther it extends, the more you’ll have to scroll to read width-aligned content.

Page-to-Screen Ratio 1


Here’s the same fit-to-page comparison, only with the screen oriented vertically. This also approximately reflects the screen usage when viewing legacy 4:3 video with the screen oriented horizontally.

Page-to-Screen Ratio 2


Here’s each ratio with 16:9 media content overlaid.

Page-to-Screen Ratio 3


It’s clear why most e-readers and reading-oriented tablets use 4:3 screens: the ratio lines up very well with the average book page while in portrait mode. However, a 4:3 screen is only able to use 75% of its screen space for 16:9 media.

3:2 is actually a decent compromise ratio. It’s able to put 84% of its pixels to work for 16:9 media, and ~92% for vertical text and legacy video. Horizontal-mode text should usually only need two page scrolls to read. It’s a shame Apple switched to 16:9 for the iPhone 5 and later, as I suspect people spend more hours reading text on their phones than watching videos.

16:10 has the lowest worst-case amount of unused space when used in the optimal orientation for the content (~86% of the screen used for vertical text/legacy video, and 90% for 16:9 media). Some pages will take three page scrolls to read, but the majority will take two.

16:9 ends up being able to use only ~77% of the screen when displaying book-sized text vertically or legacy video. Many pages will take three scrolls to read when they'd take two on the other ratios.

21:9 is not well-suited for text at all, and is only able to use 76% of the screen when displaying 16:9 media.
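
If you want to check these percentages yourself, the math is just a comparison of two aspect ratios. Here's a minimal Python sketch of the idea (mine, not anything used to generate the overlays above); the numbers it prints match the 16:9-media figures quoted for each ratio:

    def used_fraction(screen_ar, content_ar):
        """Fraction of the screen's area covered when content is scaled
        to fit entirely on screen while keeping its own aspect ratio.
        Both arguments are width divided by height."""
        if content_ar >= screen_ar:
            # Content is relatively wider: it spans the full width,
            # leaving unused bars above and below (letterboxing).
            return screen_ar / content_ar
        # Content is relatively taller: it spans the full height,
        # leaving unused bars on the sides (pillarboxing).
        return content_ar / screen_ar

    # 16:9 media on each of the screen ratios discussed above:
    for name, ar in [("4:3", 4/3), ("3:2", 3/2), ("16:10", 16/10),
                     ("16:9", 16/9), ("21:9", 21/9)]:
        print(f"{name:>5} screen, 16:9 media: {used_fraction(ar, 16/9):.0%}")
    # -> 75%, 84%, 90%, 100%, 76%

Feeding it the ~√2:1 page ratio against a portrait-oriented screen ratio gives numbers in the same ballpark as the text/reading figures above (the exact values depend on what you pick as the average page ratio).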

However, the ratio is only part of the whole problem. Resolution plays a huge part as well.

LCD panels have what is called a ‘native’ resolution; essentially, this is the only resolution the panel can actually display information at, and other resolutions are supported through software scaling.

Let’s say you have a computer monitor with a resolution of 2560x1440. You fire up a 720p video and full-screen it on your monitor. As the incoming content is not at the panel’s native resolution, it has to be scaled up; in this case every incoming pixel becomes 4 on screen (2 in each direction), since 1280 * 2 by 720 * 2 is 2560x1440.

Scaling 720p to 1440p

The content scales well and looks as good as if it were displayed on a native 720p panel.

Now let’s try a more complicated example. You have the same 720p video as above, but this time you’ve got a computer monitor with a resolution of 1920x1080. Now we have a problem: 1080 is 1.5x 720, so we can’t perfectly match the panel’s native pixel boundaries.

Scaling 720p to 1080p

This introduces numerous scaling artifacts, all of which compromise the resulting image.

It can also get far worse. Let’s say you have a 1080p TV with a previous-gen console attached, running a game that natively renders at 640p (Halo 3 does this). The console’s internal scaler up-samples this to 720p, which is then fed into the TV and scaled a second time to 1080p.

Scaling 640p to 720p, then to 1080p
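
A quick way to see which of these cases scale cleanly is to check whether the target line count is an exact integer multiple of the source's. Here's a tiny sketch (my own illustration, not anything the hardware actually runs) covering the three cases above:

    from fractions import Fraction

    def vertical_scale(src_lines, dst_lines):
        """Scale factor needed to map src_lines onto dst_lines, and
        whether that factor is a clean integer."""
        f = Fraction(dst_lines, src_lines)
        return float(f), f.denominator == 1

    for src, dst in [(720, 1440),   # 720p video on a 2560x1440 panel
                     (720, 1080),   # 720p video on a 1920x1080 panel
                     (640, 720)]:   # a 640-line render upscaled to 720p
        factor, clean = vertical_scale(src, dst)
        print(f"{src} -> {dst} lines: x{factor}, "
              f"{'integer (crisp)' if clean else 'non-integer (blurry)'}")
    # -> x2.0 integer, x1.5 non-integer, x1.125 non-integer

The Halo 3 case is the nastiest of the three because the two non-integer steps (x1.125 into 720p, then x1.5 into 1080p) compound, smearing the image twice before it ever reaches your eyes.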

The single-pixel checkerboard pattern is pretty much the worst-case scenario for scaling, so let’s show a real-life example using the wonderful indie game Shovel Knight. The game is internally rendered at 400x240 before being scaled to whatever resolution you choose. 400x240 isn’t a 16:9 aspect ratio, it’s 5:3 (the same as the 3DS’s upper screen, which is likely why it was chosen), yet by default the game picks and only offers 16:9 resolutions unless you manually drag the window to size it.

Here we have the game displayed at a true multiple of its native resolution, 1600x960, vs. 1600x900, which is the nearest option offered by the game’s menu:

Shield Knight Comparison

The top image offers perfect crispy sharp pixels with no introduced scaling errors, while the lower image is blurry.
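
Finding a 'crispy' size for a pixel-art game like this is just a matter of taking the biggest integer multiple of the native resolution that still fits on your screen. Here's a rough sketch of that idea (my own, not anything Shovel Knight itself exposes):

    def best_integer_window(native_w, native_h, screen_w, screen_h):
        """Largest integer multiple of the native resolution that still
        fits on the screen; returns (scale, width, height)."""
        scale = min(screen_w // native_w, screen_h // native_h)
        return scale, native_w * scale, native_h * scale

    # Shovel Knight renders internally at 400x240.
    print(best_integer_window(400, 240, 1920, 1080))   # -> (4, 1600, 960)
    # 1600x960 is the crisp 4x window shown above; the menu's 1600x900
    # would instead force a non-integer 3.75x vertical scale (900 / 240).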

You might ask why being a bit blurry is so bad. Other than just being inaccurate to the pre-scaled images, your eyes will actively try to focus the blur away, constantly trying to bring blurry images and text into focus, which leads to eyestrain.


Now, for a gamer who often plays retro games, there are some ‘magic’ screen resolutions that allow you to reach clean integer-perfect scaling for a large amount of content while still having a nice high-resolution display.

There are a number of common resolutions that the majority of PC gaming used (and a reasonable chunk of console gaming's history as well):

  • 320x200 - Most DOS games, as well as games from many other early computers
  • 320x240 - Covers some mid/late-era DOS games
  • 640x480 - Covers some late-era DOS games (Warcraft 2, SimCity 2000), most early-to-mid Windows games (StarCraft), and several video game consoles
  • 800x600 - Commonly the top supported resolution for early 3D games, and the top supported resolution for Diablo 2
  • 1024x768 - The maximum resolution for a few late-era DOS games and some Windows games
  • 1600x1200 - The usual top-end supported resolution for games in the late 4:3 era

We need to talk a bit about the 320x200 mode first, as it’s somewhat odd. 320x200 isn’t a 4:3 resolution, it’s 16:10, so once displayed on a standard 4:3 monitor of the era the pixels are no longer square. However, many games made their art assets assuming square pixels anyway.

Here’s an example with the original Civilization game for DOS.

Civilization 1, intro sequence

In the intro, the planet is only properly spherical when viewed at 16:10.

In actual gameplay, square tiles are 16x16, not the ~16x13 necessary to make them square when displayed at 4:3.

Civilization 1, gameplay

Here’s a counter-example from X-Wing: the radar is clearly designed to only be a circle when displayed at 4:3.

X-Wing radar

What this means is that each 320x200 pixel, when displayed at 4:3, is 1.2 times taller than it is wide. To end up with accurate pixel-aligned edges when scaling, you need to map each source pixel to 5 pixels horizontally and 6 pixels vertically; doing so yields a resolution of 1600x1200, the minimum resolution necessary to aspect-correct this without scaling artifacts.

That is, 320 * 5 -> 1600; 200 * 6 -> 1200.
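
Put another way, the 6:5 pixel shape forces the smallest artifact-free scale factors to be 5 wide by 6 tall, and everything else falls out of that. Here's a tiny sketch of the calculation (mine, purely illustrative):

    from fractions import Fraction

    def min_aspect_correct(width, height, pixel_aspect):
        """Smallest resolution that integer-scales width x height while
        making each source pixel `pixel_aspect` times taller than wide."""
        ratio = Fraction(pixel_aspect).limit_denominator(100)   # 1.2 -> 6/5
        scale_x, scale_y = ratio.denominator, ratio.numerator
        return width * scale_x, height * scale_y

    print(min_aspect_correct(320, 200, 1.2))   # -> (1600, 1200)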

From about 2005-2012, acquiring a large 16:10 1920x1200 monitor for your desktop was simple; however, for various market reasons, 16:9 1920x1080 screens have come to dominate the retail space. If you're a retro gamer who wants accurate crispy pixels, the display market can seem pretty dire.

Here's a comparison of the above resolutions displayed on a 1920x1080 vs 1920x1200 panel while maintaining integer-scaling.

1200p vs 1080p retro scaling chart

A 1080p panel can't do 1600x1200 at all; you can't have 320x200 aspect-corrected, and none of the resolutions will fill the screen's full height.

On a 1920x1200 panel all but 640x480 and 1024x768 can achieve full-height integer scaling, including pixel-perfect aspect-correction for 320x200 4:3 content.
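
If you'd like to reproduce the chart's numbers, here's a sketch (my own, applying the same integer-scaling rules described above) that computes the largest integer scale each retro resolution can reach on the two panels:

    RETRO = [(320, 200), (320, 240), (640, 480),
             (800, 600), (1024, 768), (1600, 1200)]

    def max_integer_scale(w, h, panel_w, panel_h):
        """Largest integer factor by which w x h fits inside the panel
        (0 means it doesn't fit at all, even unscaled)."""
        return min(panel_w // w, panel_h // h)

    for panel_w, panel_h in [(1920, 1080), (1920, 1200)]:
        print(f"--- {panel_w}x{panel_h} panel ---")
        for w, h in RETRO:
            s = max_integer_scale(w, h, panel_w, panel_h)
            if s == 0:
                print(f"{w}x{h}: does not fit")
            else:
                note = " (full height)" if h * s == panel_h else ""
                print(f"{w}x{h}: {s}x -> {w * s}x{h * s}{note}")

The aspect-corrected 320x200 case is the 5x-by-6x mapping from earlier (1600x1200), which likewise only fits on the 1200-line panel.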


Now let’s take a moment to talk a bit about 4k displays. Somewhere along the line marketing rounded up 3840 to mean ‘4k’, so you end up with the following resolutions:

  • 4k 16:9 -> 3840x2160
  • 4k 16:10 -> 3840x2400

The best thing about 4k is that 720p and 1080p video scales perfectly into it (times 3 and times 2, respectively), meaning virtually all of your modern video content will look clean and crisp.

No one has announced one, but my future dream monitor is a 16:10 4k panel. It would have the same full-height scaling characteristics as the 1920x1200 display, but would also allow 640x480 to hit the monitor's full height while integer scaling, and could fit 720p and 1080p content to the screen's width while integer scaling.
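
The integer checks behind those claims are easy to do by hand; here they are as a few quick assertions (my arithmetic for a hypothetical 3840x2400 panel, not a spec for any real product):

    # A hypothetical 16:10 '4k' panel: 3840x2400.
    assert (640 * 5, 480 * 5) == (3200, 2400)     # 640x480 at 5x: full height
    assert (1280 * 3, 720 * 3) == (3840, 2160)    # 720p at 3x: full width
    assert (1920 * 2, 1080 * 2) == (3840, 2160)   # 1080p at 2x: full width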


A bit of an afterthought, but a lot of display panels have crappy scaling; even when they can properly scale by 2 or 3, they still cause blurring in the resulting picture. You can usually tell your video card to scale content instead of your display. AMD's scaling introduces blurriness when it shouldn't, but when I last checked, NVIDIA's did not.

It's also a problem with a lot of streaming video sites: they seem to pick a really small size for their player's window and don't offer any controls to set the window size to something that exactly matches the content, leading to decreased video quality for the user.


Also, sorry for the long delay in posting; I thought I'd be dead by now.

May your pixels be sharp as your skills,

Sarah

Xbox One And The Digital Dark Age 2013-05-25

According to reports, the new Xbox console must check in with Microsoft once every 24 hours. This is a huge departure from previous generations and it highlights one of the great tragedies of our time.

As an example, as long as the hardware is functional, I can still insert a cartridge into my nearly-30-year-old NES and have it work. Someone should be able to do the same 1000 years from now, provided the mechanical bits of the NES are still functional (or they can legally reproduce a working model otherwise).

What happens ten years from now when these Xbox One DRM servers are shut down? Occasionally vendors in similar situations will disable the check or allow the community to take over its operation, but there's no guarantee this will happen.

It's quite possible that by 2020, it will be illegal to play the Xbox One games you've purchased with your own money. Companies may 'promise' that this won't happen, but we've already seen lots of rights to various games get lost in one legal quagmire or another, leading to situations where no one actually knows who, if anyone, owns the IP to a particular game.

In the past, paying customers could modify the hardware or software to work around these issues, but this was made illegal (in the US, at least) by the Digital Millennium Copyright Act in 1998. The DMCA passed unanimously, by the way, showing that concern for the long-term protection of our cultural history is not on anyone's mind.

Yes, I do consider video games art. I think we can expect The Legend of Zelda and the like to be seen in the same sort of light as the Iliad is today: one of the first works in its form, influencing generations of artists and players. This will not happen anymore. No matter how good a Half-Life 3 is, it will eventually be locked away in some company's IP vault, never to see the light of day.

What will people a thousand years from now think happened to the art produced in this century? It's all locked away, its rights either lost in the graves of dead companies or hunted to extinction by their creators' IP-troll heirs.

They could either view eternal copyright and anti-DRM-circumvention laws as something similar to the burning of the Library of Alexandria, robbing the future of untold troves of art and knowledge, or they'll think nothing at all; that we were simply savages devoid of any real creativity beyond animated cat pictures. A true digital dark age.

Just like the first dark age, it's not dark because nothing happened; it's dark because no one can really tell what happened.

I once heard civilization described as a sense of permanence. What does it mean when our art is no longer permanent? Are we less civilized?

Maybe.


with love,

Sarah

Interviewing With Google 2013-03-14

A while back I was invited to interview with Google at their Kirkland, Washington facility. I probably signed an NDA or something, but I'm not going to talk about anything technical anyway. I'm mostly just going to talk about my observations of the process.

Google is weird.

I've worked in office environments pretty much my entire adult life. After a while they all tend to feel the same: sitting in cubicles, eye-rolling at endless office politics, with anything unique or interesting slowly and relentlessly hammered out of existence.

Google is different, but I'm not sure if it's more of an experiment against reality than something to try to emulate.

When I entered the main building, the first thing I noticed was an indoor rock-climbing wall, complete with Google-themed hand-holds. My weird-o-meter was immediately pegged.

During one part of the tour we stopped on a suspended walkway thing and looked down at the floor. The floor tiles below were the same blue, red, yellow, and green that make up the Google logo. My handler suggested there might be a pattern to the tile layout. (I suspect not; I doubt you could get a contractor to lay out tiles in a seemingly-random pattern without them messing up enough to make it impossible to decipher.)

Most of the decor in the facility followed the Google color scheme. It was all sort of eerie, like it's all really just a theme park with a secretive labyrinth below it all keeping things moving. I also don't know why they have to hate on orange and purple; those are some pretty good colors.

During the lunch-breaky part of the tour, we came across two Guitar Hero / Rock Band rooms, but the real winner was the completely-illegal, fully loaded MAME arcade cabinet. My handler and I tried to play a fighting game, but the controls were misconfigured (Google is so sloppy sometimes D:).

Hovering over their (free!) lunch room was a large print of this. Kind of cute, but let's get back to the issue of how crazy they are. Food in their cafeteria is free, they actually had a pretty wide selection of things too, unlike my previous experiences with free food, where it's pretty much just whatever the boss's secretary was craving that week that happened to fit in the budget. Though their idea of 'organic' fruit pretty much just means over-ripe and bruised.

They've also stashed numerous refrigerators full of energy drinks throughout the complex.

The whole free-food theme-park with vidja games and exercise is clearly created to keep people together and working as long as possible. It probably works pretty well at that too, though those are usually the first things to go whenever a company needs to cut expenses.

Hotties only

I'm not sure if it's just the local culture (probably not, I've been to that area more than once), or if (more likely) they pay for style-assistance services or something, but everyone there was quite stylish and/or pretty good looking, following the nerdy-chic Apple Store template.

My usual experience with System Engineer types is that they dress in whatever shirt they got for free from that one vendor two training classes ago that happens to be (marginally) clean, put on pants that are too short for their legs, and their only pair of black shoes (regardless of condition). There's nothing wrong with this approach at all; it's reasonably comfortable and doesn't require much time investment. In fact, style in the computer field usually works the exact opposite of how it does in other fields: the guy wearing shorts, a shaggy beard, and flip-flops is usually the smartest and most respected guy in the room (otherwise he wouldn't be able to pull that kind of stuff).

Either way, I'm not sure how Google managed to either convert hundreds (or thousands?!) of unstylish nerds into people who get haircuts more than once every six weeks, or find hundreds (or thousands!?) of them to hire, to the exclusion of everyone else.

Oh, they actually asked me questions.

The interview part of the day was pretty exhausting. I think it was about 6 hours long, and they had groups of 1-2 people come in and ask me questions; one of them would write a transcript of what went on (getting system engineers to do paperwork is more impressive than getting them to get haircuts).

The questions were not all that difficult. The first set of questions was about a situation very similar to something we went over in one of my phone interviews; the rest was just trying to figure out my basic understanding of things, plus some reasonably basic technical questions.

The odd thing comes later, where after the first couple of sessions you start hearing the same questions again. The first couple of times it happened I mentioned that someone had already asked me that, but they wanted me to keep going. After a while, you can tell what they're trying to do. They don't really care what you know, they just want to see if you can figure stuff out; which is fine, but you don't need to be disingenuous or appear sloppy because of it.

Needless to say, I didn't get it.

As typically happens for me with fly-out interviews, I received the call while I was in the airport waiting for my flight. To me, that always seems like the worst time to get that sort of news: you have several boring hours ahead of you where all you can do is think about what went wrong. You feel you've wasted time, and you always blame yourself.

One part of it has stuck with me more than anything, though: before I flew out for the interview, my mom said she was proud of me. It was the last time I ever heard it from her. Bleh.


With love, especially for orange and purple,

Sarah

Being In Love 2013-02-04

I'm going to talk about something I said I was going to avoid. Yay for consistency! Also, I might be drinking right now.

I was in a relationship with a man for almost nine years. It wasn't a very fulfilling one, even early on. I stuck with it hoping things would get better.

He rarely made me feel beautiful, the sex was always awful, our typical day was spent in the same room, not talking to each other. We were engaged for the last four years of our relationship. I never wanted to take his name. When I imagined my future, he was never there. To top it off, I was ready to die, and he didn't notice.

I'm being pretty hard on him; he's actually a nice guy (too nice in the bedroom) and a pretty good catch for someone who is a better match for him. Many of the issues we had together were actually my issues. I know he's going to find the wonderful person he deserves before long.

Anyway, while I was with him, I fell in love with someone else.

With her everything was different; she didn't just tell me I was beautiful, she made me feel it. Things in my life that were long-overdue problems suddenly seemed like an adventure to share. I was easily able to imagine spending my last day on earth with her. I was perfectly willing to take her last name (not really required, but still). Everything was so clear. I knew where I wanted to be, and who I wanted to share my life with. I was in love for the first time in a decade, and it felt great.

Then things changed, but, that's not what this post is about.

I think it's clear I miss the way she made me feel. I hope we all can find someone like that, even if it isn't forever.

with love,

Sarah

Butter Is Best 2013-01-30

Let's go back in time a few years. It's probably 2009 or 2010, and we're at our friend Steve's (some of you might remember him) house. They've made brownies! Yay! This is going to be great!

The brownies look as expected: the top has the sort of semi-glossy sheen that forms, and the interior is the sort of dense crumb you'd expect to see. However, the first bite goes horribly wrong. There is chocolate flavor, but it's covered up by a waxy funk.

Our hosts immediately note that these brownies are not good.

"I followed the directions on the box. I'm not sure what happened", Steve's (now ex) wife said.

She checks the directions to see if she missed anything. The directions for these (and most) brownies are to mix vegetable oil, eggs, and water with the powdered mix. She shrugs and assumes her eggs were bad.

I actually already know what the problem is, and head over to her pantry and pull out their vegetable oil. It's a large container, maybe 1.5 liters, nearly full. It appears to have been purchased maybe 3 years prior, and was probably opened around that time. I take a sniff of it. It smells like crayons.

I say, "Your oil is rancid"

"Vegetable oil goes bad?!", Steve's wife replies.

Yes. Vegetable oil goes bad. You really don't have very much time before it does, either. You can delay it from going rancid by stashing it in the refrigerator, but buying large containers of cooking oil is generally not a good idea unless you plan on doing some deep frying.

But wait, this story is about butter. Where is the butter?!

They tossed their rancid vegetable oil that night and called a few weeks later.

"I started making brownies and am out of vegetable oil!"

"Use butter", I reply

"But the directions! You can never violate the directions! My mom spent thirty years in juvie for that!", She counters

The reason the directions are written that way has less to do with the virtues of vegetable oil in a brownie (of which there are none) than with current and past fads. Butter has been mostly out of style for decades, hated on by health nuts who say that margarine or polyunsaturated vegetable oils are best. Boxed mixes also want to appear to be a good value, and vegetable oil is significantly cheaper than butter.

She reluctantly makes the brownies with butter. She calls back later, telling me that these are the best brownies she's ever tasted, way better than ones made following the directions on the box (even with non-rancid oil).

Basically, the lesson is that butter is magic.

Why though? Why is butter so much better here?

There are multiple causes, though the most important is actually butter's melting point.

Butter is a soft solid at room temperature, but it melts just a bit below human body temperature. This means each bite starts out tender, and then the butter melts in your mouth, carrying flavor and the appearance of moistness.

Let's examine alternatives in our brownies to see how they stack up against the butter.

The vegetable oil called for in the directions has a melting point far below human body temperature. This leads to the brownie feeling less tender, and, were it not for the special chemicals they add to the brownie mix, the oil would actually mostly leak out or pool, leaving parts of the brownie tough.

Shortening and lard both have melting temperatures just above, or a ways above, human body temperature. They make tender brownies, but they sort of leave a weird feeling in your mouth because the fat isn't melting. This mostly manifests in the brownie being called a bit 'dry', even though it's really not any less moist than the other brownies.

Margarine comes in two basic forms, those still made with hydrogenated oils, and those that avoid them by suspending vegetable oil in various binding agents to simulate the desired texture. Margarine with hydrogenated oils will make a tender and moist brownie, but will contain very unhealthy trans-fats. The other margarine will usually end up having its binding agents squeeze their oil out in the heat of the oven and make something similar to the vegetable oil brownies.

The butter-is-better thing actually holds true for almost every baked good. Butter has its own, rich, subtle flavor, and imparts the best mouth-feel possible.

So, a quick recap:

  • Butter is best because it melts in your mouth.
  • Margarine is either terrible for you or unsuitable for baking.
  • Avoid rancid oils entirely; you can tell if your oil is rancid by smelling it. A fully refined oil should smell like nothing; if it smells like crayons or fish, throw it away. Non-refined oils will have their own characteristic smell, but waxiness or fishiness is usually an indicator of rancidity.
  • Don't be afraid or ashamed of violating directions to make food better.

But wait, my sister married a vegan! No butter allowed!!! D:

Is there any hope?

There is actually one plant-based oil that is solid at room temperature (for most people, anyway) but also melts in your mouth. You can use refined coconut oil in your baked goods in place of the butter, and the result will be quite excellent. The main problem with it is that its melting point is 76F / 24C, and it seems a lot of people consider higher temperatures acceptable for their homes and offices. These people are secretly cold-blooded lizards and must be watched closely, but the problem of having the goodness melt out of your baked goods means you need to take some care to ensure they don't get above that temperature after their initial cooling.

May more than just butter melt on your tongue,

Sarah