I’ve been visiting my parents for the last couple of weeks. I introduced my history-loving father to the series Band of Brothers. For those of you not familiar with it, Band of Brothers is a mini-series from 2001 that depicts the 101st Airborne Division in WW2, following them from training camp, through D-Day, the Battle of the Bulge, and eventually the end of the war.
My father has a relatively new OLED TV. The image quality is simply gorgeous, especially at night when the light in the room is low. But Band of Brothers looked weird on his TV. It took me a moment to realize what was wrong. If you paused the video, the image looked spectacular. When it was playing, it looked oddly smooth. The problem was that the movie was filmed at 24 frames per second but the TV was showing it at 120 frames per second. Between each frame of the film, the TV interpolated 4 additional frames so that the motion was glassy smooth. That made the image look technically better but artistically worse.
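If you’re curious where those extra frames come from, here’s a toy sketch. Real TVs use motion-compensated interpolation that is far more sophisticated, and the smooth_to_120fps function below is something I made up purely for illustration - it just blends each pair of neighboring frames linearly - but it shows the five-outputs-for-one-input ratio that turns 24 frames per second into 120.

```python
# Toy illustration of 24 fps -> 120 fps motion smoothing.
# Real TVs use motion-compensated interpolation; this just blends
# linearly between consecutive frames, but it shows where the
# 4 extra frames per original frame come from.

def smooth_to_120fps(frames):
    """Insert 4 blended frames between each consecutive pair of 24 fps frames."""
    out = []
    for a, b in zip(frames, frames[1:]):
        for step in range(5):            # 5 output frames per input frame
            t = step / 5                 # 0.0, 0.2, 0.4, 0.6, 0.8
            out.append((1 - t) * a + t * b)
    out.append(frames[-1])               # keep the final frame
    return out

# Pretend each "frame" is a single brightness value going from dark to bright and back down.
original = [0.0, 100.0, 50.0]            # 3 frames of 24 fps "video"
print([round(x, 1) for x in smooth_to_120fps(original)])
# [0.0, 20.0, 40.0, 60.0, 80.0, 100.0, 90.0, 80.0, 70.0, 60.0, 50.0]
```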
Frames per second is the term for how many pictures are displayed per second in a film. In the very early days of movies, there was no standard frame rate. Cameras were cranked by hand, and speeds varied from roughly 15 to 24 frames per second. Anything slower and it would look like a bunch of individual pictures rather than motion. Higher frame rates meant smoother-looking images, but film was expensive, and higher frame rates used more film. When sound was introduced in 1927, moving the film at a constant speed became necessary for the sound to work properly, and Hollywood standardized on 24 frames per second. Almost every Hollywood movie since then has been shot at 24 frames per second.
TV was different. Before High Definition TV introduced a bunch of new standards, TV in the United States was broadcast at 60 images per second (technically 60 interlaced fields, or about 30 full frames). That wasn’t done to give you smoother, better-looking motion on TV. The best clue as to why it was done is that in Europe TV was broadcast at 50 fields per second. If you remember from last week, the US electrical grids run at 60 Hz and the European grids run at 50 Hz. TV video needs a timing signal, and the easiest way to get one is to just use the frequency of the attached AC grid.
So if 60 images per second is smoother than 24 frames per second, why was a made-for-TV series like Band of Brothers shot at 24 frames per second, and why did I think it looked weird on my father’s TV at 120 frames per second? It’s because we are used to seeing movies at 24 frames per second, so anything smoother looks odd. In fact, when TVs first introduced the ability to smooth a 24 fps video to 120 fps, people called it the “soap opera effect” because we were used to movies being shot at 24 fps and soap operas being shot on smooth 60-fields-per-second video. Even though faster frame rates look objectively better, we’ve learned that 24 fps looks “cinematic.” It’s not objectively better. It’s just what we are used to. And that is my topic for today - things that we take for granted as being correct or better or normal because they follow a custom that we accept as normal.
Our number system is another good example. Virtually everyone in the world uses base 10 numbers. Sure, geeky computer people sometimes use base 16 (hexadecimal) or base 2 (binary), but only when talking to computers. By base 10, I mean a number system with a ones place, a tens place, a hundreds place, and so on. We could have used something like base 8 and counted 1, 2, 3, 4, 5, 6, 7, 10, 11, 12, 13, 14, 15, 16, 17, 20, and so on. That’s not skipping the numbers 8, 9, and 10. Base 8 uses 10 for what we call 8, 11 for what we call 9, and so on.
Why did we pick base 10? The clue is in the term digit. The word digit is what we call the numerals from 0 to 9 and also what we call our fingers. We used base 10 because we have 10 fingers, and before calculators, we counted on our fingers. We might have used base 20 and included our toes, but people were really poor back when counting started and they probably thought that 10 was the biggest number anyone was likely to need.
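Just to make the base business concrete, here’s a minimal sketch of writing ordinary numbers in other bases. The to_base function is an illustrative helper I made up, not anything standard, but counting with it in base 8 reproduces the sequence above.

```python
# A tiny sketch of writing ordinary integers in other bases.
# to_base is an illustrative helper, not a standard library call.
DIGITS = "0123456789ABCDEF"

def to_base(n, base):
    """Return the non-negative integer n written in the given base (2-16)."""
    if n == 0:
        return "0"
    out = ""
    while n > 0:
        out = DIGITS[n % base] + out   # peel off the lowest-order digit
        n //= base
    return out

# Counting from 1 up to what we call 16 reproduces the base 8 sequence above:
print([to_base(i, 8) for i in range(1, 17)])
# ['1', '2', '3', '4', '5', '6', '7', '10', '11', '12',
#  '13', '14', '15', '16', '17', '20']

print(to_base(9, 8))     # '11' - base 8's name for what we call 9
print(to_base(255, 16))  # 'FF' - the hexadecimal the geeky computer people use
```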
The ancient Babylonians used a sexagesimal system - base 60. Sort of. If you look at their numbers, it looks like they used base 10 for their symbols but base 60 for their digit positions. It looks very confusing, which may be why modern Babylonians don’t use it.
My point is that we take it for granted that everyone uses base 10, but that’s just an accident of history. If humans were AI-generated and had 16 fingers, we would probably be using hexadecimal, which would have made developing computers easier, because 16 is a power of 2. In the computer nerd world, we divide people into 10 groups - those that understand binary and those that don’t.
Some of the things we take for granted appeared for less logical reasons. Take the modern computer keyboard with its QWERTY key layout. Why that order and not alphabetical order, or positioning the most frequently used keys where we rest our fingers? QWERTY was designed back in the era of mechanical typewriters, and a problem with early typewriters was the letter hammers colliding with each other. To work around that, the designer spread out the keys most frequently used together (when typing English words) so that typists could type quickly without jamming up their hammers. That was roughly 150 years ago, and we’re still stuck with a layout designed around the limitations of mechanical hammers. There have been some better designs created, like the Dvorak keyboard (named after the inventor rather than the key layout - there was nobody named Qwerty). But the Dvorak keyboard came out in 1936 and still hasn’t caught on, so don’t hold your breath that typing is going to get easier. Once people get used to an idea, it can be hard to get them to switch to something better. Just look at the way we in the US continue to teach kids about inches, feet, and miles, with their weird ratios of 12 inches to a foot and 5,280 feet to a mile, rather than switching to centimeters, meters, and kilometers like the rest of the world.
The fact that we read from left to right and top to bottom is also an arbitrary convention. Arabic and Hebrew are the only languages I know of that go right to left, and look at the conflicts that has caused. I’m guessing that they were invented by a sinister group of lefties (talking about handedness and not political alignment). Our left-to-right languages are much easier to write for proper right-handed people because they aren’t smearing their writing as they go along. Some languages are written from top to bottom. I believe that is true of Chinese, Japanese, Korean, and Vietnamese. The only languages I could find that are written bottom to top are Batak, Hanunó’o, and Tagbanwa, which are related languages from the Philippines. Oh, and we write stuff on roads and sidewalks from bottom to top, which always seems confusing.
It seems particularly strange that Arabic is written right to left but Arabic numbers are written left to right. Reading Arabic financial reports must be hard on the eyes. I wonder if they are trying to compensate by not allowing interest. By the way, what we call Arabic numerals weren’t invented by the Arabs. We call them that because we adopted them from the Arabs, but they got them from the Indians. Indians were using proper numerals back when Roman engineers were trying to do math using numbers like LVII and CMXLVII. It's no wonder that Rome fell and that the best math teachers on YouTube are Indian.
At this point, I was going to explain how the Western musical system with its octaves and notes is arbitrary - that octaves don’t need to be based on frequency doublings and could instead be based on other arbitrarily chosen math. The problem is that, despite having listened to Western music for almost 60 years, I really don’t understand how it works. While I can happily make claims about Tagbanwa without fear that anyone reading this will realize that I don’t know what I’m talking about, I suspect that a lot of you are band nerd types who will quickly out me for a fool if I say anything more than I already have about music.
Another arbitrary convention is our habit of driving on the right side of the road. While there are still some island nations like the UK, New Zealand, Japan, and Australia that drive on the wrong side of the road, all of the sensible nations drive on the right side. According to the Federal Highway Administration, people in the US drove on the right side of the road because drivers of wagons sat on the right side and the risk of driving into a ditch was a bigger concern than oncoming traffic. The first cars with steering wheels copied this convention, but the Model T Ford bucked the trend and put the driver on the left to be closer to the middle of the road. It sold so well that everyone copied it.
Not all of our arbitrary-seeming conventions are so old. Today, we think of the Democratic Party as blue and the Republican Party as red, but that didn’t become the standard until the 2000 Bush v. Gore election. It seems weird now, but there was no standard color scheme in prior elections. In fact, if you go back to before the mid-1960s, TV news didn’t display election maps in different colors at all, presumably because people were less partisan back then (although the lack of color TVs may have also played a role). You may think I’m joking about the change in partisanship, and I am, but some researchers desperate for something to publish did a study purporting to show that using green/orange instead of red/blue decreased perceived partisanship. Maybe if Biden dyed his skin green, we would see a decrease in perceived partisanship.
Why is blue used to represent cold and red hot on things like AC controls? Blue light has a higher frequency and a shorter wavelength, so you’d think that it would be hotter. Blue stars are hotter than red stars. I guess that the convention began because we are used to reddish things like fire being hot and blueish things like ice being cold. And why is blue a male color while pink is a female color? Up until the 1920s, pink was considered a color for boys. It was a lighter version of the then-masculine color red. But between the 1920s and 1940s, social convention shifted and pink came to be seen as a feminine color.
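For the star part at least, there’s a quick back-of-the-envelope check using Wien’s displacement law, which says a glowing object’s peak emission wavelength is inversely proportional to its temperature - hotter means bluer. The temperatures below are just round, illustrative numbers for a reddish star and a bluish star, not measurements of any particular stars.

```python
# Wien's displacement law: peak emission wavelength = b / T, where
# b is about 2.898e-3 meter-kelvin. Hotter objects peak at shorter,
# bluer wavelengths; cooler ones peak at longer, redder wavelengths.
WIEN_B = 2.898e-3  # m*K

def peak_wavelength_nm(temperature_kelvin):
    """Peak emission wavelength in nanometers for a blackbody at this temperature."""
    return WIEN_B / temperature_kelvin * 1e9  # meters -> nanometers

# Round, illustrative temperatures for a reddish star and a bluish star.
print(round(peak_wavelength_nm(3_000)))   # ~966 nm, out toward the infrared
print(round(peak_wavelength_nm(10_000)))  # ~290 nm, out toward the ultraviolet
```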
I’m sure that I’m missing lots of great examples of arbitrary conventions that are so ingrained that we don’t even think about them. That’s what makes it so hard for me to think about them. They just blend into our lives and we generally don’t notice them until we see an exception that defies our expectations. And that’s why Band of Brothers looked weird on my father’s TV. The motion looked smooth, like a modern TV show rather than choppy like a movie.
The choice of how Band of Brothers was filmed was full of conventions that help us see it as a realistic portrayal of World War II. It wasn’t just that it was filmed at 24 frames per second. They also used an unusually fast shutter. Normally, when something is filmed at 24 frames per second, the camera’s shutter is open for 1/48 of a second. For many of the action sequences, they used a much faster shutter speed, often 1/200 of a second, which makes the motion look even choppier and less fluid. Doing that helps increase the sense of disorientation in the battle scenes. They deliberately don’t look like reality.
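The arithmetic behind that convention is simple enough to sketch out. The snippet below just works through how long the shutter stays open per frame under the usual 1/48-of-a-second convention versus the 1/200-of-a-second figure mentioned above, and what fraction of each frame’s duration can pick up motion blur. The numbers are back-of-the-envelope, not production details from the show.

```python
# The 180-degree shutter convention: at 24 fps, each frame lasts about
# 1/24 of a second, and the shutter is open for half of that (1/48 s).
# A fast 1/200 s shutter exposes a much smaller slice of each frame,
# so there is less motion blur and playback looks choppier.
FPS = 24
frame_duration = 1 / FPS                     # ~41.7 ms per frame

for shutter in (1 / 48, 1 / 200):            # conventional vs. fast shutter
    blur_fraction = shutter / frame_duration
    print(f"1/{round(1 / shutter)} s shutter: "
          f"{shutter * 1000:.1f} ms exposed, "
          f"{blur_fraction:.0%} of each frame")
# 1/48 s shutter: 20.8 ms exposed, 50% of each frame
# 1/200 s shutter: 5.0 ms exposed, 12% of each frame
```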
They also used a muted color palette. Having fewer colors makes the show look more period-correct. To understand why, picture a scene from back in the 1920s, preferably in a place like New York City. What does it look like? If you are like most people, the scene you picture in your mind is in black and white, because that’s what you are used to seeing for the 1920s. I wasn’t there back then, but I’m pretty confident that buildings were sometimes brick red, trees had brown bark and green leaves, and boys wore pink clothes (or so I’ve been told). So when you are depicting something from the era when virtually all news footage and pictures were in black and white, you want to depict it with a more limited color palette than reality to better match how we think of that period. For Band of Brothers, it also helped that the bleak colors emotionally evoked the bleakness of war.
We’re back to the beginning. Band of Brothers looked unreal to me not because my father’s television showed it less accurately, but because it no longer looked inaccurate in the way it was designed to look. The way directors and cinematographers make things look is based on a lot of arbitrary-seeming conventions we are accustomed to. So choppy motion and muted colors look more “realistic” for a WW2 battle than something with a more accurate look. It’s a strange world, by convention.
I wonder whether industry conventions, on net, drive more standardized thinking (less innovation) or more innovation through collaboration.