Recording MTGO in 4K with OBS

One of the perennial complaints about MTGO streams and recordings is how difficult the cards are to read. And it's no surprise — pretty much any program would struggle with the requirements that Magic imposes: cramming sometimes obscene amounts of text into an eye-wateringly small rendering area of less than a dozen square centimeters.

This problem is exacerbated by the fact that most MTGO streamers record at the standard resolution of 1080p. There are simply not enough pixels available to legibly render a font in such a small area at such a low resolution. Here is an example of what I mean:

Demonstration of 1080p vs. 4K resolution
Left: 1080p Channel Fireball recording
Middle: 4K recording, smallest hand size
Right: 4K recording, standard hand size

This is not meant to pick on Channel Fireball. Their content is impeccable, but much of it is hard to read unless you know exactly what is going on. To be clear, it's not just Channel Fireball; even professional content put out by Wizards of the Coast suffers from similar issues. Unless you follow Standard, I suspect you'll have a hard time telling me what these cards from the recent Magic Online Championship do:

MTGO Championship Screenshot


Is 4K really that big of a deal?

YouTube displays recordings in a variety of resolutions, from 144p all the way up to 2160p (4K). It may not seem as if there is a big difference between 1080p and 2160p, but remember that the “1080” in “1080p” only refers to the number of vertical pixels. In terms of overall pixels, there is a pretty vast gulf between 1080p and 4K:

YouTube   Resolution   Pixels      % of 1080p
4K        3840x2160    8,294,400   400%
1440p     2560x1440    3,686,400   178%
1080p     1920x1080    2,073,600   100%
720p      1280x720       921,600   44%
480p      854x480        409,920   20%
360p      640x360        230,400   11%
240p      426x240        102,240   5%
144p      256x144         36,864   2%

As you can see, 4K gives you four times as many pixels (8,294,400 vs. 2,073,600) to render legible text in the exact same amount of screen space. That's why the Deathrite Shaman on the right can clearly display its entire text box despite taking up the exact same amount of space as the Deathrite Shaman on the left.

Demonstration of 1080p vs. 4K resolution


Cranking OBS to 11

Before I began recording, I simply assumed that 1080p recordings were a matter of inertia. Everybody recorded in 1080p, so what was the point in trying to bump the resolution up to 4K? After all, 4K means more CPU usage, larger files, and slower downloads. Why bother when nobody else was doing it?

It turns out my assumptions were way, way wrong: it's actually really difficult to record in 4K while playing MTGO. My machine isn't top-of-the-line, but it's nothing to scoff at: a 2013 MacBook Pro with a quad-core 2.6GHz i7 and 16GB of memory.

I assumed that all I'd have to do was tell OBS to record at the native “Retina” resolution (3360x2100) and I could go on my merry way. What happened when I did that?

OBS CPU Usage

PAIN. Telling OBS to record at 4K in real time took so much CPU that my machine was rendered completely useless. I couldn't actually play MTGO, because each click took over 15 seconds to register.

I then tried telling OBS to use one of its faster CPU settings (ultrafast), but the image quality came out very poor, with lots of noise and other encoding artifacts:

OBS on Ultrafast


Path to 4K

This process left me with a whole new respect and understanding for the professionals who do these recordings. It's simply impossible to have all three of:

  1. High quality 4K recordings
  2. Low CPU usage
  3. Manageably-sized video files that you can upload directly to YouTube

I realized I'd have to compromise and do my recordings in two steps:

  1. Record at high quality with low CPU usage, but large file sizes
  2. Re-encode post-recording to generate high-quality videos with low file sizes

This is certainly more work, and takes a lot more time. But it comes with some benefits:

  1. The recording uses very little CPU, instead of causing the typical OBS lag
  2. The re-encoding takes 8-12 hours, but the settings used result in YouTube quickly generating all the other resolutions post-upload
  3. My screen's resolution (3360x2100) is actually lower than 4K (3840x2160), but the re-encoding lets me upscale to 4K


The Technical Details

So how did I do it? Here are my OBS video settings:

  1. Recording Format: mp4
  2. Encoder: x264
  3. Rescale Output: unchecked (native resolution)
  4. Rate Control: CRF
  5. CRF: 12
  6. Keyframe Interval: 0 (auto)
  7. CPU Usage Preset: superfast
  8. Tune: stillimage
  9. Variable Framerate (VFR): checked

These settings generate very high quality recordings that average about 1GB for every ten minutes of recording. Lowering the CRF value leads to higher quality files at the cost of increased CPU usage, and 12 was the highest quality my machine could handle. If you find these settings too aggressive, bump CRF to a higher number.
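If you want a feel for what these encoder settings produce before committing to a long recording session, you can approximate them with ffmpeg on any existing clip. This is just a sketch for experimentation (sample.mp4 is a placeholder, and OBS does not literally shell out to ffmpeg like this):

$ ffmpeg -i sample.mp4 \
  -c:v libx264 \
  -crf 12 \
  -preset superfast \
  -tune stillimage \
  output.mp4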

Once I am finished recording, I have an automated job that upscales and re-encodes with ffmpeg, using the optimal YouTube video settings:

$ ffmpeg -i input.mp4 \
  -c:v libx264 \
  -crf 21 \
  -tune stillimage \
  -bf 2 \
  -c:a copy \
  -pix_fmt yuv420p \
  -flags +cgop \
  -sws_flags lanczos \
  -movflags faststart \
  -vf scale=-1:2160 \
  output.mp4

You don't need to know what all those settings do. Suffice it to say, the generated files are perfect for YouTube and have no perceptible loss in quality despite being 75% smaller.
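The automated job itself doesn't need to be anything fancy. Here is a minimal sketch, assuming raw recordings land in a raw/ directory and re-encodes go to encoded/ (both directory names are hypothetical):

#!/bin/sh
# Re-encode every raw OBS recording in ./raw into ./encoded,
# using the same ffmpeg settings shown above.
mkdir -p encoded
for f in raw/*.mp4; do
  ffmpeg -i "$f" \
    -c:v libx264 -crf 21 -tune stillimage -bf 2 \
    -c:a copy -pix_fmt yuv420p -flags +cgop \
    -sws_flags lanczos -movflags faststart \
    -vf scale=-1:2160 \
    "encoded/$(basename "$f")"
done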

How big of a difference does 4K make in practice? All I can say is to take a gander at one of my recent videos and witness the 4K difference for yourself.

[Category: Magic] [Permalink]


Understanding CORS

RTFM… just kidding! There is no manual for the CORS (Cross-Origin Resource Sharing) specification. I really had you going there, didn't I?

Don't worry, it's not your fault. After all, here is what a Google search provides:

Google Results for searching for CORS documentation

Each of these sites contains a wealth of information about CORS, and each of them is far over the head of your average developer. Given the frequent questions that I receive from confused and frightened developers trying to understand these documents, I thought it might be helpful to boil CORS down into a couple simple examples.

Q. If I have static content that depends neither upon cookies nor user-specific URLs and/or parameters and I want to share my site's content with the web, what should I do?

A.

Access-Control-Allow-Origin: *
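You can verify the header is actually being served with a quick curl (the URL here is a placeholder for your own static content):

$ curl -sI https://static.example.org/logo.png | grep -i '^access-control'
Access-Control-Allow-Origin: *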


Q. Well, that is great and all. But what if I want to let a foreign website interact with my site, as a logged-in user, allowing them to do anything they could as if they were on my site? I swear that I understand the risks that this entails and that I really trust this other site to not make any security mistakes such as falling victim to a cross-site scripting (XSS) attack.

A.

Access-Control-Allow-Credentials: true
Access-Control-Allow-Methods: GET, HEAD, OPTIONS, POST, PUT
Access-Control-Allow-Origin: https://example.com
Access-Control-Expose-Headers: X-Poop-Emoji
Access-Control-Max-Age: 300

Where these headers mean the following:

  • Access-Control-Allow-Credentials means that the user's cookies (such as their session cookies) will be sent with the request
  • Access-Control-Allow-Origin is the single whitelisted origin, matching the Origin header sent by the browser; it must not be * and must not be blindly reflected back

And these optional headers mean the following:

  • Access-Control-Allow-Methods is the list of allowed HTTP methods beyond the simple methods (GET, HEAD, and POST)
  • Access-Control-Expose-Headers allows example.com to read the contents of the X-Poop-Emoji header (💩, obviously)
  • Access-Control-Max-Age allows example.com to make these requests without preflights for the next 300 seconds
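You can watch a preflight happen by hand with curl, simulating what the browser sends before a credentialed PUT (both hostnames are placeholders; example.com plays the trusted foreign site from above):

$ curl -si -X OPTIONS https://api.example.org/widgets \
    -H 'Origin: https://example.com' \
    -H 'Access-Control-Request-Method: PUT' \
    -H 'Access-Control-Request-Headers: Content-Type'

A correctly configured server answers with the Access-Control-* headers shown above, and the browser caches that answer for Access-Control-Max-Age seconds before preflighting that resource again.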

Again, please be aware that you need to be very careful with Access-Control-Allow-Credentials. Even if you think you're safe by only allowing idempotent methods such as GET, that might be enough to steal an anti-CSRF token and let attackers go to town with CSRF attacks.

If you need additional documentation about other features in CORS, I highly recommend the frustratingly hard to locate CORS for Developers document by Brad Hill.

[Category: Security] [Permalink]


Analysis of the Alexa Top 1M sites (October 2016)

Last April, I ran a scan of the Alexa Top 1M websites using the Mozilla Observatory. The results were dire, indicating a broad lack of awareness around modern security technologies such as Content Security Policy, Strict Transport Security, Subresource Integrity, and others.

But that was six months ago, and the Mozilla Observatory was publicly released almost two months ago. I was curious whether the internet had made significant improvements in the meantime. After all, in those two months, the Observatory has scanned approximately 1.3M sites, totalling over 2.5M scans.

With that in mind, I ran a new scan of the Alexa Top 1M at the end of October, and here is what I found:

Technology                                  April 2016             October 2016            % Change
Content Security Policy (CSP)               .005% [1] / .012% [2]  .008% [1] / .021% [2]   +60%
Cookies (Secure/HttpOnly) [3]               1.88%                  2.44%                   +30%
Cross-origin Resource Sharing (CORS) [4]    93.78%                 96.21%                  +3%
HTTPS                                       29.64%                 33.57%                  +13%
HTTP → HTTPS Redirection                    5.06% [5] / 8.91% [6]  7.94% [5] / 13.29% [6]  +57%
Public Key Pinning (HPKP)                   0.43%                  0.50%                   +16%
  — HPKP Preloaded [7]                      0.41%                  0.47%                   +15%
Strict Transport Security (HSTS) [8]        1.75%                  2.59%                   +48%
  — HSTS Preloaded [7]                      .158%                  .231%                   +46%
Subresource Integrity (SRI)                 0.015% [9]             0.052% [10]             +247%
X-Content-Type-Options (XCTO)               6.19%                  7.22%                   +17%
X-Frame-Options (XFO) [11]                  6.83%                  8.78%                   +29%
X-XSS-Protection (XXSSP) [12]               5.03%                  6.33%                   +26%
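If you're curious where your own site stands, most of these headers are easy to spot-check with curl (swap in your own domain for example.com):

$ curl -sI https://example.com/ | grep -iE \
    '^(strict-transport-security|public-key-pins|content-security-policy|x-content-type-options|x-frame-options|x-xss-protection)'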

I'll admit, I was a bit taken aback by the overall improvement across the top million sites, especially as some of these security technologies are almost a decade old.

When we did our initial scan of the top million six months ago, a stunning 97.6% of websites were given a failing grade from the Observatory. Have those results changed since then, given the improvements above?

Grade   April 2016   October 2016   % Change
A+      .003%        .008%          +167%
A       .006%        .012%          +100%
B       .202%        .347%          +72%
C       .321%        .727%          +126%
D       1.87%        2.82%          +51%
F       97.60%       96.09%         -1.5%

While a decrease of 1.5% in failing grades might seem like only a small improvement, the latest Observatory scan contained 962,011 successful scans. With each percentage point representing nearly ten thousand sites, a drop from 97.60% to 96.09% represents approximately fifteen thousand top websites making significant improvements in their security.

I'm excited for the possibility of seeing further improvements as additional surveys are completed. Please share the Mozilla Observatory and help to make the internet a safer and more secure place for everyone!



Footnotes:

  1. Allows 'unsafe-inline' in neither script-src nor style-src
  2. Allows 'unsafe-inline' in style-src only
  3. Amongst sites that set cookies
  4. Disallows foreign origins from reading the domain's contents within user's context
  5. Redirects from HTTP to HTTPS on the same domain, which allows HSTS to be set
  6. Redirects from HTTP to HTTPS, regardless of the final domain
  7. As listed in the Chromium preload list
  8. max-age set to at least six months
  9. Percentage is of sites that load scripts from a foreign origin
  10. Percentage is of sites that load scripts
  11. CSP frame-ancestors directive is allowed in lieu of an XFO header
  12. Strong CSP policy forbidding 'unsafe-inline' is allowed in lieu of an XXSSP header

[Category: Security] [Permalink]


Let's Encrypt now supports IDNs

Today was a huge leap forward for humankind, for it marks the day that Let's Encrypt began supporting internationalized domain names (IDNs). That means that you can now get certs with non-ASCII characters in them, which will go a long way toward helping Let's Encrypt improve HTTPS uptake in countries whose languages need characters outside the traditional ASCII set.

More importantly for me, it means that https://👉👁.pokeinthe.io is now a thing.

How did I do this? First, you must transform the Unicode (in this case, the 👉👁 emoji) into what is called punycode. Punycode is simply a method of representing Unicode characters in ASCII, the only characters supported by the domain name system (DNS). There are many ways to do the conversion, including a simple tool at punycoder.com. For 👉👁, the punycode encoding is xn--mp8hpa.
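If you'd rather not trust a website with the conversion, Python's built-in punycode codec produces the same encoding; just remember to add the xn-- prefix yourself (a quick sketch that skips the fuller IDNA processing, which doesn't matter for a simple emoji label):

$ python3 -c "print('xn--' + '👉👁'.encode('punycode').decode())"
xn--mp8hpa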

I simply set up DNS for xn--mp8hpa.pokeinthe.io, updated my nginx configuration to include xn--mp8hpa.pokeinthe.io in its server_name parameter, and requested a cert using my favorite Let's Encrypt client (lego):

root@pokeinthe:~# /opt/go/bin/lego -d pokeinthe.io -d www.pokeinthe.io -d 'xn--mp8hpa.pokeinthe.io' --email 'april@pokeinthe.io' --accept-tos -k ec384 --webroot /var/www/pokeinthe.io --path '/etc/lego' run
2016/10/21 17:30:02 [INFO][pokeinthe.io, www.pokeinthe.io, xn--mp8hpa.pokeinthe.io] acme: Obtaining bundled SAN certificate
2016/10/21 17:30:03 [INFO][pokeinthe.io] acme: Authorization already valid; skipping challenge
2016/10/21 17:30:03 [INFO][www.pokeinthe.io] acme: Authorization already valid; skipping challenge
2016/10/21 17:30:03 [INFO][xn--mp8hpa.pokeinthe.io] acme: Could not find solver for: tls-sni-01
2016/10/21 17:30:03 [INFO][xn--mp8hpa.pokeinthe.io] acme: Trying to solve HTTP-01
2016/10/21 17:30:04 [INFO][xn--mp8hpa.pokeinthe.io] The server validated our request
2016/10/21 17:30:04 [INFO][pokeinthe.io, www.pokeinthe.io, xn--mp8hpa.pokeinthe.io] acme: Validations succeeded; requesting certificates
2016/10/21 17:30:04 [INFO] acme: Requesting issuer cert from https://acme-v01.api.letsencrypt.org/acme/issuer-cert
2016/10/21 17:30:04 [INFO][pokeinthe.io] Server responded with a certificate.

A simple reload of nginx later, and my blog is available where it always should have been.

[Category: Security] [Permalink]