How many times have you had the frustrating experience of creating (or watching) a video only to discover it's pixelated, grainy and fuzzy around the edges? Especially older videos! We'd bet good money that your mouse has strayed to YouTube's 'Video Settings' pane in a desperate bid to up the resolution, only to discover (oh, the horror!) that the list only goes as high as 360p.
It's easy to forget in these days of 1080p HD that, not that long ago, HD content was the exclusive domain of professional film studios with deep enough reserves in the bank to produce crystal-clear content, backed by an army of post-producers, cinematographers, broadcast engineers and the like.
Heck, even today, plenty of independent creators (many of whom are a one-man army simultaneously juggling the roles of director, talent, editor and post-producer) struggle to create high-resolution content that is broadly compatible with, and looks good on, the latest generation of display technology.
Upscaling for the masses
Enter: upscaling, the process of taking footage that was captured at a lower resolution and ‘scaling’ it to a different resolution. The ‘up’ in ‘upscaling’ refers to increasing the resolution, but you can also downscale footage using technology based on the same principles.
For most of its history (and most commonly still), upscaling has been handled by consumer electronics like televisions and computers at the point of delivery. The situation usually plays out like this: when older footage shot at a lower resolution has to be shown on a high-resolution screen, a hardware chip (in the case of TVs) or GPU (in the case of computers) applies the most rudimentary forms of upscaling, bilinear and bicubic interpolation, to ‘stretch’ the image across the screen, often resulting in blurry video.
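To make that ‘stretching’ concrete, here is a minimal sketch of what bilinear and bicubic interpolation actually do, written in Python with OpenCV's standard resize functions. The input file name and target resolution are placeholders for illustration, not part of any particular TV's or GPU's pipeline.

```python
# Minimal sketch of hardware-style upscaling: bilinear and bicubic
# interpolation via OpenCV. "frame_360p.png" is a placeholder for any
# low-resolution source frame.
import cv2

frame = cv2.imread("frame_360p.png")   # e.g. a 640x360 source frame
target = (1920, 1080)                  # (width, height) of the display

# Bilinear: averages the 4 nearest source pixels -> fast, but soft edges.
bilinear = cv2.resize(frame, target, interpolation=cv2.INTER_LINEAR)

# Bicubic: fits a cubic curve over a 4x4 neighbourhood -> slightly sharper,
# but still cannot invent detail that was never captured.
bicubic = cv2.resize(frame, target, interpolation=cv2.INTER_CUBIC)

cv2.imwrite("bilinear_1080p.png", bilinear)
cv2.imwrite("bicubic_1080p.png", bicubic)
```

Both methods simply interpolate between the pixels that already exist, which is exactly why this kind of upscaled footage looks soft: no new detail is created, the available pixels are just spread across a larger canvas.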
This upscaling mechanism left a lot to be desired, though, so those who could afford it hired experts to manually enhance and upscale their footage rather than relying on hardware chips and GPUs to do a substandard job. The only problem was (and still is) that manual video enhancement and upscaling are not just incredibly expensive but also incredibly time-consuming, requiring years of practice and experience to get just right. That puts them out of reach for most independent creators in the industry.
At Pixop, we saw a huge gap in the market for simple, accessible video enhancement and upscaling, and we had a pretty good idea of how to deliver it to the hungry masses: build automated filters using AI and ML so that pretty much anyone with a basic familiarity with video editing can use them, and deliver those filters via a pay-per-use, cloud-based platform that doesn't require expensive hardware or software to run.
And it seems to be working! Since we began operations, we've had the pleasure of helping many independent creators make their footage the best it can be using the filters we offer. We previously wrote about one here, and we recently sat down with another independent creator, Brit Tobin, to ask her about her experience using Pixop’s AI- and ML-powered Deep Restoration 2 algorithm.
But before we get into the interview, let's do a quick refresher on what our new Deep Restoration 2 algorithm is meant to do.
Deep Restoration 2
Deep Restoration 2 is a significantly more sophisticated video restoration filter than its predecessor in terms of design, capabilities and how it has been trained. Some key differences are:
- It can enhance or upscale a video signal by up to 6x while reducing degradation from blurring and compression.
- The new model is also trained to reduce noise when present, which is a notable improvement over the previous Deep Restoration filter.
- For many applications, Deep Restoration 2 is the only generic restoration and upscaling AI filter people will need, which also makes it time- and cost-effective (previously, most people would have had to use both a denoiser and Deep Restoration).
- The 1920x1080 maximum resolution restriction of the previous Deep Restoration filter is gone: the maximum output resolution is now UHD 8K (the sketch below shows how the 6x factor and the 8K ceiling interact).
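As a purely illustrative, back-of-the-envelope sketch (this is plain arithmetic based on the two limits listed above, not Pixop's API), here is how the 6x upscaling factor and the UHD 8K output ceiling combine for a given source resolution:

```python
# Illustrative only: combine the 6x upscaling factor with the UHD 8K
# (7680x4320) output ceiling described above. Not a Pixop API call.
MAX_FACTOR = 6
MAX_W, MAX_H = 7680, 4320  # UHD 8K

def max_output(width, height):
    """Largest output size allowed by both the 6x factor and the 8K ceiling."""
    factor = min(MAX_FACTOR, MAX_W / width, MAX_H / height)
    return int(width * factor), int(height * factor)

print(max_output(640, 360))    # -> (3840, 2160): a 360p source maxes out at 4K UHD
print(max_output(1920, 1080))  # -> (7680, 4320): a 1080p source hits the 8K ceiling before the 6x limit
```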
Upscaling with Pixop
Tell us a little bit about yourself and how you got your start in filmmaking.
My name is Brit Tobin. I am the creator of the adult-animated series Garbage People and the founder of Black Market Media (BLK MKT Media) in Los Angeles, CA.
I’ve been writing and producing for ten years. I gained my hands-on production knowledge on the sets of live-action TV and short films, as well as working with major studios and top industry names in the business. After the creation of my first show, one of my actors introduced me to improv, where I fell in love with sketch comedy and directing actors.
Launching my own boutique studio had been a goal of mine for a long time, and 2020 created the perfect opportunity to give Black Market Media the green light. Our first production, a live-action puppet film series called “Perry the Bear Gets Scared”, was born during the height of quarantine and was also my first experience working remotely with talent full time.
During “Perry’s” festival run, we started crafting this idea of ‘Bad Americans’ which was a satirical social commentary piece on American life and culture. With a little help from social distancing and a lot of help from the current state of affairs, that concept took on a life of its own and became Garbage People.
While other factors were a big deal (writing, casting, building the team) and definitely a lot of fun, animation became a dedicated passion for me and editing became a full-time job. I realized that I had all the tools I needed right in front of me to put an adult animation together without really having to leave my home studio – and because of that, the antithetical grace and essence of the show have been truly able to shine like nothing else.
Garbage People is a stop-motion animation at its core, but it’s mixed medium in the sense that we employ photos and PNGs for a surreal effect – this is one crucial area where Pixop really helped facilitate and “smooth” the vision of this project.
Before and After comparison clips from Garbage People
How did you hear about Pixop?
I’d been doing some research on post-production for animation, which led me to a colleague from Fonco Studios who caught my attention with the idea of non-subscription-based uprez services; that’s where I learned about Pixop.
Why did you choose to use Pixop/What did you want to use Pixop for?
What’s great about Pixop is the turnaround time. What’s incredibly impressive about Pixop is the transformation and overall quality filter adjustments can bring to your video. The ease of using Pixop in addition to the side-by-side video comparison tool has been extremely helpful. I ultimately chose Pixop based on two factors: cost transparency and customer service.
Which filters did you use?
For Garbage People, we tried a few different filters using the built-in create-preview test feature, which almost instantly gave us what we were looking for in terms of increased resolution and clarity. For us, it came down to the Deep Restoration 2 filter or the Super Resolution options, and Deep Restoration 2 upscaling to 4K got the job done. For a trailer I put together (not animation), I found that 4K Super Resolution worked beautifully in generating a clearer, larger image without distortion.
Were you satisfied with the result?
Very.
What in particular did Pixop do well and what were you impressed with?
I was intrigued by all of the enhancement options for different types of video content and common video issues, but I was blown away by the quality of the end result as well as the efficient turnaround time. When you’re steadily approaching a looming deadline, especially with a cloud and/or internet-based project like ours, it’s a huge relief to know a service like Pixop exists.
What could Pixop have done better?
The only real recommendation I have is that more people should know about Pixop. I’m not sure it’s something Pixop could do better per se, but being able to choose the segment of your video for preview mode in-app (as opposed to manually exporting a 10-second portion) would be a game changer.
Would you recommend Pixop? To whom and why?
I’m a fan of Pixop. I’ve already started recommending this service as much as possible to industry colleagues and I’ll definitely be coming back very soon for more upscaling needs.