HAP - Encoding/Transcoding - Fastest way?

Hi,
Have a HAP/Hardware question

Trying to work out the fastest method for Transcoding/Encoding HAPQ files.

Working on a show that has multiple (x17) large-res show files with a run-time of over 1 hour. We're currently developing our workflow pipeline and planning to build a PC specifically for transcoding image sequences into our HAP showfiles.

Will the transcoding process of HAP scale with the number of cores in a machine?
or
do you start to lose any benefits from more cores above a specific core number?

Currently planning to use Premiere>MediaEncoder+AfterCodecs.
Would like to avoid using straight FFmpeg on the command line - I'm not as comfortable with it, and in a high-pressure delivery environment on site I'd want to avoid that - but open to it if it's the best option.

Looking at building a 64-core or 32-core Threadripper PC.
(link below) Reading through the Puget Systems reports on the 64-core Threadripper, Premiere sees diminishing returns at that high a core count. I'm wanting to confirm whether HAP encoding would experience a similar thing?

Thanks!

Hey, I realise this was posted a long time ago, but if you or anyone else has done this testing yet I’d be very interested to hear what you learned.

Currently setting up a few general-purpose transcode machines, and noticing that some workflows take significantly longer to encode than others, e.g. using Adobe Media Encoder takes 2-3 times as long to transcode an H264 mp4 to HAP as running ffmpeg on the command line. (Windows 11, i7 11850H, 16GB RAM, A3000)

Is this just how Adobe deals with the codec (and the fact that it's using a plugin rather than native support)?

Adobe doesn’t support HAP natively, so Media Encoder is not optimized for it in any way. Something like FFWorks uses FFmpeg and has a user-friendly GUI for those who don’t like using the command line: https://www.ffworks.net/

For PC, FFMPEG gives the most options, and there are various GUI applications for Mac & PC that use FFMPEG to get the job done.

@ProjectileObjects
Thanks for the response. That’s kind of what I was thinking wrt it not being a particularly well-optimised plugin. As I say, I generally do end up using ffmpeg on the command line for HAP trans/encoding.

I like the look of FFWorks, but alas it’s macOS only. I don’t suppose you know of another ffmpeg encoder that supports Windows and allows watch folders? I’ve found https://tdarr.io/, which looks pretty decent and which I’m going to try out.


EDIT:
If anyone else is ever looking for this type of thing, I’ll add some interesting things I’ve noticed. There are a few different plugins that allow Adobe ME to transcode HAP files: the disguise plugin, the Pandoras Box plugin (which is part of their Codecs download), and the Jokyo plugin (which requires a paid license; unlicensed it only exports the first 250 frames, so it’s difficult to judge performance).

Converting the same 1h16m 1080p50 H264 mp4 to HAP (normal, not Q and without alpha), the disguise plugin took a whopping 3h 58m. The Pandoras Box plugin took half that time at 1h 54m. Both encoded at a rate slower than 1 second per second. Running the Jokyo encoder at Normal seemed to be much faster, completing in just 10 minutes and appearing to process through the whole file, but as the output is limited to 250 frames it’s difficult to know if that’s accurate.

FFMPEG natively on the command line took only 25 minutes going at just over 3x speed, making it clearly the winner so far…
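
For anyone reading this later who wants to try the command-line route, the invocation isn't complicated - something along these lines (filenames are placeholders, and -format can be hap, hap_alpha or hap_q):

```
# transcode to standard HAP in a .mov container
ffmpeg -i input.mp4 -c:v hap -format hap output.mov

# or HAP Q instead (higher quality, larger files)
ffmpeg -i input.mp4 -c:v hap -format hap_q output.mov
```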

Hi Tom,

There is a way to leverage multithreaded CPU performance by using the “chunk” feature, but it’s only for decoding when using the Snappy compressor, not for exports yet.
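
(For reference, ffmpeg's own Hap encoder does expose chunking on the export side - a sketch with placeholder filenames; chunked frames let a chunk-aware player decode on several threads:)

```
ffmpeg -i input.mp4 -c:v hap -format hap -compressor snappy -chunks 8 output.mov
```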

Exporters might process all frames one by one, but AfterCodecs, for example, will generate (render) the next frame while the current frame is being encoded. A desktop application could theoretically encode a whole bunch of frames in parallel by decoding different parts of the video in parallel and then concatenating the encoded frames at the end. I don't know of such software, but it is theoretically doable.
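
A rough sketch of that idea with plain ffmpeg (just an illustration, not a tested pipeline: filenames and the segment length are placeholders, segment cuts land on keyframes, audio is ignored, and in practice you would cap the number of parallel processes):

```
SRC=input.mp4

# 1. split the source into ~60 second pieces without re-encoding
ffmpeg -i "$SRC" -map 0:v -c copy -f segment -segment_time 60 seg_%03d.mp4

# 2. encode every piece to HAP in parallel, one ffmpeg process per piece
for f in seg_*.mp4; do
  ffmpeg -i "$f" -c:v hap -format hap "${f%.mp4}.mov" &
done
wait

# 3. stitch the encoded pieces back together without re-encoding
for f in seg_*.mov; do echo "file '$f'"; done > list.txt
ffmpeg -f concat -safe 0 -i list.txt -c copy output_hap.mov
```
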
As for encoding a single frame in parallel, that's already done! So to answer your main question: yes, a bigger CPU will help, unless the bottleneck is the rendering from Premiere or decoding the source file for transcoding. So the answer to "do you start to lose any benefits from more cores above a specific core number?" is probably yes, but it's not due to the Hap encoding itself.

About your specific issue: you want to transcode your footage as fast as possible, correct? In Adobe Media Encoder there is an option to allow parallel encoding in the render queue, which could help.
Also, depending on your footage, you might accept a slight reduction in quality. In the AfterCodecs exporter for AEfx, Premiere Pro and AME, for the Hap / Hap Alpha codecs you can choose among three algorithms to encode the frames: the faster the encoding speed, the lower the quality can get (but not always, it depends on your source footage complexity!). The three algorithms are: Best Quality (Vidvox), Normal Quality (ffmpeg), and Low Quality (JMP Preview).
Otherwise, Hap Q and Hap Q Alpha should already be quite fast.

Another tip: use the WAVE audio codec; encoding AAC is slow.
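
In ffmpeg terms that would be something like this (placeholder filenames; pcm_s16le is plain uncompressed PCM/WAVE audio):

```
ffmpeg -i input.mp4 -c:v hap -format hap -c:a pcm_s16le output.mov
```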

Cheers !

Hello,

I will comment on the jokyohapencoder.com solution (as the designer of these plugins).

We chose to restrict our demo version to 250 frames because several beta testers asked us to do this
(to avoid wasted render time when a user forgets to install the licence).
Reading your answer, maybe it’s something we need to reconsider, to help users test performance.

On the performance side, we have worked a lot to make our encoder as fast as possible (these plugins started as an in-house solution for encoding HAP files, and we wanted our files as fast as possible).
It uses all the cores of the computer, and is optimized for Apple Silicon and for Intel/AMD special instructions
(AVX/AVX2 - in other words, most CPUs sold in the last 10 years).

For HAPQ/HAPQA, we have our own implementation (very different from the other available implementations), which has similar speed to the others but much better quality.
For HAP/HAPA, we chose to offer 3 quality/speed options depending on the available time to encode:

  • Fast (similar in quality to ffmpeg encoding)
  • Slow (similar in quality to the old QuickTime Hap encoding plugin)
  • Normal (an in-between solution in terms of quality and speed)

Regarding speed, the Normal and Slow modes depend a lot on the content of the picture:
noisy pictures take more time to encode than pictures containing lots of uniform areas.
The Fast mode depends very little on the content of the picture.
