4K video- so much CPU

triforcefx (United States) | Moderator, Website User | Posts: 1,060

Admittedly, my computer is no powerhouse when it comes to editing, especially in 4K, but it should still be decently within HitFilm's 4K spec. FYI:

2016 Acer Aspire Laptop

Windows 10

Intel Core i5-6200U 

Nvidia GeForce 940MX w/ 2 GB graphics memory


Everything up to date.


Even when playing at 1/4 resolution, using proxies, and with no effects whatsoever in the timeline, the video is very choppy, and my CPU gets maxed out just trying to play it.

This also happens when simply trying to play the raw media in the trimmer, even when transcoded to an optimal format and with the trimmer set to 1/4 resolution.

When I try to play the same files in any other video player, however, they play back buttery smooth without breaking a sweat; the computer doesn't even notice.

I understand that HitFilm has much more overhead than a traditional media player, but it shouldn't be that much just for playing a video, or should it? I'm willing to consider that my computer simply isn't up to the task, but perhaps there is something I am overlooking? Maybe there's a setting I need to change? (I have tried with hardware decoding on and off; same difference.)


Or perhaps there is still much work to be done on the development side with decoding/rendering of video? Obviously that would be quite the undertaking, especially if they were to rebuild the renderer from scratch, but if it made 4K video easy to work with on even low-powered systems, I think it would be well worth it.

As a thought, VLC Player is open source, and it seems to handle 4K video perfectly. In theory, could incorporating its engine (or something similar) make HitFilm run faster?


  • NormanPCN | Website User | Posts: 3,948 | Enthusiast

    4K video requires a top-end computer. It is doing 4X more pixel processing (HD 1080 -> UHD). Vegas has always listed an 8-core PC as recommended for 4K work. Not a requirement, depending on specifics, but desired. I posted some HitFilm UHD performance results in a thread back in the HF 2017 days. (i7 4 GHz 4770k CPU, 4C/8T)
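    The 4X figure is simple arithmetic on pixel counts per frame (this ignores codec overhead, which scales differently):

    ```python
    # Back-of-the-envelope: UHD vs. HD 1080 pixels per frame.
    hd_pixels = 1920 * 1080    # 2,073,600
    uhd_pixels = 3840 * 2160   # 8,294,400
    print(uhd_pixels / hd_pixels)  # 4.0 -- exactly four times the pixels
    ```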

    Comparing a media player to a video editor is a bit apples to oranges. That said, HitFilm is at or near the bottom of the heap WRT timeline performance relative to some others I've played with. You might look at Resolve and see how that does. Everything there is pretty darn quick, excluding Fusion, which is kinda slow. (No idea about Fairlight.) Fusion is only now starting to transition to an efficient GPU-type workflow. Only time will tell if it picks up the pace.

    HitFilm's basic timeline is kinda slow, but hardware decoding can cure this mostly if not entirely. Testing on my old machine with HW decode (8-bit 4:2:0 AVC), the timeline did not run out of steam anywhere near as soon as software decode with Cineform 422 on an 8-bit timeline did. The HW decode wiped the floor with Cineform. This was a simultaneous multi-stream composite test, i.e. how many streams can play before they run out of steam and stutter.

    This suggests that whatever feeds the timeline may be the bottleneck in HitFilm, since the timeline kept up with HW decode but not with Cineform. The HW AVC decoder is basically FaF. So if your source media is suitable AVC and you have a CPU/GPU with proper HW AVC decode support, then maybe you can do UHD.
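    To put a rough number on why software decode struggles at UHD: whatever the decoder (hardware or software), it has to emit this much raw pixel data per second just for one stream. This is a sketch of the output data rate only, not HitFilm's actual pipeline:

    ```python
    # Raw decoded-output rate for a single UHD 8-bit 4:2:0 AVC stream.
    # 4:2:0 stores full-res luma (Y) plus quarter-res Cb and Cr,
    # so 1.5 bytes per pixel on average at 8 bits.
    width, height, fps = 3840, 2160, 30
    bytes_per_pixel = 1.5
    frame_bytes = width * height * bytes_per_pixel   # 12,441,600 bytes/frame
    mb_per_sec = frame_bytes * fps / (1024 ** 2)
    print(f"{frame_bytes:,.0f} bytes/frame, {mb_per_sec:.0f} MB/s decoded")
    # -> 12,441,600 bytes/frame, 356 MB/s decoded
    ```

    Multiply that by each simultaneous stream in a composite test and it's easy to see why offloading decode to dedicated hardware frees the CPU so dramatically.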

    HF 13 maybe does better with Cineform, but that needs more checking. I'm getting lots of crashes in 13 right now, including errors when trying to send the crash dump report.

  • triforcefx (United States) | Moderator, Website User | Posts: 1,060

    Fair enough. This is my first time really working with 4K, so I wasn't entirely sure what to expect, but I guess I was expecting a bit better performance. Ohhhh well... better downscale the media in this project and go back to shooting 1080p. At least I'll save plenty of space.

  • Triem23 | Moderator, Website User, Ambassador, Imerge Beta Tester, HitFilm Beta Tester | Posts: 18,293

    @NormanPCN I've never had a dev confirm this, but I've also never been told I'm wrong when I've posted this speculation publicly. Take this as only "probable."

    Hitfilm is OpenGL based. 

    Hitfilm video layers in Comp Shots instantly toggle 2D/3D.

    I believe all photo/video layers in Hitfilm are texture mapped polygons. 

    This would explain the difference in performance between trimmer and timeline: the trimmer streams frames like a standard video player, while the timeline has to texture-map the poly before render.

    This also explains why anti-aliasing is global in a render: because everything in HitFilm is a textured poly.

    Hypothesis fits available data. 

    Other thoughts? 

This discussion has been closed.