Palacono's Bugs'n'Misunderstandings [#52 Hitfilm Framerate issues (Link only)]


Comments

  • Palacono
    Palacono Posts: 3,423 Enthusiast
    edited April 2016

    @Andy001z Eh? I don't know who that is supposed to refer to, but no, I don't work for anyone, other than myself. Which is why I've got time to mess about making videos for bugs that could probably be described in a single paragraph. It's therapy for finding another roadblock, and editing practice all-in-one. :)

  • Andy001z
    Andy001z Posts: 3,152 Ambassador

    @Palacono sorry, no offence meant; dry humor warning needed on my last post. I am sure the guys at Hitfilm are pleased with the hard work you are putting into finding these little crittas. You make some interesting and frankly well documented points. I imagine it is always a challenge to balance the books between fixing old issues and meeting the needs of pushing new boundaries in the software.

  • Palacono
    Palacono Posts: 3,423 Enthusiast
    edited April 2016

    @Andy001z none taken. ;) I seem to have a knack for breaking software.

    The weird thing is that sometimes when I go back and try it again to document it in a video, I can't do it. I unconsciously avoid doing whatever it was that made it go wrong in the first place, and either spend ages trying to find out what I did before, or leave it alone for a while and hope that on another day I'll have forgotten how to 'avoid' the problem and have it pop up again. :)

  • Triem23
    Triem23 Posts: 20,073 Ambassador

    I will say I suspect it's probably harder than one would think to patch certain issues with 3. The changes in keyframing, reflections/refractions between asset and model layers, physical shading and all in the 3D unified space imply major fundamental changes to the core rendering engine. Unfortunately FxHome is still limited in human resources.

  • NXVisualStudio
    NXVisualStudio Posts: 734 Ambassador
    edited April 2016

    @Triem23 It's amazing that they maintain such a strong, agile team with the amount of staff running the whole cycle. As software evolves, so do the lines of code that follow it; bugs can become strenuous to locate, while adding additional lines that 'should' work sometimes leaves issues with something that's related elsewhere, even when making patches/fixes, and that then leads to problems. It's fundamentally down to how the software was designed, really: making parts less dependent on each other (reduced coupling), and whether it was built around classes or around implementations and interfaces.
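    A toy sketch of the reduced-coupling point (invented names, nothing to do with FxHome's actual code): the rest of the program only talks to an interface, so the concrete implementation can be fixed or swapped without touching the callers.

    ```cpp
    #include <iostream>

    // The rest of the application only knows about this interface...
    class Renderer {
    public:
        virtual ~Renderer() = default;
        virtual void drawFrame(int frameNumber) = 0;
    };

    // ...so the concrete implementation can be fixed or swapped without
    // touching the callers, which is what keeps the coupling low.
    class OpenGLRenderer : public Renderer {
    public:
        void drawFrame(int frameNumber) override {
            std::cout << "GL draw of frame " << frameNumber << "\n";
        }
    };

    void playTimeline(Renderer& r, int frames) {
        for (int f = 0; f < frames; ++f) r.drawFrame(f); // doesn't care which renderer this is
    }

    int main() {
        OpenGLRenderer gl;
        playTimeline(gl, 3);
    }
    ```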

    @Palacono It's good that you are finding these bugs and reporting them; 'it's better to know' rather than have the problem go unnoticed for much longer before someone finally reports it. I don't think the company is intentionally cannibalising sales, it's just heavily resource-driven to find bugs. For example, the clipping/z-buffer issue lasted for a very long time, and it had to be seen to because it was becoming 'signature' to Hitfilm ('Buy our software with completely free 3D Model Rave Kit', jokes) and wouldn't go down too well for product advertisement. Eventually a strong and well structured fix solved that.

    If the problems that you find are becoming troublesome for a large number of people then they can assign more staff to troubleshoot the issue; if it's a small bug then it's likely to be further down the 'To do' list.

    By all means keep up the good work :)  

  • Palacono
    Palacono Posts: 3,423 Enthusiast
    edited April 2016

    @NXVisualStudio Oh, I'll keep on doing it as long as it keeps me amused and/or I need to use the feature/function/effect more than once and there is no workaround. :)

    What I do find interesting is the number of things I find in HF3P and HF3E that are already fixed in HF4P, so I'm actually a long way behind the curve. But as long as things I report occasionally appear in the release notes for updates, I can kid myself only I found it and it was important enough to fix. :D

  • SimonKJones
    SimonKJones Posts: 4,448 Enthusiast
    edited April 2016

    The main thing I want to note is that there's no such thing as simply 'copying the  code over' from HitFilm 4 Pro to earlier versions. Despite the surface appearance of every version 'being HitFilm', it's better to think of them as completely separate products rather than slight variants on an identical core.

    The moment we branch off to start intensive work on a major new version, from that date onwards it's a separate version of the code which will become progressively distinct from the previous version. With a product as complex as HitFilm this can snowball drastically quite quickly. What this means is that bug fixes and optimisations which go into version Y won't 'just work' in version X, as arriving at that bug fix could be a part of a much larger under-the-hood change affecting several systems and introducing major new features.

    For example, there is no specific bug logged in our system relating to transitions and proxies. General improvements throughout the software and in how proxies are handled resulted in positive changes, but we can't simply copy a line of code and stick it in HitFilm 3.

    This is generally how all software development works: once you commit to a new version, earlier versions won't receive the same frequency of updates. If we ported everything back into HitFilm 1, 2 and 3 it would create a massive resource drain that would actually prevent us (or any developer) from advancing the software in the first place. The alternative would be to keep the code base as static as possible, so that HitFilm today was still like HitFilm 1 - but that wouldn't be a good approach either, for obvious reasons.

    Other factors can complicate things: for example, we handle colour profiles more accurately now. That change was made for HitFilm 3, which probably relates to the performance difference between 2 and 3 with some videos. (There's definitely still work to do in general with regard to overall performance.)

    In all our proxy tests we've not detected any proxy performance degradation except in one specific case, related to working with large images, in 16-bit and with a non-default AA method enabled.

    As for your layering up of grade layers: I'd personally do the sharpening in an embedded comp version of the source image. Do you still get odd behaviour if you do that? You are correct that grade layers don't flatten themselves - that's by design. There does seem to be something going wrong once you combine the two grade layers, however, and the devs are looking into that.

    I have used multiple grade layer tricks in the past numerous times without any problems, so I suspect this might be something specific to the projector effect. We shall find out!

    The flickery codec issue is odd - I believe Ady's going to be in touch about that one.

  • Palacono
    Palacono Posts: 3,423 Enthusiast

    @SimonKJones, while I appreciate your points: I worked in software development for 20 years, and used to write code that would work on 5 platforms at once. Libraries on each platform handled the low-level nuts'n'bolts; the high-level code couldn't care less what it was running on. So if your team have written functions that cannot be cut'n'pasted and that take drastically different input and output parameters from one version of Hitfilm to another, then they've made life needlessly difficult for themselves. Given that each effect is essentially a .DLL, I doubt that's happened.
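    A minimal sketch of what I mean (made-up names, obviously nothing from Hitfilm itself): only the block behind the #if changes per platform; the high-level call is identical everywhere.

    ```cpp
    #include <cstdio>

    // Hypothetical platform layer: only this block differs per target.
    #if defined(_WIN32)
    static void platformLog(const char* msg) { std::printf("[win] %s\n", msg); }
    #elif defined(__APPLE__)
    static void platformLog(const char* msg) { std::printf("[mac] %s\n", msg); }
    #else
    static void platformLog(const char* msg) { std::printf("[posix] %s\n", msg); }
    #endif

    // The high-level code is identical on every platform and couldn't care
    // less which branch above was compiled in.
    static void applyEffect(const char* name) { platformLog(name); }

    int main() { applyEffect("sharpen"); }
    ```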

    While it would indeed be silly (financial suicide) to expect major feature improvements to make it back into earlier versions, and you do have to 'move on' at some point, things like the proxy + transitions bug do mean that only the simplest of video edits can be made. Considering the whole raison d'être of Hitfilm is to make at least slightly interesting composites, and given the push towards using proxy files (and other editor-friendly formats) to deal with performance issues (however caused, low-spec'd hardware included), I find it hard to understand how the bug is still there. It must have been spotted a long time ago by someone, because it's fixed in v1.0 of HF4P. One thing it does seem to imply: not many people can be using the Editor for anything other than the simplest of videos, or it would have been rolled out in one of the earlier HF3P updates.

    By 'proxy performance degradation', do you mean it taking longer to create them on HF4P compared to HF3P or HF3E, as in the first video in this thread? @Ady has the project with the two images I used in that video, so that can be tested with all versions of Hitfilm. I tried it on an HDD and an SSD with only slight proportional improvements.

  • Palacono
    Palacono Posts: 3,423 Enthusiast
    edited April 2016

    New Bug/feature: #13 Text Layer Limitations.
    -------------------------------------

    I've commented on text before and while it's nice to have Boris, there are times when you just want something simple, and until Boris works with Hitfilm's lighting, it's not as versatile as the built-in text could be.

    But the quality isn't great. The way it works means that while slapping up a giant, static Arial Black title page is OK, fine, small, or slowly moving text (or all three) shimmies across the screen in a very distracting way.

    Using larger text and scaling it down (as you could with an image to smooth it out) doesn't work because the way it is rendered only cares about the actual pixels on screen, so the enlarge/reduce gets cancelled out with no quality improvement.

    I usually avoid it, and import a plane with text on it and use that instead, but then when I want to apply extrusion, I have to have an alpha channel or use keying, and it's all a lot of faff, so I still occasionally try and use it because it's quicker.

    But, this time I tried using it with Quad Warp, because I wanted to put some text on a surface that had tracked points on it and Quad Warp was needed to keep it in place. But, it didn't work properly at all.

    I noticed that the order of Transform and Effects is reversed on a Text Layer (another order of operations to remember), which is probably partly responsible. But the problem of the clipping when the text 'source' goes off screen means that a Text layer isn't anything like any other layer at all, and is quite limiting. The last part of the video shows what happens when trying to use a smaller, finer, moving font. It's particularly visible if text is parented to a tracker on something that moves slowly.

    Now, I could probably render the text to a giant surface in another comp and embed it or some other convoluted workaround, but shouldn't it just work?

    @SimonKJones, @Ady, If it won't break anything, could Text Layers be made to work just like a standard Plane Layer? :)

  • NormanPCN
    NormanPCN Posts: 4,082 Enthusiast

    Video is a low resolution environment. Given that, it is probably best to avoid using Extra Light or Thin weight fonts. There are just not enough pixels to properly represent the font glyphs in that environment. Above a certain threshold of generated font size the issues can disappear. Small and super thin is a problem.

    In the Hitfilm text layer the glyphs are not antialiased in editor preview playback unless the antialias option is chosen. Boris does its own thing. The lack of pixels for the thinness of the font, due to the low resolution, is the reason for the gaps you show. When antialias is enabled the gaps disappear, but most of the pixels filling the gaps are alias edge pixels at various levels of transparency. It does not surprise me that you can get shimmer with certain combinations of movement direction and/or speed when a certain threshold percentage of the glyph pixels are alias edge pixels. Alias edge pixels can be problematic in general in a low res environment with certain directional/angle movements.
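    As a toy model of why those alias edge pixels shimmer (illustrative numbers only, not anything from Hitfilm): take a 1.2 px wide vertical stroke rasterised with coverage-based anti-aliasing, and slide it by fractions of a pixel. The per-column alphas change every time, so the rendered pattern is different frame to frame.

    ```cpp
    #include <algorithm>
    #include <cstdio>

    // Coverage of one pixel column [px0, px1) by a stroke spanning [strokeL, strokeR).
    double coverage(double px0, double px1, double strokeL, double strokeR) {
        return std::max(0.0, std::min(px1, strokeR) - std::max(px0, strokeL));
    }

    int main() {
        const double width = 1.2;
        for (double offset : {0.0, 0.25, 0.5}) {             // sub-pixel positions
            std::printf("offset %.2f: ", offset);
            for (int x = 0; x < 4; ++x) {                    // four pixel columns
                double a = coverage(x, x + 1, 2.0 + offset, 2.0 + offset + width);
                std::printf("%4.2f ", a);                    // alpha of this column
            }
            std::printf("\n");
        }
    }
    ```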

  • SimonKJones
    SimonKJones Posts: 4,448 Enthusiast

    The reason text has Transform and Effects reversed compared to other layers is that this ensures text is rendered and anti-aliased correctly when projecting from oblique 3D angles to the 2D on-screen image. It means that whatever you do with the text layer in 2D/3D it'll maintain full resolution.

    If it rendered to a 2D image-type layer and THEN transformed, it'd appear stretched and blurred (like when you import an image of text and transform/stretch it).

    If you want it to behave more like other layers, you can stick it inside an embedded comp.

  • Andy001z
    Andy001z Posts: 3,152 Ambassador

    Hey, this anti-alias option might be what was affecting my text... Is it under the font options? I'll check.

  • Palacono
    Palacono Posts: 3,423 Enthusiast

    @NormanPCN All of that being true - other than when I'm editing in 4K, when it isn't and the problems are still visible - I find that any font that has any thin sections, points or angles on it (basically any font other than Arial Black) has issues with shimmer as it moves, even without scaling, if it is fairly small. Big, bold, static text generally hides the edge pixels moving about too much.

    This seems to be because of the way it is rendered. If it was rendered 'off screen' and the pixel positions (including which ones would be antialiased) set in stone, then rendering that anywhere on the screen would not be an issue, as the whole 'pattern' would slide smoothly about. This works if you print text onto a plane and use that plane to slide around the screen, as I do in the video below.

    But, because the renderer uses the position on the screen to determine which text pixels will be rendered, it can, and does, choose different ones to plot, which causes the shimmer. 

    It also does this with textures that move around the screen, but unless they have hard edges, it's harder to spot it. But if you make a texture with a hard, diagonal line on it and move it about, you'll see that shimmer a bit too.

    I used Export Frame in this video and discovered it applies Antialiasing even when the Viewer mode is set to Full, then it applies it a bit more if the viewer is set to Antialiased, so the two Exported frames are different, and neither is like that seen in the viewer. Not helpful if you're using that method to stretch a single frame over a longer time or something similar.

    Then the export of the video applies yet another variation on Antialiasing, because the video looks different again.

    I know you can apply different levels of AA at different times and there are 8 and 16 bit modes, but honestly, I have no idea what anything looks like until I render it out and look at it. The RAM Preview doesn't apply Antialiasing at all and the Proxy is too slow to use regularly, so I often just cross my fingers.

    Dear FXHome, could we please have some breakdown of what renderer does what, where, when and why? :)

  • Palacono
    Palacono Posts: 3,423 Enthusiast
    edited April 2016

    @SimonKJones, OK, that may have been the intention, but the rest of the time it looks ...not great. See second video and post above this.  I'd rather have it look better for the 99% of the time it's not projected at oblique angles onto something, because it currently looks equally bad at any angle. Plus...wait, what? Ordinary textures don't suffer from that problem at all, so that's apparently a solution to a non-existent problem.

    The difference in scaling and the two layers going out of sync when Quad Warp is applied is still a bit strange and I'll experiment with putting it in a separate comp, but all this faffing about seems a lot of work.

    Edit: I did that, and after changing the size of the comp back to the 500x500 the original text was (instead of the 1920x1080 it decided to default to), it worked. Why is it that creating a comp makes so many wrong guesses about what size to make things? Creating a comp of the 500x500 Plane got the size correct first time. Consistency seems to be missing.

    You can see in the second video that Exporting a frame doesn't work as expected. But it does have the side-benefit of improving the way text looks, even if it messes up anything else.

    So, why doesn't it do that all the time, including when projected at oblique angles etc. etc.? Because...it must be capable of it or I couldn't export a frame of it doing so with the improved Antialiasing applied. ;)

  • NormanPCN
    NormanPCN Posts: 4,082 Enthusiast
    edited April 2016

    @Palacono I agree that the Hitfilm text edges are more prone to shimmer than other software/effects. I can use NewBlue Titler Pro in Hitfilm with the same font and movement, and the Hitfilm text can shimmer on movement. I would bet Boris is much like Titler Pro, but I have not specifically tested.

    The shimmer is a problem, but for me the native text is so weak in overall features that I just want to avoid it and use Boris in HF4. We can't even put simple text on the NLE timeline without creating a comp. With Boris we can, since it is an effect we can apply to a simple plane, which can be used on the NLE timeline. I think Boris in HF4 is the right-now solution FxHome offers for any weaknesses of text in HF. Boris does not have the ability to use environment maps, so that is a problem.

    Since Hitfilm is not using the operating system font functions, I wonder how much they (FxHome) can affect the outcome of the native text quality. They are an OpenGL app and OpenGL does not support text, so they are using a third party library to handle getting the fonts into OpenGL. That third party library loads and likely rasterizes the fonts for Hitfilm. That library may do the anti-aliasing during rasterization of the glyphs.

  • Palacono
    Palacono Posts: 3,423 Enthusiast
    edited April 2016

    @NormanPCN Yes, but one benefit of me doing these videos is I'd almost forgotten that I can Export a frame with text on and import the .PNG to use fairly quickly in its place (and it has Antialiasing applied on Export too) which is a bit quicker than me using another program to prepare the text, so that's a time and effort saving. 

    It also has the benefit of being a plane, so no messing about with reversed order of operations and I can have it large and scale it down and have it still look nice, so I almost get what I wanted in the first place: better text.

    It is a bit like having to use your left foot to scratch your right ear, but better than nothing. :D

    Now if only the final rendered video looked closer to the views you get in the viewer... for everything, not just text. How many different render engines are in this thing? I already showed the trimmer window is different to the viewer window.

  • Aladdin4d
    Aladdin4d Posts: 2,481 Enthusiast
    edited April 2016

    #13 Text Layer Limitations

    Let's start with what a modern font is and isn't because it's important for understanding what you are seeing. A modern font is not a bitmap. I'm going to say that again just because it's really really important. A modern font is not a bitmap. A modern font is a description of vector lines and Bézier curves that produce a font outline without any pixel information. It's the font engine or virtual machine that rasterizes the vector information into a bitmap approximation of the font. The underlying technologies were developed specifically for desktop publishing and typesetting with the final rendered output being printed material, not your display. 

    No outline font, be it Type 1, Type 3, TrueType, OpenType, AAT etc., can be accurately rendered to a raster display, and the narrower the outline and the smaller you go, the less accurate a rasterization becomes. The shape of what's seen on screen can and will change depending on size, position and other transformations (shimmer when moving). This is true across all operating systems, applications and outline fonts (although some handle it better than others). Hinting was introduced to help make outline fonts legible on a display, and most fonts would be illegible at some point without it. Hinting is mathematical info used to deform an outline at various point sizes to force it to fit a grid, which is then interpolated into primary pixels by the font engine. The font engine can add anti-aliasing or even subpixel rendering (Microsoft ClearType is a subpixel rendering engine) to make things look better, but neither is a requirement. Some fonts like Verdana and Arial Black contain A LOT more hinting information than others and tend to look better on screen, and some fonts don't really have any, relying instead on automatic hinting, which isn't all that great. The highest quality fonts that look really good onscreen are always manually hinted.
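    A toy illustration of the grid-fitting idea (nothing from any real font engine, just the concept): a glyph stem is a pair of outline coordinates in fractional pixels, and hinting snaps them to the pixel grid so the stem stays a crisp, whole number of pixels wide at small sizes.

    ```cpp
    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    struct Stem { double left, right; };

    // Snap a stem to the pixel grid, never letting it get thinner than 1 px.
    Stem hint(Stem s) {
        double width = std::max(1.0, std::round(s.right - s.left));
        double left  = std::round(s.left);
        return { left, left + width };
    }

    int main() {
        // The same stem (30%..44% of the em) scaled to a few pixels-per-em sizes.
        for (double ppem : {9.0, 11.0, 14.0}) {
            Stem raw{ 0.30 * ppem, 0.44 * ppem };    // unhinted, fractional coordinates
            Stem fit = hint(raw);
            std::printf("%2.0f ppem: raw %.2f..%.2f  hinted %.0f..%.0f\n",
                        ppem, raw.left, raw.right, fit.left, fit.right);
        }
    }
    ```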

    When you add a text layer all you're seeing at that point is an approximation which is being displayed in a window that's already a very low resolution approximation of what's on your timeline. Double whammy. This will never match the rendered output and depending on what you're doing and what font you're using rendering out may produce even more artifacts and distortions.

    So.....(somewhat out of order)

    But, because the renderer uses the position on the screen to determine which text pixels will be rendered, it can, and does, choose different ones to plot, which causes the shimmer. 

     

    It has to do that because that's the way outline font engines and raster displays interact. (horribly) Remember there isn't any real bitmap of the font and it's constantly being recalculated. The only way around it is a fully rendered bitmap.

    This seem to be because of the way it is rendered. If it was rendered 'off screen' and the pixel positions (including which ones would be antialiased) set in stone, then rendering that anywhere on the screen would not be an issue, as the whole 'pattern' would slide smoothly about. This works if you print text on a plane and use that as a plane to slide around the screen, as I do in the video below.

    This works because you are no longer working with an outline font. Instead you're working with a fully rendered static bitmap. Side note - you can't set anti-aliasing in stone beforehand. Anti-aliasing needs to know what the neighboring pixels are before it can be performed. Any move means different neighboring pixels, which means changed anti-aliasing.

    I used Export Frame in this video and discovered it applies Antialiasing even when the Viewer mode is set to Full, then it applies it a bit more if the viewer is set to Antialiased, so the two Exported frames are different, and neither is like that seen in the viewer. Not helpful if you're using that method to stretch a single frame over a longer time or something similar.

    Not unexpected at all. When the viewer is set to full the anti-aliasing you're seeing in the export is being provided for the text by the outline font engine. When you set the viewer to anti-aliased you're getting a second round of anti-aliasing on the text. Neither will match the preview because well, they just can't. The export is at a completely different scale ergo the bitmap approximation of the outline font is also completely different.

    This method can be used to stretch a single frame over time without any problem. If you export a frame with the viewer set to full and stretch it out over a period of time then it will match any other output render pretty well even if you turn on anti-aliasing later. If you export with anti-aliasing already on then no it won't match exactly but the difference is probably only going to be really noticeable for one frame when you go from still to motion. 

    Then the export of the video applies yet another variation on Antialiasing, because the video looks different again.

    Again, not unexpected, especially for anything other than an uncompressed or lossless image sequence render. Any lossy compression scheme implements its own smoothing and such on top of everything else. The amount done depends on the codec and the amount of loss. Side note - an exported JPEG will look different than a PNG because JPEG is lossy and implements smoothing.

    So, why doesn't it do that all the time, including when projected at oblique angles etc. etc.? Because...it must be capable of it or I couldn't export a frame of it doing so with the improved Antialiasing applied. ;)

    Should be covered by now but there's no way to have an outline font match in both the viewer and a full resolution export. 

  • Palacono
    Palacono Posts: 3,423 Enthusiast
    edited April 2016

    @Aladdin4D nope, disagree. Have another look at the second video.

    Of course you can render a letter at a specific scale and choose which pixels are to be plotted on a stationary surface and not need to change them. You then take that same pattern of pixels and place them on any other surface and there is no need for anything to change in the outline. You effectively choose the 'seed' for the pixel/sub-pixel shape for the first rendered position and don't change it.

    The shimmering is because it moves by less than a pixel and so the 'redrawing' of the outline chooses different pixels, because what was on a whole pixel boundary one time, is now on a half, or quarter pixel boundary the next time, so you'll need a variable-transparent pixel to represent it, but my point is it's doing it really, really badly, even with Antialiasing turned on, which does not look the same on exported video as it looks in the viewer (bold because it's important).

    It's apparently just throwing away pixels that are not on a whole pixel boundary and plotting the full ones, or it's rounding up the 50% ones to full and rounding down the 49% ones to no pixel. Then it applies the antialiasing to what is already a jagged mess. No wonder it looks so bad: it's lost most of the useful information from the original vector, then attempts to guess what was there from the pixels on the screen only! That's too late!

    Well, fie to that, say I.

    You can do exactly what happens when you use a .PNG of the letters instead of the letters themselves, if you can be bothered. It might not be as "accurate" an outline, but the results look better, and that .PNG still manages to plot to sub-pixel boundaries with only the smallest of shimmers (caused by the same sub-pixel calculations of what is now a flat texture), because the antialiasing is done better.

    You misunderstood the 2x antialiasing thing. Export 'full' and 'anti' .PNGs, reimport them, and view them in full and anti modes. Even in full mode, the 'full' one looks like anti and the 'anti' one looks like anti+. In anti mode, the 'full' one looks like anti+ and the 'anti' one looks like anti++. There is nothing, in any mode, that looks exactly like the full .PNG looks in the viewer in full mode.

    So any other video frame has the same extra antialiasing applied to it, even if exported in only full mode (ie, jagged looking). Try it with a small plane at a 30 degree angle (so nice jagged edges) and reimport it. Nothing like anything on the timeline in any mode.

    The Exported Video applies less antialiasing. Nothing to do with lossy compression, it's perfectly clear and clean with no artifacts, it's just worse, so the exact opposite of what you are suggesting.

    Exported frame .png is smoothed (too much antialiasing applied); video frame (like a load of lossy .jpgs) is sharper, with less antialiasing.

    As with my previous bugs: Watch the video again,  then check it yourself. ;)

    And a 100% scaled up viewer window on my second monitor (well, 99.5% because of the small menus around the edges which can't be removed) should look pretty darn close to the exported video on the same monitor. It doesn't.

  • Palacono
    Palacono Posts: 3,423 Enthusiast
    edited April 2016

    You know what? I talk a lot of nonsense, but I've discovered something. @SimonKJones's response was not only brief and professional, it was right.

    "If you want it to behave more like other layers, you can stick it inside an embedded comp."

    Yes, that sentence - like all the documentation for Hitfilm - was spartan, but accurate. Simply comping a text layer 'freezes' it as I want it to and the shimmer is gone if I treat that embedded comp like any other plane.

    Shimmer free text without having to export it, then reimport it, or any other messing about. Why you'd want it to work any other way when you pretty much have to do this to get a good result is beyond me, but at least it works. Done with complaining about that, but some of the other Export antialiasing-whether-you-want-it-or-not stuff is still a bit weird. :)

  • Aladdin4d
    Aladdin4d Posts: 2,481 Enthusiast

    Well you can disagree all you want but it won't change anything and I stand by every word 100% ;)

    Embedding in a comp works the way you want because the text is fully rendered to a bitmap first, which I also mentioned was the way around the problem.

  • Palacono
    Palacono Posts: 3,423 Enthusiast

    To be fair, I should post as much about the solution as I did the (perceived) problem. I still think it's more faff to just get some decent text on the screen, but the results do work and it also gets around the weird 'out of sync' thing with Quad Warp.

    I wish someone had pointed this out over a year ago as a solution to the noisy text when I first complained about it. But, hey, if they had, we might not have got Boris added to Hitfilm! :D

    Video is of the massive improvement when text is in a composite, so no real reason to ever use it in the 'normal' manner again.

     

  • Palacono
    Palacono Posts: 3,423 Enthusiast
    edited April 2016

    Suggestion: #14 Rate Stretch Indicator.
    -------------------------------------

    It's OK, I haven't hit my head (actually I did when standing up under a cupboard door a few days ago. It was the sort of whack that you can taste!) but anyway, this is a suggestion to make different Rate Stretched segments of video stand out in the interface, as my next video will show you: it's easy to make a selection mistake and not be aware of it.

    My suggestion is that when slowed down they have a lighter look (imagine a painted strip of rubber and the paint cracking as you  stretch it out and the underlying rubber is lighter) and darker when they're compressed/faster (again, imagine the paint crumpling up on the slackened rubber strip)

    Anyway, I thought it would still fit into the whole minimal aesthetic of the interface.  :)

  • NormanPCN
    NormanPCN Posts: 4,082 Enthusiast

    Hitfilm could do something like Vegas does.  They put a sawtooth line on an event. The teeth spacing varies by stretch amount.

  • Palacono
    Palacono Posts: 3,423 Enthusiast
    edited April 2016

    Yes, I use SMS which is the same interface, but it only tells me it's changed, not in which direction (I'm not going to count the number of teeth per mm to work it out) ;)

    I thought about lines spreading out, which would work since Sony limits you to 4x changes, but Hitfilm can have any value, so the lines could be so spaced out they'd not be noticeable; so I went with just colours. But the more suggestions the merrier. :D

    I'd also like it to show the speed multiplier as you're dragging it out, but...baby steps. ;)

  • NormanPCN
    NormanPCN Posts: 4,082 Enthusiast

     Point taken. How about this. Look again at the Vegas event and notice the option I have enabled which shows the event in/out points in time. The rate stretch could have a small text bug like that, centered in the event, which would list the rate factor.

  • Palacono
    Palacono Posts: 3,423 Enthusiast
    edited April 2016

    Yes, but I think/suspect the designer likes things nice and clean, without numbers visible.

    Which is why the indicator for clip length in a composite lost the handles it had in Hitfilm 2 and is now just a lighter grey against a darker grey (or vice versa?). If you didn't know it was there you'd never find it...

  • Palacono
    Palacono Posts: 3,423 Enthusiast
    edited May 2016

    New Bug/feature: #15 Viewer Playback Problems
    -------------------------------------

    The playback has always been a bit arbitrary about where the end of a clip is, and if you put something on repeat loop playback that goes to the end of a clip you may, or may not, have a blank (or black) frame shown before it loops again.

    In HF3P, I mostly don't worry about that last frame too much. Although I guess I should also test what happens at render time... nope, that's one for the devs once they start digging into that code. ;)

    Anyway, it seems in Hitfilm 4 (video is of Express, but Pro is the same) it's a little more broken than that.

    As you'll see in the video, it not only gets confused about where the end of the clip is, it will actually go backwards if you press Play again, and adding the Reverse effect seems to compound the not-sure-where-the-end-is problem.

    The same project in HF3P works just fine, although it also stops at the last frame (*). It also plays in reverse a lot more smoothly in HF3 than in HF4, although it is still slightly jerkier than playing forwards, which is weird for such a simple task.

    You'd perhaps expect video might be affected by Reverse, as playing .MP4 forwards is slow enough. So backwards? Fuggedaboudit! You'd convert to DNxHD etc. to get around that, but this is 4 small text layers, so WTH?

    Why is HF4 slower than HF3? Again! (Proxy creation is a lot slower as well)

    (*) So, apart from the bugs, should you be able to move past that last frame or not? It seems a bit weird, but not something I've worried much about before. But as I said above: in a playback loop (with RAM preview), it might or might not go past the end.

    BTW, It also regularly ignores the Out point on a loop playback, even if you place the playback pointer well within the range before pressing Play, so that whole coding area is up for grabs. ;)

    Update: Added a bit to the video that shows RAM preview has one of those "Start at frame 0 or 1?" issues too.

    Fun fact: the number one (or zero ;) ) problem in coding is programmers getting 0 and 1 mixed up. E.g. create an array to hold 10 items and they are referenced as Array[0] to Array[9], so the first item is in the zeroth array slot. Catches everyone out at some point. Fun'n'games begin when they try to reference Array[10] for the 10th item instead of Array[9]. Crash, Exception, Overflow, Panic! etc. :(
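    In C++ terms, a trivial sketch of the trap:

    ```cpp
    #include <array>
    #include <cstdio>

    int main() {
        std::array<int, 10> frames{};    // valid indices are 0..9
        frames[0] = 100;                 // the first item lives in the "zeroth" slot
        frames[9] = 190;                 // the tenth (and last) item
        // frames[10] = 200;             // off-by-one: out of bounds, undefined behaviour
        std::printf("first %d, last %d\n", frames.front(), frames.back());
    }
    ```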

  • AxelWilkinson
    AxelWilkinson Posts: 5,252 Staff
    edited May 2016

    It's intended and by design that the playhead stops on the last frame. Earlier versions of HitFilm weren't like this, but it was corrected in HitFilm 4. You shouldn't be able to go past the last frame of the timeline. If you want to go past the last frame of your layer, then you can extend the length of the timeline to add a frame or more after it. The reason that sometimes the playhead stops a frame or two short of the last frame is that your system isn't playing the timeline in real-time; it's dropping frames, and once it reaches the total runtime of the timeline, it stops on the last frame it was able to calculate and display.

    Every second starts with a :00 timecode, so a 4:00 runtime will end on 3:29 for your framerate. Take any second besides the first and this becomes clearer, for example: the 30 frames included in your final second are 3:00 through 3:29. 
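    The same arithmetic in code form, assuming the 30 fps example above (function names are just for illustration):

    ```cpp
    #include <cstdio>

    // Frame index -> seconds:frames timecode. A 4-second timeline at 30 fps holds
    // frames 0..119, so its last frame is index 119, which prints as 3:29.
    void printTimecode(int frame, int fps) {
        std::printf("%d:%02d\n", frame / fps, frame % fps);
    }

    int main() {
        const int fps = 30, durationSeconds = 4;
        int lastFrame = durationSeconds * fps - 1;   // 119
        printTimecode(0, fps);                       // 0:00, the first frame
        printTimecode(lastFrame, fps);               // 3:29, the last frame of a 4:00 runtime
    }
    ```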

  • Palacono
    Palacono Posts: 3,423 Enthusiast

    @AxelWilkinson OK, I wasn't that bothered about the last frame being passed or not, but Hitfilm 3 does the rest better (like so many things to do with performance <sigh>) and it sure doesn't go backwards when you press the Play button when it's 3 frames from the end.

    Still a bug in there if you look at it properly. ;)

    And seriously? It can't play 4 text layers in reverse without skipping on playback? My PC isn't that underspecced. i7 950 and a GTX 580.

    Maybe the performance enhancements promised for Hitfilm 5 will bring us back up to the speed of Hitfilm 3? Still a way to go to get as fast as Hitfilm 2 though.

    Is there a big list of if-then-else statements per pixel that just gets longer with every new feature added, and is that what's responsible for the slowdown with each version? If it compiled a JIT-style list of the function calls actually used per composite shot, the speed should remain constant no matter how many Effects are available, because you're not checking and discarding them; you only worry about the ones actually being used. You could also probably merge some of the simpler ones together: if one Effect is lightening a pixel and one is darkening it, then don't do both, do the result after you calculate it once.
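    As a very rough sketch of the 'only run what's used' idea (pure speculation about how a renderer could be organised, definitely not a claim about how Hitfilm actually works): build the list of active per-pixel operations once per composite shot, then just run that list.

    ```cpp
    #include <algorithm>
    #include <cstdint>
    #include <functional>
    #include <vector>

    using PixelOp = std::function<uint8_t(uint8_t)>;

    int main() {
        // Built once per composite shot, from the effects the user actually added:
        std::vector<PixelOp> activeOps = {
            [](uint8_t v) { return static_cast<uint8_t>(v * 9 / 10); },            // darken
            [](uint8_t v) { return static_cast<uint8_t>(std::min(255, v + 30)); }  // lighten
        };

        std::vector<uint8_t> frame(1920 * 1080, 128);
        for (auto& px : frame)
            for (const auto& op : activeOps) px = op(px);  // no per-pixel "is effect X enabled?" checks
    }
    ```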
     
    Get some video game coders on the team, they'll show you some speed optimisations, even if they only work in Preview mode through not being completely accurate. Like the ones that are used in RAM preview, which has no Antialiasing but is now slower to create than just watching the video play on the timeline. What's going on there?

  • NormanPCN
    NormanPCN Posts: 4,082 Enthusiast
    edited May 2016

    About the backwards playback performance: text is generated and there is no media/data file with frames. My guess is that the text position/scale/movements are calculated from the start for each frame of reverse movement. So it computes frames 0..100 to display frame 100, then 0..99 for frame 99, and so on; you have to compute the earlier frames before you know what frame 100 needs to look like. Any buffering/caching of info is likely designed to make computing frame 101 faster after having computed frame 100.
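    That guess, expressed as simple arithmetic (illustrative numbers only):

    ```cpp
    #include <cstdio>

    // If frame N of a generated layer needs frames 0..N-1 evaluated first, playing
    // 100 frames in reverse without caching costs 100 + 99 + ... + 1 evaluations
    // instead of 100 for forward playback.
    int main() {
        const int frames = 100;
        long reverseCost = 0;
        for (int f = frames; f >= 1; --f) reverseCost += f;  // re-evaluate up to frame f each time
        std::printf("forward: %d evaluations, naive reverse: %ld\n", frames, reverseCost);
    }
    ```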

    Although I have doubts that the Hitfilm folks look to do much buffer/caching of information. For example, consider auto levels and contrast effects. They compute off the current frame but that can cause undesirable visual changes. So you can fix them to compute the auto adjust from a single frame. Hitfilm does not buffer/cache the necessary information from that specific frame for the effect. They re-decode that frame from disk, to re-compute the adjustment, for every frame the effect operates on, forever. This is lazy. You don't need to buffer the frame, just the result setting of the auto computation. This lazy technique works better with Intra codecs since only one frame needs to be decoded. This of course is primarily speculation, but I see the performance get flushed down the toilet when selecting a single frame versus on the fly.
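    And a sketch of the alternative: cache the result of the auto analysis instead of re-decoding the reference frame every time (again, speculation about what a renderer could do, not a description of Hitfilm's internals).

    ```cpp
    #include <algorithm>
    #include <cstdint>
    #include <vector>

    struct Levels { uint8_t lo, hi; };

    // Analysis is expensive: it needs the decoded reference frame.
    Levels analyse(const std::vector<uint8_t>& refFrame) {
        auto [mn, mx] = std::minmax_element(refFrame.begin(), refFrame.end());
        return { *mn, *mx };
    }

    // Applying is cheap: it only needs the two cached numbers, not the frame.
    uint8_t applyLevels(uint8_t v, Levels l) {
        return static_cast<uint8_t>(255 * (v - l.lo) / std::max(1, l.hi - l.lo));
    }

    int main() {
        std::vector<uint8_t> referenceFrame = { 20, 40, 200, 220 };
        Levels cached = analyse(referenceFrame);   // done once, reused for every frame
        uint8_t out = applyLevels(128, cached);    // no re-decode of the reference frame
        (void)out;
    }
    ```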