Need help using CamTrackAR in Unreal (examples?)

gregcorson Posts: 8 Just Starting Out
edited February 1 in CamTrackAR

I'm trying to build a comp using CamTrackAR and Unreal Engine and am having problems. I was able to convert the CamTrackAR file to FBX using Blender, and also converted the video to a sequence of EXR images.

I was able to build a comp in Unreal using Composure, but something still seems to be wrong. The first frame seems aligned correctly, but as the camera moves the comped character slides around a lot and doesn't match the shot footage. I'm thinking either the image sequence and the Unreal CG aren't syncing up correctly, or the camera parameters in Unreal need to be adjusted (or added to) to get a good match with my iPad lens.

I'm wondering if anyone has an example of a comp done entirely in Unreal, using an image sequence for the movie?

The other problem I'm having is that when I try to render out the comp in Sequencer, it runs fine but after about 60 frames it slows way down, even though CPU, GPU, disk and memory use are all very low. Again, this seems to be a problem related to the image sequence, as even during the render I'm not sure the sequence is in proper sync.

Comments

  • NickDevTeam Posts: 54 Staff

    Hi @gregcorson, thanks for your post.

    There are a lot of steps (unfortunately) between CamTrackAR, Blender and Unreal, and even we have to double-check sometimes when things don't match up. We're working on something that will hopefully make it much easier to work in Unreal, which we hope to announce soon.

    The Cyberpunk tutorial is definitely the best one to look at to see our steps. I'm planning to create a sticky post soon with some "My track is sliding" FAQs with pictures.

    The important things to note:

    • CamTrackAR works in real-world meters, and that's how it comes into Blender, which also defaults to meters as its unit. Unreal uses centimeters as its default unit, so sometimes we have to adjust the scale of things manually (x100); see the Blender Python sketch after this list. For instance, the camera can appear hugely scaled when it comes into Unreal. We're aware of this and are looking into the issue.
    • When importing the FBX from the Sequencer (from the Wrench icon on the toolbar) it is crucially important to ensure that "Reduce Keys" is OFF. Unreal wants to group similar keyframes together to optimise things, but for camera tracking this is really undesirable and leads to sliding effects.
    • Unreal doesn't seem to detect the camera frame rate from the FBX on import. So if CamTrackAR was recording at 60fps, Unreal may default to 30fps. This setting is on the Sequencer, and it's recommended to change it to the footage's native frame rate.
    • The aspect ratio of the camera also needs to match the recorded footage. I believe Javert does this manually in the tutorial, in the Camera Inspector panel in Unreal.
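
    If you're comfortable with Blender's Python console, here's a minimal sketch of the x100 step, applied before exporting the FBX. It assumes the tracked camera object is named "Camera" (adjust to your scene), and it's untested against your exact file:

        # Minimal sketch: scale a CamTrackAR camera path from meters (Blender)
        # to centimeters (Unreal) before exporting the FBX.
        # Assumes the tracked camera object is named "Camera" -- adjust as needed.
        import bpy

        cam = bpy.data.objects["Camera"]
        for fcurve in cam.animation_data.action.fcurves:
            if fcurve.data_path == "location":  # only positions need the x100
                for kp in fcurve.keyframe_points:
                    kp.co.y *= 100.0            # keyframe value (co.x is the frame)
                    kp.handle_left.y *= 100.0
                    kp.handle_right.y *= 100.0

    Alternatively, the FBX exporter's "Scale" option can apply the same factor at export time.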


  • gregcorson Posts: 8 Just Starting Out

    I have been doing everything you mention. The main difference is that I need to do the video + CG composite in Unreal using Composure and render it straight out of Unreal, without using HitFilm.

    What I am trying to do is use the video from CamTrackAR as a background and insert a 3D CG character into it. The CG character will be performed live in a mocap rig, and the actor needs to see the final comp running live as they perform. Something similar to this demo, but with the camera in motion: https://youtu.be/oQR90pDoIYg

    I think the main problem I'm having is importing the EXR sequence into Unreal and syncing it up. I thought this would be easy, but for some reason it doesn't seem to be working right. I can try rendering out just the CG from Unreal and comping it in HitFilm, just to see if it works, but I REALLY need to do everything in Unreal to get the live output, so the mocap actor can match their movements to things going on in the video background.

    I know this won't look as nice as an external composite in HitFilm, but it's needed to get the performance right. For the final video I can render out the Unreal assets in high quality and composite externally.

    Any help would be appreciated, and once I get this working I'll publish a tutorial on YouTube to help others out.

    Also, is there any reason why you don't support recording 4K video?

  • gregcorson Posts: 8 Just Starting Out

    By the way, the final shot I'm aiming for is in the same room as the video posted above. There will be a six-inch-high CG character standing on the grid you see on the desk. As the camera moves around, the actor will talk to the robot and the robot will talk back, with the camera moving in for a closeup on the character.

    The plan is to pre-record the room and actor with CamTrackAR; then the mocap actor will perform reactions to the actor while watching the live comp of their character over the CamTrackAR footage.

    I've already done something similar with live cameras tracked with VIVE, but thanks to COVID I need to produce this without having people come to the studio to help. With this setup, someone can film the background at home with CamTrackAR, send me the footage, and I can do the mocap and final comp here.

  • gregcorson Posts: 8 Just Starting Out

    The other thing I'm wondering about is camera parameters. It seems to be importing a focal length of 29.96, which I think is correct for the iPad Pro camera, but I don't see this anywhere in the HitFilm or Python file. Did I miss it, or is it calculated somewhere?

    The camera that gets created in Unreal doesn't seem to have the correct sensor size or aspect ratio (I see 36mm x 24mm, a 1.5 aspect ratio). It seems like this should be set to an aspect of 1.333 to match your 1920x1440 video.
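
    As a quick sanity check (assuming Unreal's default 36mm filmback width is kept, which is my assumption, not something from your docs):

        # Quick check: what sensor height matches 1920x1440 footage if the
        # default 36mm filmback width is kept? (36mm is an assumption here.)
        width_px, height_px = 1920, 1440
        aspect = width_px / height_px              # 1.333...
        sensor_width_mm = 36.0
        sensor_height_mm = sensor_width_mm / aspect
        print(aspect, sensor_height_mm)            # 1.333..., 27.0 (not the default 24mm)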

    The other thing is that there are no lens distortion parameters. You need these to get a good track, whether you're using HitFilm or something else; otherwise there will always be slippage in the tracking.

    Do you have all these parameters? Maybe you undistort the video inside of CamTrackAR? Hard to tell.

    I did run a test with Blender camera tracking on my iPad video and there was much less slippage. I assume this is either because it calculated the lens parameters, or simply because its track is more accurate than ARKit's.

    I'm really interested in getting CamTrackAR to work in this workflow as it's a very convenient tool!

  • NickDevTeam Posts: 54 Staff

    Hi @gregcorson, there's quite a lot of content to read through, so forgive me if I accidentally miss a few points.

    Just generally, are you able to import the CamTrackAR data into HitFilm directly (with the HFCS file) and confirm that tracking to video is working in the way you expect (with planes on the tracking points)? If there's a problem at this stage, it will filter through to other stages.

    For the EXR in Unreal, I fear you may need to ask on the Unreal Developer forums as that workflow isn't something we've handled directly here.

    "Also, any reason why you don't support recording 4k video?" --- One of the caveats of the ARKit technology is that it locks the capture device (the camera in this case) and we have to keep to the resolution that ARKit needs to process effectively. This is currently limited to 1920x1440, but we hope the API will allow for more film-oriented options in future.

    "t seems to be importing a focal length of 29.96 which I think is correct for the iPadPro camera, but I don't see this anywhere in the hitfilm or python file, did I miss it? Or is it calculated somewhere I missed?" --- The ARKit camera has a transformation matrix and a projection matrix that we are able to query to get information about alignment in the world. From the projection matrix, we are able to get the 3D Camera Field of View in degrees (vertically, I think) and this is converted to focal length in the Blender file. It's a little cryptic here because we're importing a HitFilm camera into Blender. HitFilm uses a Camera ZoomInMM value which we are converting here into a focal length.

    "The other thing is that there are no lens distortion parameters. You need these to get a good track whether using hitfilm or something else. Otherwise there will always be slippage of the tracking." --- I need to confirm this first, but my feeling is that the image is already somewhat compensated in order for Augmented Reality to actually work in basic AR apps, but I need to look into this further and discuss with my colleagues as I may be wrong about that.

    "The camera that gets created in unreal doesn't seem to have the correct sensor size or aspect ratio (I see 36 mmx24mm with a 1.5 aspect ratio) it seems like this should be set to an aspect of 1.333 to match your 1920x1440 video." --- Yes, I think this is what Javert is doing to the camera in Unreal to ensure it matches the video. Unfortunately, Unreal seems to ignore some aspects of the FBX file on import which is why this seems to be done manually.

  • gregcorson Posts: 8 Just Starting Out

    I would like to see you do a demo where the live actor is standing on a CG floor, i.e. where you can see whether their feet slip relative to the floor. This is the acid test for camera tracking. Your demos to date are all torso shots, which can tolerate a lot of tracking error. It would be good to see a feet-on-CG-floor demo just to see how it works with your tools and whether you have to adjust anything manually. If you can do it with your tools, then the problems I see must just be workflow issues that I need to fix on my end.

    I am not using HitFilm (yet), but I was able to import the HFCS into Blender, use the tracking, and create the comp in Blender. There is some slipping in the final Blender comp: CG objects on my real desktop slide as much as a few cm over the course of the shot, whereas when I track a similar shot in Blender the error is more like 4mm. Either this is normal error for ARKit, or it might be caused by not having the correct "undistort" parameters for the iPad lens when doing the comp. Usually when you track a camera in Blender it calculates this for you and sets up an undistort in the comp.

    Regarding lens distortion parameters, I have tracked iPad video in Blender and did get some values. I'm not sure whether ARKit does this for you when it captures video. With most AR pipelines, either the video is undistorted to match the CG, or the CG is distorted to match the video. If the video isn't properly undistorted, it will slip in the comp. When I have a chance I will use Blender to track some CamTrackAR video; that will tell us whether the distortion has been cleaned up or not.
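
    For reference, once you have coefficients from a camera solve, undistorting the footage itself is straightforward; here's a rough sketch with OpenCV (every number below is a placeholder from a hypothetical solve, not a real iPad value):

        # Rough sketch: undistort one frame using solved lens coefficients.
        # All numbers are placeholders, not measured iPad values.
        import cv2
        import numpy as np

        w, h = 1920, 1440
        fx = fy = 1500.0                      # focal length in pixels (placeholder)
        cx, cy = w / 2.0, h / 2.0             # assume a centered principal point
        K = np.array([[fx, 0, cx],
                      [0, fy, cy],
                      [0,  0,  1]], dtype=np.float64)
        dist = np.array([-0.05, 0.01, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

        frame = cv2.imread("frame_0001.png")  # one frame of the image sequence
        undistorted = cv2.undistort(frame, K, dist)
        cv2.imwrite("frame_0001_undistorted.png", undistorted)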

    I will have to take a closer look at the camera settings that get imported into Blender; maybe there is something there that isn't getting imported into Unreal.

  • gregcorson Posts: 8 Just Starting Out

    A couple of things...

    1. Even when I have autofocus off, I still seem to get refocus events.
    2. I tracked the video coming out of CamTrackAR using Blender and did find some lens distortion, so I don't think ARKit is correcting for it.
    3. After doing several more tests: if I put a CG object on the ground, I still see some wandering; not a lot, but enough to notice. That makes it hard to use this for any shot where something CG is sitting on the ground, or where a live actor's feet can be seen touching the CG ground.

    I don't think the wandering is noticeable if the object sitting on the ground is animated/moving, only if it is sitting still.

  • Sidjum Posts: 5 Just Starting Out*

    iPhone XR and iPhone XS: I tried both but still keep getting wandering in the compositions. I built a talk-show studio in Unreal and am about to shoot the two people who need to be in that set. I think I need to mark my green screen with trackers and keep using the camera tracker in HitFilm instead of CamTrackAR. It just doesn't do the trick for me (yet).

    The chair they sit on just does not 'stick' to the floor (in my tests). Such a shame, as I bought an iPhone for this even though I don't like Apple.

    Any help is appreciated.