CamTrack AR with other production cameras.

JordanWright
JordanWright Posts: 1 Just Starting Out*
edited July 2021 in CamTrackAR

I have an idea for rigging CamTrackAR up to my Red Komodo. I have a Manfrotto phone mount attached to the camera cage on the Komodo, and I very carefully try to start recording on CamTrackAR and the Komodo as close to the same time as possible.

I measured the offset between my iPhone 12 Pro Max and the Komodo's lens, and it's about 7.3 inches away at 37 degrees. Next, when importing the FBX file into C4D, I plan to parent a new C4D camera to the camera created by CamTrackAR, using that same offset distance of 7.3" and angle. This should exactly impart the motion of the tracked camera from CamTrackAR to the new C4D camera. I'll also make sure the new C4D camera matches the lens on the Komodo, and set its sensor size to match the Komodo's (S35).
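For anyone trying to replicate this, the measured offset can be split into the components you'd key into the child camera's relative position. A minimal sketch in plain Python, assuming the 37° is measured from horizontal in the vertical plane containing both lenses, and a scene unit of centimeters (both assumptions, not part of the original post):

```python
import math

def offset_components(distance_in, angle_deg):
    """Split a measured phone-to-lens offset (a distance at an angle
    from horizontal) into horizontal and vertical components, in cm,
    for the child camera's relative position."""
    distance_cm = distance_in * 2.54  # inches -> cm
    horiz = distance_cm * math.cos(math.radians(angle_deg))
    vert = distance_cm * math.sin(math.radians(angle_deg))
    return horiz, vert

h, v = offset_components(7.3, 37.0)
print(f"horizontal: {h:.2f} cm, vertical: {v:.2f} cm")
# horizontal: 14.81 cm, vertical: 11.16 cm
```

The same numbers work in any package; only the axis conventions change.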

The only part I think will take some trial and error is matching the record-start offset between when I'm able to hit record on the Komodo and on my phone. That shouldn't be hard, though, and can be fixed by sliding the camera track forward or backward in C4D's dopesheet until we get a frame-accurate match.
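For anyone scripting that nudge instead of dragging clips by hand, the fix is just a constant time shift applied to the imported keyframes. A rough sketch in plain Python (the `(time, value)` pair format and the `shift_track` name are stand-ins for however your app actually exposes keyframes):

```python
def shift_track(keyframes, offset_frames, fps):
    """Slide imported tracking keyframes forward or backward in time to
    compensate for the record-start offset between phone and camera.
    `keyframes` is a list of (time_seconds, value) pairs."""
    dt = offset_frames / fps
    return [(t + dt, v) for t, v in keyframes]

# Slide the track 12 frames earlier on a 24 fps timeline:
print(shift_track([(0.0, 0.0), (1.0, 5.0)], -12, 24))
# [(-0.5, 0.0), (0.5, 5.0)]
```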

I hope to try this out next week when I’m past the two broadcast deadlines I’m on.

Any suggestions or potential problems you might see with this plan?

Thanks for making this app! This could be a huge game changer for capturing motion data on location.

Oh also! Any chance that you will ever add 24fps to the record options?

-Jordan

Comments

  • rwsd
    rwsd Posts: 1 Just Starting Out

Curious how well this worked out (or not)? I just got a Blackmagic 6K Pro and want to do something similar, mounting an iPhone to my Blackmagic rig. Any knowledge or tips to share from your experience?

  • alaska_vfx_filmer
    alaska_vfx_filmer Alaska Posts: 556 Enthusiast

    I tried this (rough and dirty) mounted on top of my DSLR, but the results were extremely disappointing.

    I would like to see how well it works with a better setup

    I would also like to see 24/25 fps, plus a 180° shutter option, but I hear those are actually ARKit limitations, so not really anything FXHome can do yet.

  • goji1986
    goji1986 Posts: 3 Just Starting Out*
    Add me to the list of interested parties... I'm attempting to clamp my iPhone 13 Pro to a GH5 SmallRig cage, and just trying to figure out how to measure the offset.
  • mattpf
    mattpf Posts: 4 Just Starting Out*

    We're considering this as well where I work. We do a lot of green screen work (ProCyc-based studio) and shoot with the Blackmagic 4K. We have a Zhiyun S3 Crane Pro gimbal. We do a high volume of production, so using CamTrackAR ought to help speed things up for camera movement tracking.

    We're using some SmallRig cages to mount things together.

    The key issue seems to be determining the offset from the nodal point of the phone's camera to the nodal point of the lens. High-end virtual production tracking systems have tools for this. There is a similar tool for Unreal Engine, so perhaps the offset can be determined using UnRemote. Once you have it, you should be set, as long as your camera rig stays the same.

    For sync we use Tentacle Sync for LTC timecode-based sync, which uses a TRS audio cable. There are external mic adapters for iPhone (I have one already and have used it with external mics in Filmic Pro), so the question here is whether we can get CamTrackAR to accept external audio over the internal phone mic.

    If it can, that will make syncing the BRAW and CamTrackAR footage a snap. If not, audio-based syncing should be good enough to sync the BRAW footage with the CamTrackAR footage.
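    If LTC decoding turns out to be a dead end, the audio-sync fallback amounts to cross-correlating the two scratch tracks to find the lag. A toy sketch in plain Python (brute force, so only practical on short slate regions; `audio_sync_offset` is a hypothetical helper name, and real clips would want an FFT-based method):

    ```python
    def audio_sync_offset(ref, other, sample_rate):
        """Estimate how many seconds later an event lands in `other`
        than in `ref` by brute-force cross-correlation of two
        equal-length mono waveforms at the same sample rate."""
        n = len(ref)
        best_lag, best_score = 0, float("-inf")
        for lag in range(-n + 1, n):
            score = sum(ref[i] * other[i + lag]
                        for i in range(max(0, -lag), min(n, n - lag)))
            if score > best_score:
                best_lag, best_score = lag, score
        return best_lag / sample_rate

    # Toy check: a clap at 1.0 s in the reference and 1.5 s in the phone track
    sr = 100
    ref = [0.0] * 300; ref[100] = 1.0
    phone = [0.0] * 300; phone[150] = 1.0
    print(audio_sync_offset(ref, phone, sr))  # 0.5 (clap lands 0.5 s later in the phone track)
    ```

    The returned lag is what you'd slide one clip by on the timeline.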

    Then we just have to sync the dopesheet with the footage. Probably some scripting to help smooth the workflow at various points.

    The third issue is the fact that phones (and computer screen recorders) have a slightly irregular frame rate, which might present sync issues.

    Anyone had any luck with this since the OP posted?

  • alaska_vfx_filmer
    alaska_vfx_filmer Alaska Posts: 556 Enthusiast

    @mattpf Not me. For the reasons you stated above (position of phone vs. main camera, focal length, sync, etc.) I haven't personally retried it; I'm sitting on the sidelines, patiently waiting for "somebody" to figure it out for me :)

  • mattpf
    mattpf Posts: 4 Just Starting Out*
    Here is a section on Unreal Engine's lens/camera calibration tools, which apparently include setup for nodal point offset in addition to lens distortion.

    https://docs.unrealengine.com/4.27/en-US/WorkingWithMedia/IntegratingMedia/CameraCalibration/

    The plan for our tests is to build out the camera rig and then run UnRemote and Unreal Engine (with the camera connected via HDMI) and run the above process to get values for nodal offset.

    Next step is to translate those values into equivalent units in Fusion/Resolve and After Effects, most likely via FBX.
  • mattpf
    mattpf Posts: 4 Just Starting Out*
    Update:
    Did some tests this week and going to do a bunch more next week.

    1) Tracking seems to be pretty accurate when looking at the iPhone footage paired with the tracking data, even during 30-minute recordings. If the Blackmagic Pocket footage can be corrected for lens distortion and nodal offset, and (most importantly) the CamTrackAR footage/data doesn't drift, it should be equally accurate.

    2) Timecode- and audio-based syncing is a nope. Plugging in my Tentacle Sync via audio adapters just mutes the audio, and I can't get LTC timecode from the audio track in Resolve. Audio also drifts considerably, so syncing by audio is out. You have to sync by eye.

    3) The footage doesn't seem to drift, which is encouraging. Phone and computer-based recordings typically have irregular timing and won't stay synced to professional audio or video recordings.

    The next step is nodal point offset and lens distortion, which I will be doing tomorrow.

    https://www.youtube.com/watch?v=a-bkK5apunc

    https://docs.unrealengine.com/4.27/en-US/WorkingWithMedia/IntegratingMedia/CameraCalibration/
  • Triem23
    Triem23 Posts: 20,288 Power User

    @mattpf progress! We'll see if you can get the needed corrections. If you make it work I'm sure other users will be interested in your steps, cuz it's not a trivial thing to do....

    Yeah, CamTrackAR is about the only mobile app I know of that shoots video at a constant frame rate rather than variable. Which, of course, shows that most apps being VFR is an inexplicable choice on the developers' part, since it's obviously possible to record CFR to begin with!

  • alaska_vfx_filmer
    alaska_vfx_filmer Alaska Posts: 556 Enthusiast

    Interested to see if this works, although this setup is looking a bit out of my class. Kudos for all the time you must be putting in, and thanks for keeping us all posted.

  • mattpf
    mattpf Posts: 4 Just Starting Out*
    @Triem23 Not so fast.
    Unfortunately, further testing has revealed that CamTrackAR drifts quite severely.

    Yesterday I did some more testing: I recorded 30-second clips with head and tail slates, along with some wild gimbal movements in between.
    I recorded (BMPCC4K) at 23.976, 24, 25, 29.97, 30, 50, 59.94, and 60 frames per second.
    I did two takes of each frame rate of the pocket 4k. One take with CamTrackAR running at 30 fps and the other with it running at 60.

    Then I threw those Blackmagic clips into Premiere timelines (one timeline per frame rate), used the head and tail slates to sync the clips up, and discovered that CamTrackAR (and probably all iPhone apps) runs fast.

    The clips are about 30 seconds long. The 23.976 and 24 fps tests drift about 3 frames; 29.97 and 30, about 4 frames; at 59.94 and 60, it's 7-8.

    Now the real question is whether or not this drift is actually constant, and whether retiming the clips (and more importantly the keyframes of the imported CamTrackAR data) will sync them up.

    So I used the rate stretch tool to sync up the tail slates of each test. I left the Blackmagic footage alone and adjusted the CamTrackAR clips.

    This gave me varying playback speeds, ranging from 99.5% to 99.78%.
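    For what it's worth, the rate-stretch percentage can be computed directly from the slate positions instead of dialed in by eye. A small sketch in plain Python (assumes Premiere-style speed, where the new duration is the old duration divided by the speed; `stretch_percent` is a hypothetical helper name):

    ```python
    def stretch_percent(ctar_head, ctar_tail, ref_head, ref_tail):
        """Playback speed (as a percent) to apply to the CamTrackAR clip
        so its head/tail slates land on the reference (BMPCC) slates.
        All arguments are slate frame positions on a shared timeline."""
        return 100.0 * (ctar_tail - ctar_head) / (ref_tail - ref_head)

    # e.g. CamTrackAR slates 717 frames apart vs. 720 on the BMPCC:
    print(f"{stretch_percent(0, 717, 0, 720):.2f}%")  # 99.58%
    ```

    The same factor would then be applied to the keyframes of the imported track data.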

    Here is a table of the drift rates. BMPCC4K is the Blackmagic Pocket Cinema Camera 4K and CTAR is CamTrackAR.

    BMPCC4K fps   CTAR 30fps   CTAR 60fps
    23.976        99.78%       99.61%
    24.00         99.66%       99.71%
    25.00         99.61%       99.61%
    29.97         99.61%       99.67%
    30.00         N/A          99.5%
    50.00         99.64%       99.61%
    59.94         99.69%       99.57%
    60.00         99.7%        99.52%


    I forgot to do a tail slate on the 30/30 test. Also, some of the error might come from my tail or head sync being off by a frame: I synced by clapping my hands, and since I was working alone, my cinema camera was in soft focus.

    The real question is whether the drift rate is constant (enough) that simply retiming the footage, and more importantly the keyframes of the recorded track data, will fix it.

    If it isn't, this is going to be a bust.

    I'll be doing some lens distortion and nodal offset work, hopefully this afternoon.
  • goji1986
    goji1986 Posts: 3 Just Starting Out*
    @mattpf keep up the amazing work! I'm really hoping to use this method to shoot a short film this summer. I've been testing with a BMPCC 4K as well (I got access to one instead of the GH5), and the results have been less than ideal with the nodal offset.

    Haven’t encountered much of an issue with drift rate, but I’m only shooting 1 minute or less test shots for now.
  • paulharden
    paulharden Posts: 5 Just Starting Out*

    I started experimenting just today; I'll buy the paid version if I (or we) can get this working. I shoot with a Sony A7III. The iPhone is rigged up with a bracket connected to the hot shoe. In my opinion, the nodal point offset can be solved in HitFilm by parenting a second camera to the tracking camera and offsetting it, so the original tracking camera is treated just as a pivot.

    However, I got stuck on the frame rate of the clips exported by the iPhone! I was getting weird results, so I shot a test with an on-screen timecode recorded on both the Sony and the iPhone, and found that if I select 30 fps in the app, the output file is interpreted by HitFilm (or AE) as 60 fps at half speed, so I have to manually set it back to the right frame rate. I'm worried about having to time-stretch the tracking data to make it match. Has anyone had the same problem? Maybe it's down to my old iPhone 6, or the free version of CamTrackAR?
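    If the file really is just mis-tagged (60 fps container, 30 fps content), then after reinterpreting the clip the tracking keyframes need a matching time stretch. A sketch of the idea in plain Python (the `(time, value)` pairs and the `retime_keyframes` name are stand-ins for however your compositor stores keyframes):

    ```python
    def retime_keyframes(keyframes, tagged_fps, true_fps):
        """Rescale keyframe times after reinterpreting a clip's frame
        rate. A file tagged 60 fps that is really 30 fps doubles in
        duration when reinterpreted, so each keyframe time scales by
        tagged/true."""
        scale = tagged_fps / true_fps
        return [(t * scale, v) for t, v in keyframes]

    print(retime_keyframes([(0.0, 0.0), (1.0, 5.0)], 60, 30))
    # [(0.0, 0.0), (2.0, 5.0)]
    ```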