Import 50 or so OBJ objects with location - small triangle object

Strings123
Strings123 Posts: 39 Just Starting Out*

Some years ago I asked a question and got great feedback.

I have 50 (for now) objects. They are essentially triangles - the smallest facet I can make. I'd like to import them all into a composite shot, but have their locations come in with them somehow.

Is it possible to import multiple OBJs in one shot?

Is it possible to put them on the timeline with their relative locations (x, y, z) somehow? I could create each OBJ with an offset from the object center baked in, but I would prefer not to do that.

Where I'm going with this: if I could import dozens and dozens of objects without having to type in each offset, it would save me hours, and it would also let me change items by generating the OBJs from CSV files with my Excel or Python code.

Any thoughts?

Thanks,

GG

Best Answers

  • Stargazer54
    Stargazer54 Posts: 3,740 Ambassador
    Answer ✓

    @Strings123 It might be easier to do what you want by bringing your 50 or so OBJ files into a dedicated 3D program (such as Blender), positioning them where you want, writing that back out as a single OBJ, and then reading that into HF.

    Can you give us more details on what you are trying to do with your project?

    If you want to be able to animate each of the triangle objects individually in HF, then there are some extra steps. In the 3D program, put each triangle object into its own group or layer within the overall object. Then, when you import the 3D model into HF, go to the Groups tab in the 3D Model Properties window and click the boxes to activate each "Pivot" layer - HF treats Groups as Pivot layers. Once you have your object in a composite shot, go to the Properties panel and, under Models, open the drop-down for your model; you will see the Pivot layers, and each one can then be animated separately.

    Take a look at Simon's tutorial on importing a helicopter object. He talks about animating groups in HF.


  • Strings123
    Strings123 Posts: 39 Just Starting Out*
    Answer ✓

    @Stargazer54 Thanks for the reply. I will watch that video again; it's been a couple of years since I last watched it. Great video.

    The triangles won't be animated. They will be positioned, and then when the view in 3D space moves, they will appear correctly in their positions. They are stars. I picked a triangle - and they will be very small triangles - because I needed a facet, and 3 points is the minimum. Using an actual observation database, I have x/y/z positions relative to Earth (as well as radial, which I won't use). There are also 2 other parameters that indicate intensity and color which I would love to use, but I'll start with just getting the objects in place.

    I'm not a Blender user. I probably won't be. I wish I could. Life takes my time and I picked HitFilm as my editor of choice about 6 years ago or something. Maybe I'll take a look at it.

    I do Python programming, so I can take the data and calculate other things; that is how I intend to create the OBJ files. If I could, I'd love to attach an appearance from a subset of PNG files to the triangles before importing.

    Someone once said in a past question, "you astronomers create your own software to do this...." Yep - except I'm not an astronomer. It appears there are other programs this can be done in. Cool. I'm a HitFilm guy.

    I actually am not sure how to import things from Blender so that's a learning curve.

    The star field is not even that important. It's icing on the cake. If you think about your position in the firmament, moving to get a different but real view of the sky in my animation would really be something. I'm actually working with an astronomer to demonstrate this in more layman's terms (he does not do such programming - he's a research astronomer).

    Thanks all!!!

  • Triem23
    Triem23 Posts: 20,511 Ambassador
    edited April 7 Answer ✓

    Well, the last comment answered some questions I was going to ask, but won't need to.

    I remember another set of questions from a user wanting to import a database of actual star positions. Was this you?

    Ok, moving on...

    Flat triangle isn't bad - it's efficient geometry, but I suggest a tetrahedron. With a single flat poly you'll have to worry about things ending up edge-on to camera. A four-poly tetra gives you a full volume.
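
    (For reference, in OBJ terms a tetra is just four vertices and four triangular faces - something like this, scaled down however tiny you need it; face winding not fussed over here:)

    # a unit "corner" tetra - 4 vertices, 4 triangular faces
    v 0.0 0.0 0.0
    v 1.0 0.0 0.0
    v 0.0 1.0 0.0
    v 0.0 0.0 1.0
    f 1 2 3
    f 1 2 4
    f 1 3 4
    f 2 3 4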

    There is a way in Hitfilm to semi-automate controlled placement of particle clones of 3D models, but I won't get into it because it's pretty janky and would likely take longer to set up than to hand position everything.

    Hand positioning will be the only way to do this in Hitfilm, and it would be somewhat tedious. To keep the layer stack from getting huge you'd drag in one tetra, then drill down into the layer controls to the "Model" controls. Here you'd be able to enter positions for the model (while leaving the LAYER at 0,0,0), then duplicate the model within the layer, enter a position, duplicate, repeat. Having duplicate models in the same layer means your geometry will occlude properly without messing around with other methods.

    Blender... For this project you have two possible use cases. The first is similar to Hitfilm - you'd manually position the first tetra, input its position, duplicate, input the second position, repeat. Group the array into a single object. Once done all you'd do is export the entire array as a single OBJ for import to Hitfilm. The skills you'd need to do this are under five minutes of research and reading of the online help... You just need to learn how to input numeric positions, duplicate, group and export.

    Again, moving from Blender to Hitfilm in this use case is easy. Just export your Blender array as an OBJ file and import to Hitfilm as in the linked tutorial above. You're not dealing with different materials, animation groups, Alembic/FBX animation or anything like that. Just a simple save of geometry. As a metaphor it's like using a word processor like Open Office and saving the document out as a docx for import to Word instead of saving in the Open Office format.

    In the long run, especially if you intend on doing more astronomical projects, a dive into Blender will be useful. Blender supports Python scripts, so you'll be able to write code that takes your position database and plops a tetra at each location. This is out of my skill set so I can't offer specific advice, but I know it's doable.

    For a plain white tetra, this isn't too much of a deep dive. You don't need to know anything about materials or Blender animations, sculpting, nodes, blah-blah, just place and save. Even the scripting should be easy enough. Blender has built-in primitives, so you should be able to program it to just create a pyramid (or an Icosphere, which will give a rounder shape but is still only 20 polys - 50 stars is a thousand total polys, which is nothing!) at each location point. Totally possible.

  • Stargazer54
    Stargazer54 Posts: 3,740 Ambassador
    Answer ✓

    @Strings123 OK. I think I see what you are doing now. I assume you are going for a spherical cloud of triangles that will surround the HF camera?

    Yes, if you know how to write the ascii to define an obj and can run your Python script to export the x,y,z position for each poly then you are on your way. Scripting this is the way to go.

    Back in the day I built particle clouds in obj format to use with Wavefront software using triangles. Been a long time since I've done it, but there is lots of info out on the "inter-tubes" about the obj format. I'll be happy to field any questions if I can help.

    My suggestion is to start with trying to place a lower number of triangles (about 10) with your script as a test to work out any bugs before running the full Monty. If the ascii is formatted correctly, it should read into HF with no issue.
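
    Just to illustrate the idea of scripting it (untested, and the file and column names are placeholders for whatever your CSV actually uses), a bare-bones Python sketch that writes one tiny triangle per star would look roughly like this:

    import csv

    SIZE = 0.01  # half-width of each star triangle - whatever reads as "tiny" at your scene scale

    with open("stars.csv", newline="") as src, open("stars.obj", "w") as dst:
        vert_count = 0
        for row in csv.DictReader(src):  # expects columns named x, y, z
            x, y, z = float(row["x"]), float(row["y"]), float(row["z"])
            # three vertices of a small triangle centered on the star position
            dst.write(f"v {x - SIZE} {y - SIZE} {z}\n")
            dst.write(f"v {x + SIZE} {y - SIZE} {z}\n")
            dst.write(f"v {x} {y + SIZE} {z}\n")
            # face indices are 1-based and run globally across the whole file
            dst.write(f"f {vert_count + 1} {vert_count + 2} {vert_count + 3}\n")
            vert_count += 3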

    @Triem23 does bring up a good point - will all the polys be facing the camera, or are some going to wind up edge-on? A tetrahedron is a good choice, although I'm sure you could get away with two double-sided triangles at right angles to each other for each star point instance, to lower the poly count.

    I understand about not wanting to dive into Blender to do your scripting there. The learning curve is a challenge (which is why I keep falling back to using Lightwave). But Blender has become such a standard of sorts that learning to do what you want with it would be time well spent.

    Good luck and let us know how you get along.

  • Strings123
    Strings123 Posts: 39 Just Starting Out*
    Answer ✓

    @Stargazer54 THANKS!!!!!!!!!!! Wavefront? No kidding! Single lens/reflector? It's so core to the James Webb Space Telescope and really is incredibly challenging and fascinating. There are 18 mirrors plus the secondary, then each instrument.

    I will definitely start with a small number.

    For the object, I can make a small pyramid.

    Yes - it's essentially a point cloud around the camera.

    I'll probably manually create an OBJ file or two to begin. I don't know how to apply the textures. In fact, it would be kind of neat to create a small sphere with a minimal number of points, which really is easy, and once those are defined, bring in the different sizes as pre-defined lookup objects. What I really want is a color and size for each "point", and I thought that a triangle/pyramid would just be placed without any specific orientation.

    I might still ask a question about whether I can have an OBJ with an offset from its central point. That way, all of these OBJs would import at 0,0,0, but the point/texture (I'm defining "point" as the sphere location) would be offset accordingly. I hope I can also figure out how to wrap textures onto the spheres, and then also make them visible in the 3D space from the camera position.
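
    (To illustrate what I mean by the offset: I'm picturing an OBJ where the layer would import at 0,0,0 but the vertices themselves are written out around 100,100,100 - something like this, if that's even how it works:)

    # tiny triangle whose geometry sits near 100,100,100
    # even though the OBJ/layer itself stays at the origin
    v 100.0 100.0 100.0
    v 101.0 100.0 100.0
    v 100.0 101.0 100.0
    f 1 2 3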

    I need to do some homework first before I start hitting people with questions.

Answers

  • Strings123
    Strings123 Posts: 39 Just Starting Out*

    @Triem23 THANKS!!!!!!!!!!! Wow - both of you have such great feedback!!!!

  • Stargazer54
    Stargazer54 Posts: 3,740 Ambassador
    edited April 9

    FYI, Wavefront Technologies was a 3D animation software company back in the '80s and '90s whose software required Silicon Graphics (SGI) computers to run. OBJ was the native format for Wavefront 3D objects.

    Also, what we now know as "OpenGL" originally came from SGI. I learned UNIX before I ever learned DOS. Here's a brief history on Wavefront:

    https://www.historyofcg.com/pages/wavefront/

    https://en.wikipedia.org/wiki/Wavefront_Technologies


  • Strings123
    Strings123 Posts: 39 Just Starting Out*

    @Stargazer54 @Triem23

    Request for anyone who can do this in literally a few minutes. I cannot yet.

    A test object would answer a ton of questions for me. As simple an OBJ as possible, for the purposes of learning.

    1. A multi-point 3D OBJ item that has, say, 4 or 5 points (or a simplified sphere). (This will allow me to see the file as well.) The points should be spread all over the place.
    2. The item offset from the origin, I don't care by how much. Say, 100/100/100 (again, teach me the file). What I mean is that although the object is in place, the actual geometry is offset in the file. Think of a hand on a person model - it's not at the center. If you had the hand out there without the body, does it import at the location of those points? I'm new.
    3. Any texture applied - I don't care, a JPG or PNG or whatever is used - so I can then also learn more about the OBJ file.

    My goal is to:

    a. Learn the absolute basics of the file (I will end up writing my own script, have a library of about 5 spheres, and apply one of 4 or 5 textures to each sphere based on my data).

    Understand how a texture is referenced and applied in the OBJ file, and then how the heck the parameters on that object come through when it is imported into HitFilm. I need to set glow or internally lit aspects. If I have the simplified model, I can add extra complications myself.

    b. Learn how the imported 3D OBJ will be offset from the origin. I assume I will have to manually drag those points on the timeline. Or, if I mass import 10 3D items and then drag them onto the timeline, will they be positioned automatically where I want them, with 3D coordinate offsets?

    Your reward? Satisfaction and a shout-out? Sorry - I can't afford 1K. I'll send back my Python script and a sample database from actual observation data. Add to the knowledge of everyone here if interested - SHARE!

    For the sake of information, here is what I will process:

    1. I will first read a row and get the OBJ coordinates.
    2. I get a size and choose one of, say, 5 or 10 OBJ objects; that's the surface to apply the texture to.
    3. Based on color information, I select a pre-made texture of the correct color from many choices.
    4. Based on brightness information, I further refine the choice of texture brightness (e.g. there may be 4 different yellow choices, but I select the brightness of the yellow in the texture - a more pure yellow RGB vs. a toned-down RGB).

    The OBJ creation python script is literally just a big if-then to select choices to reference in the OBJ file.
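
    Roughly something like this (the thresholds and file names are made up, just to show the shape of the if-then):

    def pick_model(size):
        # hypothetical size buckets mapping to pre-made OBJ templates
        if size > 2.0:
            return "sphere_large.obj"
        elif size > 1.0:
            return "sphere_medium.obj"
        return "pyramid_small.obj"

    def pick_texture(color, brightness):
        # hypothetical lookup: color family first, then a brightness variant
        if color == "yellow":
            return "yellow_bright.png" if brightness > 0.75 else "yellow_dim.png"
        if color == "red":
            return "red_bright.png" if brightness > 0.75 else "red_dim.png"
        return "white_default.png"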

    You guys ROCK!!!!!!! You'll become stars! (That's a bad pun).

    Thanks,

    GG

  • Stargazer54
    Stargazer54 Posts: 3,740 Ambassador

    @Strings123 I am going to attach an example file of how I made my star point cloud many years ago. Hopefully the comments help in explaining this very crude script. It was written to run under the UNIX C-shell, but hopefully you can follow the logic and adapt it to Python. There are plenty of ways to make this more efficient, but the idea is to follow the logic.

    You probably don't need texturing so much as assigning a material of varying brightness or color to each triangle. I think you will find that the triangles are so far away that you will never really see any texture, just perceived brightness.

    As far as texturing goes, you can do it all in an ascii obj file, but you will need a material file and a texture image file to go with it. It is much easier to do all this in a bona fide 3D program and then write it out as obj (which will create the .mtl file). Anyway, here is an example of a square that has texture vertices - "vt".

    Texture-mapped square
    
    This example describes a 2 x 2 square. It is mapped with a 1 x 1 square
    texture. The texture is stretched to fit the square exactly.
    
    mtllib master.mtl
    
    v 0.000000 2.000000 0.000000
    v 0.000000 0.000000 0.000000
    v 2.000000 0.000000 0.000000
    v 2.000000 2.000000 0.000000
    vt 0.000000 1.000000 0.000000
    vt 0.000000 0.000000 0.000000
    vt 1.000000 0.000000 0.000000
    vt 1.000000 1.000000 0.000000
    # 4 vertices
    
    usemtl wood  # this assumes a material called "wood" is defined in master.mtl and calls for an image file as the texture
    f 1/1 2/2 3/3 4/4
    # 1 element
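
    And so the other half isn't a mystery: the master.mtl that the obj points at would contain something roughly along these lines (the values and image name here are just an illustration):

    newmtl wood
    Ka 0.2 0.2 0.2    # ambient color
    Kd 0.8 0.8 0.8    # diffuse color
    Ks 0.0 0.0 0.0    # specular color
    d 1.0             # opacity
    illum 2           # lighting model
    map_Kd wood.jpg   # the image file applied as the texture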
    

    I would recommend doing some research on how to write an obj directly in ascii. Here are a few links to get you started:

    http://paulbourke.net/dataformats/obj/

    https://www.fileformat.info/format/wavefrontobj/egff.htm

    http://dirsig.cis.rit.edu/docs/new/obj.html