March 2024 Progress Report
Filmmaker
Embedded Projects / Scenebuilds
I've added support for embedding a PFM project in other PFM projects. For instance, you can set up a project "A" as a scenebuild, and then create a new project "B" that uses "A" as a base:
It's similar to using a map, but more flexible: you can use multiple embedded sub-projects, change an embedded project between film clips, and so on. It also makes it easier to re-use a scenebuild across different projects, or to share it.
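Conceptually the structure is straightforward: a project just holds references to the sub-projects it builds on. Here's a tiny Python sketch of that composition idea (the names and fields are hypothetical, for illustration only, not PFM's actual project format):

```python
from dataclasses import dataclass, field

# Hypothetical structure for illustration only; this is not
# PFM's real project format.
@dataclass
class Project:
    name: str
    film_clips: list = field(default_factory=list)
    embedded: list = field(default_factory=list)  # sub-projects used as a base

scenebuild = Project("A")                     # project "A": the scenebuild
shot_1 = Project("B", embedded=[scenebuild])  # project "B" builds on "A"
shot_2 = Project("C", embedded=[scenebuild])  # the same scenebuild, re-used
```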
Motion Capture
I did some more work on the motion capture module, which can now capture motion directly from a webcam:
(Thanks to WigWoo1 for the model!)
It also supports tracking hands and fingers, as well as the lower body. Eye tracking is still a work-in-progress.
This will allow you to do simple motion capture with a webcam or your smartphone for any humanoid character model. Unfortunately the tracking isn't accurate enough for complex movements (like dance moves), but I think it should still come in handy. You can also specify what you want to track, for instance if you only wish to track facial movements (for lip syncing).
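To give you an idea of what webcam tracking involves, here's a minimal Python sketch using OpenCV and MediaPipe's Holistic solution. To be clear, this is just a common off-the-shelf approach and my simplified sketch of it, not the module's actual implementation:

```python
# Minimal webcam landmark-tracking sketch using OpenCV + MediaPipe.
# Illustrates the general technique only.
import cv2
import mediapipe as mp

cap = cv2.VideoCapture(0)  # default webcam
holistic = mp.solutions.holistic.Holistic(
    min_detection_confidence=0.5, min_tracking_confidence=0.5
)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV captures BGR.
    results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.pose_landmarks:
        # 33 normalized body landmarks (x, y, z, visibility) per frame,
        # which could then be mapped onto a character's skeleton.
        nose = results.pose_landmarks.landmark[0]
        print(f"nose: ({nose.x:.2f}, {nose.y:.2f}, {nose.z:.2f})")
    # results.left_hand_landmarks / right_hand_landmarks cover fingers,
    # results.face_landmarks covers facial movement.
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
holistic.close()
```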
I'm actually thinking about launching a new sub-project based on this for VTuber avatars (that's what I was testing in the video above). It wouldn't take much time to implement, and it might help attract more people to the project, but I haven't made up my mind on that yet.
Recording
You can now record gameplay as animation data:
While the recording mode is active, all changes to selected actor properties are recorded and converted into animation data. The recorded animation data is then automatically smoothed and keyframes are generated.
The main use-case for it is to record animation data from motion capture in VR or via the motion capture module, but you can also use it to record manual changes, as shown in the video above. In the future, I'm also planning to use this to let you record physics interactions as animation.
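The smoothing-plus-keyframing step is easy to picture in code. Here's a rough Python sketch of the general idea (simplified and my own version, not the actual implementation): sampled property values are smoothed with a moving average, then reduced to keyframes by dropping samples that linear interpolation would reproduce within a tolerance.

```python
import math
import random

def smooth(samples, window=5):
    """Moving-average smoothing over a list of float samples."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

def reduce_to_keyframes(times, values, tolerance=0.01):
    """Greedy keyframe reduction: drop a sample when interpolating
    linearly between the last kept key and the next sample would
    reproduce it within `tolerance`."""
    keys = [0]
    for i in range(1, len(values) - 1):
        t0, v0 = times[keys[-1]], values[keys[-1]]
        t1, v1 = times[i + 1], values[i + 1]
        f = (times[i] - t0) / (t1 - t0)
        predicted = v0 + f * (v1 - v0)
        if abs(values[i] - predicted) > tolerance:
            keys.append(i)
    keys.append(len(values) - 1)
    return [(times[k], values[k]) for k in keys]

# Example: 3 seconds of a noisy "property" sampled at 30 fps.
times = [i / 30.0 for i in range(90)]
raw = [math.sin(2.0 * t) + random.uniform(-0.02, 0.02) for t in times]
keys = reduce_to_keyframes(times, smooth(raw))
print(f"{len(raw)} samples -> {len(keys)} keyframes")
```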
Meta Rig System
I've been working on a "Meta Rig" system. It's still a work-in-progress and I can't show it off yet, but here's the gist of it:
The system can automatically detect humanoid biped rigs, as well as quadruped rigs, in a model and map them onto a standardized rig. This will have a lot of different applications, for instance:
- Automatic re-targeting: You'll be able to change the model of an animated actor to any other model (assuming they're both humanoid), and the animation will be kept intact - without requiring a retarget rig
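One plausible way to picture the detection (this is a hypothetical sketch of the idea, not how the actual system works) is matching a model's bone names against known aliases for each slot of the standardized rig:

```python
import re

# Everything here is hypothetical and for illustration only: the slot
# names, the aliases, and the matching heuristic.
META_BONES = {
    "hips":      ["hips", "pelvis"],
    "spine":     ["spine", "chest"],
    "head":      ["head"],
    "left_arm":  ["l_upperarm", "upperarm_l", "leftarm"],
    "right_arm": ["r_upperarm", "upperarm_r", "rightarm"],
    "left_leg":  ["l_thigh", "thigh_l", "leftleg"],
    "right_leg": ["r_thigh", "thigh_r", "rightleg"],
}

def normalize(name):
    # "Left Arm", "left_arm" and "LeftArm" all become "leftarm".
    return re.sub(r"[\s._-]+", "", name.lower())

def map_to_meta_rig(model_bones):
    """Return {meta_slot: model_bone} for every slot whose alias
    matches one of the model's (normalized) bone names."""
    mapping = {}
    normalized = {normalize(b): b for b in model_bones}
    for slot, aliases in META_BONES.items():
        for alias in aliases:
            match = next((orig for norm, orig in normalized.items()
                          if normalize(alias) in norm), None)
            if match:
                mapping[slot] = match
                break
    return mapping

print(map_to_meta_rig(["Pelvis", "Spine1", "Head",
                       "L_UpperArm", "R_UpperArm",
                       "L_Thigh", "R_Thigh"]))
```

Once two models resolve onto the same set of slots, an animation can be transferred by re-pointing each channel at the target model's matched bone, which is what makes the re-targeting automatic.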
Motion Editor
I've started implementing the motion editor:
The bad news is that, so far, I've only implemented the UI elements, so it's not actually functional yet. Fortunately, the functionality is fairly trivial to implement (especially compared to the graph editor); I just haven't found the time for it yet.
Misc
So, unfortunately, I've decided to put the remaining tutorials on hold for the time being. They're very time-consuming to create, and I've received very little feedback on them so far, which tells me there isn't much interest in them. I may come back to them at a later date, but for now I need to put my priorities elsewhere.