
May 2024 Progress Report

Filmmaker

The new features are only available in the preview release for now. If you want to try it, you can download it here, but keep in mind it's not a stable version.

Full-Body IK

Biiiig update on the IK system:

I still have some things in mind for the future, such as adding pole vectors/targets like the ones in Blender/Maya/Unreal, which make it easier to control the bending direction of limbs like arms and legs. For now, though, I'm content with how it works.
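For those who haven't used pole vectors before, here's a rough sketch of the idea (Python/NumPy, purely illustrative and not the actual engine code): in analytic two-bone IK, the pole position selects which side of the root-to-target axis the mid joint (elbow/knee) bends towards.

```python
import numpy as np

def two_bone_ik(root, target, pole, len_upper, len_lower):
    """Analytic two-bone IK: returns the mid joint (elbow/knee) position.

    The pole position decides which side of the root->target axis the limb
    bends towards, which is what makes the bending direction predictable.
    """
    to_target = target - root
    dist = max(np.linalg.norm(to_target), 1e-6)
    axis = to_target / dist
    dist = min(dist, len_upper + len_lower)  # clamp if the target is out of reach

    # Law of cosines: distance of the mid joint along the axis, plus its
    # perpendicular offset from that axis.
    a = (len_upper**2 - len_lower**2 + dist**2) / (2.0 * dist)
    h = np.sqrt(max(len_upper**2 - a**2, 0.0))

    # Bend direction: the part of the pole direction orthogonal to the axis.
    bend = (pole - root) - np.dot(pole - root, axis) * axis
    n = np.linalg.norm(bend)
    bend = bend / n if n > 1e-6 else np.array([0.0, 0.0, 1.0])

    return root + axis * a + bend * h

# Moving the pole to the other side of the limb flips the "elbow" over.
root, target = np.array([0.0, 0.0, 0.0]), np.array([0.6, 0.0, 0.0])
print(two_bone_ik(root, target, pole=np.array([0.3, 1.0, 0.0]), len_upper=0.4, len_lower=0.4))
print(two_bone_ik(root, target, pole=np.array([0.3, -1.0, 0.0]), len_upper=0.4, len_lower=0.4))
```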

Meta Rig System

Part of the new update is also a new "Meta Rig" system, which can automatically detect biped character models and generate an IK rig for them (among other things). This means you don't have to do anything manually to make use of the new IK system: just place your character in the scene and the IK controls should be available immediately. If you find a model where this doesn't work, please let me know; I'll be continuously working on improvements to the system.
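To give a rough idea of how this kind of detection can work (purely illustrative; the actual Meta Rig system may work very differently), one common approach is to match the skeleton's bone names against typical naming conventions and only apply a rig template if all of the core bones are found:

```python
# Hypothetical name-based heuristic for biped detection (illustration only).
REQUIRED_SLOTS = {
    "hips":      ("hips", "pelvis"),
    "head":      ("head",),
    "left_arm":  ("upperarm_l", "left_arm", "l_upperarm", "arm_l", "l_arm"),
    "right_arm": ("upperarm_r", "right_arm", "r_upperarm", "arm_r", "r_arm"),
    "left_leg":  ("thigh_l", "left_leg", "l_thigh", "leg_l", "upleg_l"),
    "right_leg": ("thigh_r", "right_leg", "r_thigh", "leg_r", "upleg_r"),
}

def normalize(name):
    # "Bip01 L Thigh" -> "bip01_l_thigh", "thigh.l" -> "thigh_l", etc.
    return name.lower().replace(" ", "_").replace(".", "_")

def detect_biped(bone_names):
    """Return True if every core biped slot matches at least one bone name."""
    names = [normalize(n) for n in bone_names]
    found = {
        slot
        for slot, aliases in REQUIRED_SLOTS.items()
        if any(alias in name for alias in aliases for name in names)
    }
    return found == set(REQUIRED_SLOTS)

print(detect_biped(["pelvis", "spine_01", "head", "upperarm_l", "upperarm_r",
                    "thigh_l", "thigh_r", "calf_l", "calf_r"]))  # True
```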

Quadruped models can also be detected, but I've yet to set up a template IK rig for them, so full-body IK isn't available for them yet.


Motion Editor

Also a long time coming, I finally got around to implementing the motion editor:

It has some performance issues if a large number of properties are selected, but nothing that can't be improved upon in the future.

Embedded Projects / Scenebuilds

I've added support for embedding a PFM project in other PFM projects. For instance, you can set up a project "A" as a scenebuild, and then create a new project "B" that uses "A" as a base:


It's similar to using a map, but more flexible. You can use multiple embedded sub-projects, change an embedded project between film clips, etc. It also makes it easier to re-use a scenebuild across different projects, or to share it.
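The gist is that project "B" uses "A" as a base instead of containing a copy of its contents. A tiny sketch of the idea (Python, purely illustrative, nothing to do with PFM's actual project format or API):

```python
from dataclasses import dataclass, field

@dataclass
class Project:
    """Toy model of a project that can embed other projects as a base."""
    name: str
    actors: list = field(default_factory=list)
    embedded: list = field(default_factory=list)  # embedded sub-projects

    def all_actors(self):
        # Actors from embedded sub-projects are resolved recursively,
        # so the scenebuild itself only has to be set up once.
        for sub in self.embedded:
            yield from sub.all_actors()
        yield from self.actors

scenebuild = Project("A", actors=["warehouse", "props", "lighting"])
shot = Project("B", actors=["character", "camera"], embedded=[scenebuild])
print(list(shot.all_actors()))
```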


Motion Capture

Some more work on the motion capture module, which can now capture motion directly from a webcam:

It also supports tracking hands and fingers, as well as the lower body and facial expressions. Eye tracking is still a work in progress and a bit jittery.

This will allow you to do simple motion capture with a webcam or your smartphone for any humanoid character model. Unfortunately the tracking isn't good enough for complex movements (like dance moves), but I think it should still come in handy. You can also specify what you want to track, for instance if you only wish to track facial movements (for lip syncing).
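For anyone curious how this kind of webcam tracking generally works: landmark models such as MediaPipe's Holistic solution return per-frame pose, hand and face landmarks, which can then be retargeted onto a character rig. A minimal sketch (Python with the opencv-python and mediapipe packages; this is just a generic illustration, not necessarily what the module uses under the hood):

```python
import cv2
import mediapipe as mp

mp_holistic = mp.solutions.holistic

cap = cv2.VideoCapture(0)  # default webcam
with mp_holistic.Holistic(model_complexity=1, refine_face_landmarks=True) as holistic:
    for _ in range(300):  # capture roughly 10 seconds at 30 fps for this demo
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input, OpenCV delivers BGR.
        results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # 33 normalized body landmarks; a real pipeline would retarget
            # these (plus the hand/face landmarks) onto the character rig.
            nose = results.pose_landmarks.landmark[0]
            print(f"nose: x={nose.x:.2f} y={nose.y:.2f} z={nose.z:.2f}")
cap.release()
```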


Recording

You can now record gameplay as animation data:

While the recording mode is active, all changes to selected actor properties are recorded and converted into animation data. The recorded animation is then automatically smoothed and keyframes are generated (so they can be edited in the graph editor).

The main use case for it is to record animation data from motion capture, but you can also use it to record manual changes, as shown in the video above. In the future I'm also planning to use this to let you record physics interactions as animation.
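To illustrate the general idea behind the smoothing and keyframe generation (a simplified sketch in Python, not the engine's actual code): the raw per-frame samples get smoothed, and a sample only becomes a keyframe if interpolating between the previous keyframe and the next sample can't reproduce it within a tolerance.

```python
import math, random

def smooth(samples, window=5):
    """Simple moving-average smoothing of the raw per-frame samples."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - half):i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def generate_keyframes(times, values, tolerance=0.01):
    """Greedy keyframe reduction: a sample is skipped if interpolating between
    the previous keyframe and the next sample reproduces it within tolerance."""
    keys = [(times[0], values[0])]
    for i in range(1, len(values) - 1):
        t_prev, v_prev = keys[-1]
        t_next, v_next = times[i + 1], values[i + 1]
        alpha = (times[i] - t_prev) / (t_next - t_prev)
        predicted = v_prev + (v_next - v_prev) * alpha
        if abs(values[i] - predicted) > tolerance:
            keys.append((times[i], values[i]))
    keys.append((times[-1], values[-1]))
    return keys

# Example: a noisy recording of a single float property over 60 frames.
times = [i / 30.0 for i in range(60)]
raw = [math.sin(t * 2.0) + random.uniform(-0.05, 0.05) for t in times]
keys = generate_keyframes(times, smooth(raw), tolerance=0.02)
print(f"{len(raw)} samples reduced to {len(keys)} keyframes")
```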


Tutorials

So, unfortunately I've decided to put the remaining tutorials on hold for the time being. They're very time-consuming to create and I have received very little feedback on them so far, which tells me there isn't much interest in them. I may come back to them at a later date, but for now I need to put my priorities elsewhere.