
May 2024 Progress Report

Filmmaker

The new features are only available in the preview release for now. If you want to try it, you can download it here, but keep in mind it's not a stable version.

Reminder: You can manage your Supporter rewards (including your credits name, if you're eligible) on the supporter page.

Full-Body IK

Biiiig update on the IK system:

I still have some things in mind for the future, such as adding pole vectors/targets like the ones in Blender/Maya/Unreal, which make it easier to control the bending direction of limbs like arms and legs. For now, though, I'm content with how it works.
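
To illustrate the idea: in a two-bone IK setup, the pole target simply picks which way the middle joint (the elbow or knee) bends. Below is a rough, self-contained Python/NumPy sketch of that standard technique. It is not PFM's solver, just the textbook version of what a pole vector does.

```python
# Sketch of a standard two-bone IK solve with a pole vector (illustration only,
# not PFM's implementation). The pole position decides the bend direction.
import numpy as np

def solve_two_bone_ik(root, target, pole, len_upper, len_lower):
    """Return the mid-joint (elbow/knee) position for a two-bone chain."""
    to_target = target - root
    dist = np.linalg.norm(to_target)
    # Clamp so the chain neither over-stretches nor fully collapses.
    dist = np.clip(dist, abs(len_upper - len_lower) + 1e-6,
                   len_upper + len_lower - 1e-6)
    n = to_target / np.linalg.norm(to_target)  # direction root -> target

    # Law of cosines: angle between the upper bone and the root->target line.
    cos_a = (len_upper**2 + dist**2 - len_lower**2) / (2.0 * len_upper * dist)
    angle = np.arccos(np.clip(cos_a, -1.0, 1.0))

    # The pole vector picks the bend plane: project the pole direction onto
    # the plane perpendicular to the root->target axis.
    pole_dir = pole - root
    pole_dir = pole_dir - n * np.dot(pole_dir, n)
    u = pole_dir / (np.linalg.norm(pole_dir) + 1e-9)

    # Mid joint lies along the axis, offset in-plane toward the pole.
    return root + n * (len_upper * np.cos(angle)) + u * (len_upper * np.sin(angle))

# Moving the pole flips which way the "knee" bends:
root, target = np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.5])
print(solve_two_bone_ik(root, target, np.array([0.0,  1.0, 0.75]), 1.0, 1.0))
print(solve_two_bone_ik(root, target, np.array([0.0, -1.0, 0.75]), 1.0, 1.0))
```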

Meta Rig System

Part of the new updates is also a new "Meta Rig" system, which can automatically detect biped character models and generate an IK rig for them (among other things). This means you don't have to do anything manually to make use of the new IK system: just place your character in the scene and the IK controls should be available immediately. If you find a model where this doesn't work, please let me know; I'll be continuously working on improvements to the system.

The meta-rig system can also detect quadruped models, but I've yet to set up a template IK rig for them, so full-body IK isn't available for them yet.
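
For anyone curious how automatic biped detection can work in principle: one common approach is to match bone names against known naming conventions. The snippet below is a purely hypothetical Python sketch of that idea; the keyword table and functions are invented for illustration and have nothing to do with PFM's actual heuristics.

```python
# Hypothetical sketch of name-based biped detection: map a model's bone names
# onto a standard set of "meta" bones by keyword matching. Keywords and
# functions here are made up for illustration, not PFM's implementation.
META_BONE_KEYWORDS = {
    "head":      ("head",),
    "spine":     ("spine", "chest", "torso"),
    "hips":      ("hips", "pelvis"),
    "left_arm":  ("upperarm_l", "l_upperarm", "arm.l", "leftarm"),
    "right_arm": ("upperarm_r", "r_upperarm", "arm.r", "rightarm"),
    "left_leg":  ("thigh_l", "l_thigh", "leg.l", "leftupleg"),
    "right_leg": ("thigh_r", "r_thigh", "leg.r", "rightupleg"),
}

def find_bone(bone_names, keywords):
    """Return the first bone whose (lower-cased) name contains any keyword."""
    for name in bone_names:
        lowered = name.lower()
        if any(keyword in lowered for keyword in keywords):
            return name
    return None

def detect_biped(bone_names):
    """Map bones to meta bones; return None if any required bone is missing."""
    mapping = {}
    for meta_bone, keywords in META_BONE_KEYWORDS.items():
        bone = find_bone(bone_names, keywords)
        if bone is None:
            return None  # not recognized as a biped
        mapping[meta_bone] = bone
    return mapping

# Example with a Valve-style skeleton:
skeleton = ["ValveBiped.Bip01_Head1", "ValveBiped.Bip01_Spine2",
            "ValveBiped.Bip01_Pelvis", "ValveBiped.Bip01_L_UpperArm",
            "ValveBiped.Bip01_R_UpperArm", "ValveBiped.Bip01_L_Thigh",
            "ValveBiped.Bip01_R_Thigh"]
print(detect_biped(skeleton))
```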


Motion Editor

Also a long time coming, I finally got around to implementing the motion editor:

It has some performance issues if a large number of properties are selected, but nothing that can't be improved upon in the future.

Embedded Projects / Scenebuilds

I've added support for embedding a PFM project in other PFM projects. For instance, you can set up a project "A" as a scenebuild, and then create a new project "B" that uses "A" as a base:


It's similar to using a map, but more flexible. You can use multiple embedded sub-projects, change an embedded project between film clips, etc. It also makes it easier to re-use a scenebuild across different projects, or to share it.
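
Conceptually, an embedded project is just a project tree: a project references other project files as building blocks, and everything gets resolved on load. Here's a tiny, hypothetical Python sketch of that structure; the class and fields are invented for illustration and are not PFM's actual project format.

```python
# Hypothetical sketch of the embedded-project idea: a project can reference
# other projects as a base, and loading flattens the tree. Not PFM's format.
from dataclasses import dataclass, field

@dataclass
class Project:
    name: str
    actors: list = field(default_factory=list)    # this project's own actors
    embedded: list = field(default_factory=list)  # sub-projects used as a base

    def all_actors(self):
        """Flatten the project tree: own actors plus those of every sub-project."""
        result = list(self.actors)
        for sub in self.embedded:
            result.extend(sub.all_actors())
        return result

# Project "A" is a scenebuild; project "B" builds a shot on top of it.
scenebuild = Project("A", actors=["street", "buildings", "props"])
shot = Project("B", actors=["character_1", "camera_1"], embedded=[scenebuild])
print(shot.all_actors())  # ['character_1', 'camera_1', 'street', 'buildings', 'props']
```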


Motion Capture

Some more work has gone into the motion capture module, which can now capture motion directly from a webcam:

It also supports tracking hands and fingers, as well as the lower body and facial expressions. Eye tracking is still a work in progress and remains a bit jittery.

This will allow you to do simple motion capture for any humanoid character model using a webcam or your smartphone. Unfortunately the tracking isn't good enough for complex movements (like dance moves), but I think it should still come in handy. You can also specify what you want to track, for instance if you only wish to track facial movements (for lip-syncing).
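
For those wondering what webcam tracking like this involves under the hood: libraries such as MediaPipe can extract body, hand, and face landmarks from a video stream. The sketch below shows the general technique with MediaPipe and OpenCV in Python. It is not PFM's mocap module, and it stops at raw landmarks, so retargeting them onto a character rig isn't covered.

```python
# Minimal sketch of webcam landmark tracking with MediaPipe + OpenCV.
# Illustration of the general technique only, not PFM's mocap module.
# Requires: pip install mediapipe opencv-python
import cv2
import mediapipe as mp

mp_holistic = mp.solutions.holistic  # body + hands + face in one model

cap = cv2.VideoCapture(0)  # default webcam
with mp_holistic.Holistic(min_detection_confidence=0.5,
                          min_tracking_confidence=0.5) as holistic:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV delivers BGR.
        results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # 33 body landmarks with normalized x/y and a relative z; index 0 is the nose.
            nose = results.pose_landmarks.landmark[0]
            print(f"nose: x={nose.x:.2f} y={nose.y:.2f} z={nose.z:.2f}")
        cv2.imshow("webcam", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```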


Recording

You can now record gameplay as animation data:

While the recording mode is active, all changes to selected actor properties are recorded as an animation. The recorded animation data is then automatically smoothed and keyframes are generated (so they can be edited in the graph editor).

The main use-case for it is to record animation data from motion capture, but you can also use it to record manual changes, as shown in the video above. In the future I'm also planning to use this to let you record physics interactions as animation.
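
As a rough illustration of the recording pipeline described above (sample, smooth, bake keyframes), here's a small Python sketch. The smoothing and keyframe-reduction heuristics are simplified stand-ins chosen for the example, not PFM's actual implementation.

```python
# Sketch of the general recording idea: sample a property every frame, smooth
# the samples, then keep only the samples that matter as keyframes.
# Conceptual illustration only; PFM's smoothing/keyframing code is not shown.
def smooth(samples, window=5):
    """Simple moving average over (time, value) samples."""
    half = window // 2
    out = []
    for i, (t, _) in enumerate(samples):
        chunk = samples[max(0, i - half):i + half + 1]
        out.append((t, sum(v for _, v in chunk) / len(chunk)))
    return out

def reduce_to_keyframes(samples, tolerance=0.01):
    """Keep a sample only if interpolating past it would miss its value."""
    keys = [samples[0]]
    for i in range(1, len(samples) - 1):
        t0, v0 = keys[-1]        # last kept keyframe
        t1, v1 = samples[i]      # candidate sample
        t2, v2 = samples[i + 1]  # next sample
        # Value at t1 predicted by interpolating from the last keyframe to the next sample.
        predicted = v0 + (v2 - v0) * (t1 - t0) / (t2 - t0)
        if abs(v1 - predicted) > tolerance:
            keys.append((t1, v1))
    keys.append(samples[-1])
    return keys

# Record a noisy "position" property for 60 frames at 30 fps, then bake keyframes.
import math, random
recorded = [(f / 30.0, math.sin(f / 10.0) + random.uniform(-0.02, 0.02)) for f in range(60)]
keyframes = reduce_to_keyframes(smooth(recorded))
print(f"{len(recorded)} samples -> {len(keyframes)} keyframes")
```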


Future Plans

I'm currently working on finishing up a new Prelewd release, hopefully ready sometime next week (there will be another post at that time). I've mentioned this a while back, but I will be focusing a lot more on Prelewd going forward to try and get more supporters on board to finance the project. For PFM I will primarily be focusing on bug fixes and quality-of-life improvements. I'm not sure if I can do any more interactive tutorials, as they're very time-consuming to create (and I haven't received much feedback on them), so I'll have to put those on hold for now. Depending on how things go, I might come back to them at a later date.