diff --git a/README.md b/README.md index 153b8b5..a60f3f0 100644 --- a/README.md +++ b/README.md @@ -1,9 +1,19 @@ -[![Blender](https://img.shields.io/badge/Blender->=4.0-blue?logo=blender&logoColor=white)](https://www.blender.org/download/ "Download Blender") +[![Blender](https://img.shields.io/badge/Blender->=2.9-blue?logo=blender&logoColor=white)](https://www.blender.org/download/ "Download Blender") [![GitHub release](https://img.shields.io/github/release/DarklightGames/io_scene_psk_psa?include_prereleases=&sort=semver&color=blue)](https://github.com/DarklightGames/io_scene_psk_psa/releases/) [![ko-fi](https://ko-fi.com/img/githubbutton_sm.svg)](https://ko-fi.com/L4L3853VR) -This Blender add-on allows you to import and export meshes and animations to and from the [PSK and PSA file formats](https://wiki.beyondunreal.com/PSK_%26_PSA_file_formats) used in many versions of the Unreal Engine. +This Blender addon allows you to import and export meshes and animations to and from the [PSK and PSA file formats](https://wiki.beyondunreal.com/PSK_%26_PSA_file_formats) used in many versions of the Unreal Engine. + +## Compatibility + +| Blender Version | Addon Version | Long Term Support | +|--------------------------------------------------------------|--------------------------------------------------------------------------------|-------------------| +| 4.0+ | [latest](https://github.com/DarklightGames/io_scene_psk_psa/releases/latest) | TBD | +| [3.4 - 3.6](https://www.blender.org/download/lts/3-6/) | [5.0.5](https://github.com/DarklightGames/io_scene_psk_psa/releases/tag/5.0.5) | ✅️ June 2025 | +| [2.93 - 3.3](https://www.blender.org/download/releases/3-3/) | [4.3.0](https://github.com/DarklightGames/io_scene_psk_psa/releases/tag/4.3.0) | ✅️ September 2024 | + +Bug fixes will be issued for legacy addon versions that are under [Blender's LTS maintenance period](https://www.blender.org/download/lts/). 
Once the LTS period has ended, legacy addon versions will no longer be supported by the maintainers of this repository, although we will accept pull requests for bug fixes. # Features * Full PSK/PSA import and export capabilities. @@ -14,16 +24,6 @@ This Blender add-on allows you to import and export meshes and animations to and * PSA sequences can be exported directly from actions or delineated using a scene's [timeline markers](https://docs.blender.org/manual/en/latest/animation/markers.html) or NLA track strips, allowing direct use of the [NLA](https://docs.blender.org/manual/en/latest/editors/nla/index.html) when creating sequences. * Manual re-ordering of material slots when exporting multiple mesh objects. -## Compatibility - -| Blender Version | Addon Version | LTS Support | -|--------------------------------------------------------------|--------------------------------------------------------------------------------|-------------------| -| 4.0+ | [latest](https://github.com/DarklightGames/io_scene_psk_psa/releases/latest) | TBD | -| [3.4 - 3.6](https://www.blender.org/download/lts/3-6/) | [5.0.5](https://github.com/DarklightGames/io_scene_psk_psa/releases/tag/5.0.5) | ✅️ June 2025 | -| [2.93 - 3.3](https://www.blender.org/download/releases/3-3/) | [4.3.0](https://github.com/DarklightGames/io_scene_psk_psa/releases/tag/4.3.0) | ✅️ September 2024 | - -Bug fixes will be issued for legacy addon versions that are under [Blender's LTS maintenance period](https://www.blender.org/download/lts/). Once the LTS period has ended, legacy addon-on versions will no longer be supported by the maintainers of this repository, although we will accept pull requests for bug fixes. - # Installation 1. Download the zip file for the latest version from the [releases](https://github.com/DarklightGames/io_export_psk_psa/releases) page. 2. Open Blender 4.0.0 or later. @@ -54,7 +54,15 @@ Bug fixes will be issued for legacy addon versions that are under [Blender's LTS 3. 
Select the PSA file you want to import. 4. Select the sequences that you want to import and click `Import`. +> Note that in order to see the imported actions applied to your armature, you must use the [Dope Sheet](https://docs.blender.org/manual/en/latest/editors/dope_sheet/introduction.html) or [Nonlinear Animation](https://docs.blender.org/manual/en/latest/editors/nla/introduction.html) editors. + # FAQ + +## Why can't I see the animations imported from my PSA? +Simply importing an animation into the scene will not automatically apply the action to the armature. This is in part because a PSA can contain multiple sequences, and in part because it's generally bad form for importers to modify the scene when they don't need to. + +The PSA importer creates [Actions](https://docs.blender.org/manual/en/latest/animation/actions.html) for each of the selected sequences in the PSA. These actions can be applied to your armature via the [Action Editor](https://docs.blender.org/manual/en/latest/editors/dope_sheet/action.html) or [NLA Editor](https://docs.blender.org/manual/en/latest/editors/nla/index.html). + ## Why are the mesh normals not accurate when importing a PSK extracted from [UE Viewer](https://www.gildor.org/en/projects/umodel)? If preserving the mesh normals of models is important for your workflow, it is *not recommended* to export PSK files from UE Viewer. This is because UE Viewer makes no attempt to reconstruct the original [smoothing groups](https://en.wikipedia.org/wiki/Smoothing_group). As a result, the normals of imported PSK files will be incorrect when imported into Blender and will need to be manually fixed.
diff --git a/io_scene_psk_psa/__init__.py b/io_scene_psk_psa/__init__.py index f9d5968..aa72ddc 100644 --- a/io_scene_psk_psa/__init__.py +++ b/io_scene_psk_psa/__init__.py @@ -3,7 +3,7 @@ from bpy.app.handlers import persistent bl_info = { "name": "PSK/PSA Importer/Exporter", "author": "Colin Basnett, Yurii Ti", - "version": (6, 0, 0), + "version": (6, 1, 1), "blender": (4, 0, 0), "description": "PSK/PSA Import/Export (.psk/.psa)", "warning": "", @@ -30,6 +30,7 @@ if 'bpy' in locals(): importlib.reload(psk_import_operators) importlib.reload(psa_data) + importlib.reload(psa_config) importlib.reload(psa_reader) importlib.reload(psa_writer) importlib.reload(psa_builder) @@ -55,6 +56,7 @@ else: from .psk.import_ import operators as psk_import_operators from .psa import data as psa_data + from .psa import config as psa_config from .psa import reader as psa_reader from .psa import writer as psa_writer from .psa import builder as psa_builder diff --git a/io_scene_psk_psa/psa/config.py b/io_scene_psk_psa/psa/config.py new file mode 100644 index 0000000..184d59c --- /dev/null +++ b/io_scene_psk_psa/psa/config.py @@ -0,0 +1,78 @@ +import re +from configparser import ConfigParser +from typing import Dict + +from .reader import PsaReader + +REMOVE_TRACK_LOCATION = (1 << 0) +REMOVE_TRACK_ROTATION = (1 << 1) + + +class PsaConfig: + def __init__(self): + self.sequence_bone_flags: Dict[str, Dict[int, int]] = dict() + + +def _load_config_file(file_path: str) -> ConfigParser: + """ + UEViewer exports a dialect of INI files that is not compatible with Python's ConfigParser. + Specifically, it allows values in this format: + + [Section] + Key1 + Key2 + + This is not allowed in Python's ConfigParser, which requires a '=' character after each key name. + To work around this, we'll modify the file to add the '=' character after each key name if it is missing. 
+ """ + with open(file_path, 'r') as f: + lines = f.read().split('\n') + + lines = [re.sub(r'^\s*(\w+)\s*$', r'\1=', line) for line in lines] + + contents = '\n'.join(lines) + + config = ConfigParser() + config.read_string(contents) + + return config + + +def _get_bone_flags_from_value(value: str) -> int: + match value: + case 'all': + return (REMOVE_TRACK_LOCATION | REMOVE_TRACK_ROTATION) + case 'trans': + return REMOVE_TRACK_LOCATION + case 'rot': + return REMOVE_TRACK_ROTATION + case _: + return 0 + + +def read_psa_config(psa_reader: PsaReader, file_path: str) -> PsaConfig: + psa_config = PsaConfig() + + config = _load_config_file(file_path) + + if config.has_section('RemoveTracks'): + for key, value in config.items('RemoveTracks'): + match = re.match(r'^(.+)\.(\d+)$', key) + sequence_name = match.group(1) + + # Map the sequence name onto the actual sequence name in the PSA file. + try: + psa_sequence_names = list(psa_reader.sequences.keys()) + lowercase_sequence_names = [name.lower() for name in psa_sequence_names] + sequence_name = psa_sequence_names[lowercase_sequence_names.index(sequence_name.lower())] + except ValueError: + # Sequence name is not in the PSA file. + continue + + if sequence_name not in psa_config.sequence_bone_flags: + psa_config.sequence_bone_flags[sequence_name] = dict() + + bone_index = int(match.group(2)) + psa_config.sequence_bone_flags[sequence_name][bone_index] = _get_bone_flags_from_value(value) + + return psa_config diff --git a/io_scene_psk_psa/psa/export/properties.py b/io_scene_psk_psa/psa/export/properties.py index 518adf9..59cd755 100644 --- a/io_scene_psk_psa/psa/export/properties.py +++ b/io_scene_psk_psa/psa/export/properties.py @@ -152,7 +152,7 @@ class PSA_PG_export(PropertyGroup): default=False, name='Enforce Bone Name Restrictions', description='Bone names restrictions will be enforced. 
Note that bone names without properly formatted names ' - 'cannot be referenced in scripts' + 'may not be able to be referenced in-engine' ) sequence_name_prefix: StringProperty(name='Prefix', options=empty_set) sequence_name_suffix: StringProperty(name='Suffix', options=empty_set) diff --git a/io_scene_psk_psa/psa/import_/operators.py b/io_scene_psk_psa/psa/import_/operators.py index 7aff586..a5df890 100644 --- a/io_scene_psk_psa/psa/import_/operators.py +++ b/io_scene_psk_psa/psa/import_/operators.py @@ -1,10 +1,12 @@ import os +from pathlib import Path from bpy.props import StringProperty from bpy.types import Operator, Event, Context from bpy_extras.io_utils import ImportHelper from .properties import get_visible_sequences +from ..config import read_psa_config from ..importer import import_psa, PsaImportOptions from ..reader import PsaReader @@ -156,6 +158,10 @@ class PSA_OT_import(Operator, ImportHelper): psa_reader = PsaReader(self.filepath) sequence_names = [x.action_name for x in pg.sequence_list if x.is_selected] + if len(sequence_names) == 0: + self.report({'ERROR_INVALID_CONTEXT'}, 'No sequences selected') + return {'CANCELLED'} + options = PsaImportOptions() options.sequence_names = sequence_names options.should_use_fake_user = pg.should_use_fake_user @@ -170,9 +176,14 @@ class PSA_OT_import(Operator, ImportHelper): options.fps_source = pg.fps_source options.fps_custom = pg.fps_custom - if len(sequence_names) == 0: - self.report({'ERROR_INVALID_CONTEXT'}, 'No sequences selected') - return {'CANCELLED'} + if options.should_use_config_file: + # Read the PSA config file if it exists. 
+ config_path = Path(self.filepath).with_suffix('.config') + if config_path.exists(): + try: + options.psa_config = read_psa_config(psa_reader, str(config_path)) + except Exception as e: + self.report({'WARNING'}, f'Failed to read PSA config file: {e}') result = import_psa(context, psa_reader, context.view_layer.objects.active, options) @@ -254,6 +265,8 @@ class PSA_OT_import(Operator, ImportHelper): col.use_property_decorate = False col.prop(pg, 'should_use_fake_user') col.prop(pg, 'should_stash') + col.prop(pg, 'should_use_config_file') + col.prop(pg, 'should_use_action_name_prefix') if pg.should_use_action_name_prefix: diff --git a/io_scene_psk_psa/psa/import_/properties.py b/io_scene_psk_psa/psa/import_/properties.py index fcf86d1..b571c60 100644 --- a/io_scene_psk_psa/psa/import_/properties.py +++ b/io_scene_psk_psa/psa/import_/properties.py @@ -32,6 +32,12 @@ class PSA_PG_import(PropertyGroup): description='Assign each imported action a fake user so that the data block is ' 'saved even it has no users', options=empty_set) + should_use_config_file: BoolProperty(default=True, name='Use Config File', + description='Use the .config file that is sometimes generated when the PSA ' + 'file is exported from UEViewer. 
This file contains ' + 'options that can be used to filter out certain bone tracks ' + 'from the imported actions', + options=empty_set) should_stash: BoolProperty(default=False, name='Stash', description='Stash each imported action as a strip on a new non-contributing NLA track', options=empty_set) diff --git a/io_scene_psk_psa/psa/importer.py b/io_scene_psk_psa/psa/importer.py index 30d8154..094441c 100644 --- a/io_scene_psk_psa/psa/importer.py +++ b/io_scene_psk_psa/psa/importer.py @@ -6,6 +6,7 @@ import numpy from bpy.types import FCurve, Object, Context from mathutils import Vector, Quaternion +from .config import PsaConfig, REMOVE_TRACK_LOCATION, REMOVE_TRACK_ROTATION from .data import Psa from .reader import PsaReader @@ -24,6 +25,8 @@ class PsaImportOptions(object): self.bone_mapping_mode = 'CASE_INSENSITIVE' self.fps_source = 'SEQUENCE' self.fps_custom: float = 30.0 + self.should_use_config_file = True + self.psa_config: PsaConfig = PsaConfig() class ImportBone(object): @@ -164,6 +167,11 @@ def import_psa(context: Context, psa_reader: PsaReader, armature_object: Object, sequence_name = sequence.name.decode('windows-1252') action_name = options.action_name_prefix + sequence_name + # Get the bone track flags for this sequence, or an empty dictionary if none exist. + sequence_bone_track_flags = dict() + if sequence_name in options.psa_config.sequence_bone_flags: + sequence_bone_track_flags = options.psa_config.sequence_bone_flags[sequence_name] + if options.should_overwrite and action_name in bpy.data.actions: action = bpy.data.actions[action_name] else: @@ -188,18 +196,21 @@ def import_psa(context: Context, psa_reader: PsaReader, armature_object: Object, # Create f-curves for the rotation and location of each bone. 
for psa_bone_index, armature_bone_index in psa_to_armature_bone_indices.items(): + bone_track_flags = sequence_bone_track_flags.get(psa_bone_index, 0) import_bone = import_bones[psa_bone_index] pose_bone = import_bone.pose_bone rotation_data_path = pose_bone.path_from_id('rotation_quaternion') location_data_path = pose_bone.path_from_id('location') + add_rotation_fcurves = (bone_track_flags & REMOVE_TRACK_ROTATION) == 0 + add_location_fcurves = (bone_track_flags & REMOVE_TRACK_LOCATION) == 0 import_bone.fcurves = [ - action.fcurves.new(rotation_data_path, index=0, action_group=pose_bone.name), # Qw - action.fcurves.new(rotation_data_path, index=1, action_group=pose_bone.name), # Qx - action.fcurves.new(rotation_data_path, index=2, action_group=pose_bone.name), # Qy - action.fcurves.new(rotation_data_path, index=3, action_group=pose_bone.name), # Qz - action.fcurves.new(location_data_path, index=0, action_group=pose_bone.name), # Lx - action.fcurves.new(location_data_path, index=1, action_group=pose_bone.name), # Ly - action.fcurves.new(location_data_path, index=2, action_group=pose_bone.name), # Lz + action.fcurves.new(rotation_data_path, index=0, action_group=pose_bone.name) if add_rotation_fcurves else None, # Qw + action.fcurves.new(rotation_data_path, index=1, action_group=pose_bone.name) if add_rotation_fcurves else None, # Qx + action.fcurves.new(rotation_data_path, index=2, action_group=pose_bone.name) if add_rotation_fcurves else None, # Qy + action.fcurves.new(rotation_data_path, index=3, action_group=pose_bone.name) if add_rotation_fcurves else None, # Qz + action.fcurves.new(location_data_path, index=0, action_group=pose_bone.name) if add_location_fcurves else None, # Lx + action.fcurves.new(location_data_path, index=1, action_group=pose_bone.name) if add_location_fcurves else None, # Ly + action.fcurves.new(location_data_path, index=2, action_group=pose_bone.name) if add_location_fcurves else None, # Lz ] if options.should_write_scale_keys: @@ -225,11 
+236,15 @@ def import_psa(context: Context, psa_reader: PsaReader, armature_object: Object, # Write the keyframes out. fcurve_data = numpy.zeros(2 * sequence.frame_count, dtype=float) + + # Populate the keyframe time data. fcurve_data[0::2] = [x * keyframe_time_dilation for x in range(sequence.frame_count)] for bone_index, import_bone in enumerate(import_bones): if import_bone is None: continue for fcurve_index, fcurve in enumerate(import_bone.fcurves): + if fcurve is None: + continue fcurve_data[1::2] = sequence_data_matrix[:, bone_index, fcurve_index] fcurve.keyframe_points.add(sequence.frame_count) fcurve.keyframe_points.foreach_set('co', fcurve_data) diff --git a/io_scene_psk_psa/psk/builder.py b/io_scene_psk_psa/psk/builder.py index 80b3fe4..623ae66 100644 --- a/io_scene_psk_psa/psk/builder.py +++ b/io_scene_psk_psa/psk/builder.py @@ -146,7 +146,9 @@ def build_psk(context, options: PskBuildOptions) -> PskBuildResult: psk_material.texture_index = len(psk.materials) psk.materials.append(psk_material) - for input_mesh_object in input_objects.mesh_objects: + context.window_manager.progress_begin(0, len(input_objects.mesh_objects)) + + for object_index, input_mesh_object in enumerate(input_objects.mesh_objects): # MATERIALS material_indices = [material_names.index(material_slot.material.name) for material_slot in input_mesh_object.material_slots] @@ -288,6 +290,10 @@ def build_psk(context, options: PskBuildOptions) -> PskBuildResult: bpy.data.meshes.remove(mesh_data) del mesh_data + context.window_manager.progress_update(object_index) + + context.window_manager.progress_end() + result.psk = psk return result diff --git a/io_scene_psk_psa/psk/import_/operators.py b/io_scene_psk_psa/psk/import_/operators.py index 1cabb22..e90b44c 100644 --- a/io_scene_psk_psa/psk/import_/operators.py +++ b/io_scene_psk_psa/psk/import_/operators.py @@ -113,7 +113,7 @@ class PSK_OT_import(Operator, ImportHelper): message += '\n'.join(result.warnings) self.report({'WARNING'}, message) 
else: - self.report({'INFO'}, f'PSK imported') + self.report({'INFO'}, f'PSK imported ({options.name})') return {'FINISHED'} diff --git a/io_scene_psk_psa/psk/importer.py b/io_scene_psk_psa/psk/importer.py index 809fbad..c701db9 100644 --- a/io_scene_psk_psa/psk/importer.py +++ b/io_scene_psk_psa/psk/importer.py @@ -222,12 +222,13 @@ def import_psk(psk: Psk, context, options: PskImportOptions) -> PskImportResult: # VERTEX NORMALS if psk.has_vertex_normals and options.should_import_vertex_normals: - mesh_data.polygons.foreach_set("use_smooth", [True] * len(mesh_data.polygons)) + mesh_data.polygons.foreach_set('use_smooth', [True] * len(mesh_data.polygons)) normals = [] for vertex_normal in psk.vertex_normals: normals.append(tuple(vertex_normal)) mesh_data.normals_split_custom_set_from_vertices(normals) - mesh_data.use_auto_smooth = True + else: + mesh_data.shade_smooth() bm.normal_update() bm.free()
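For readers skimming the patch, the `.config` handling introduced in `io_scene_psk_psa/psa/config.py` can be illustrated standalone. The sketch below uses hypothetical helper names (`normalize_ueviewer_ini`, `parse_remove_tracks`) that are not part of the addon; it shows the two steps the new module performs: normalizing UEViewer's bare-key INI dialect so `ConfigParser` accepts it, then decoding `[RemoveTracks]` entries of the form `<sequence>.<bone index>=all|trans|rot` into the same bit flags the importer consumes.

```python
import re
from configparser import ConfigParser

# Bit flags mirroring config.py: which tracks to strip from a bone in a sequence.
REMOVE_TRACK_LOCATION = 1 << 0
REMOVE_TRACK_ROTATION = 1 << 1

_FLAGS_BY_VALUE = {
    'all': REMOVE_TRACK_LOCATION | REMOVE_TRACK_ROTATION,
    'trans': REMOVE_TRACK_LOCATION,
    'rot': REMOVE_TRACK_ROTATION,
}


def normalize_ueviewer_ini(text: str) -> str:
    # UEViewer writes bare keys with no '='; append one so ConfigParser accepts them.
    return '\n'.join(re.sub(r'^\s*(\w+)\s*$', r'\1=', line) for line in text.split('\n'))


def parse_remove_tracks(text: str) -> dict:
    # Returns {sequence_name: {bone_index: flags}}. ConfigParser lowercases option
    # names, so sequence names come out lowercase here; the addon later matches them
    # against the PSA's sequence names case-insensitively for the same reason.
    config = ConfigParser()
    config.read_string(normalize_ueviewer_ini(text))
    result: dict = {}
    if config.has_section('RemoveTracks'):
        for key, value in config.items('RemoveTracks'):
            match = re.match(r'^(.+)\.(\d+)$', key)  # keys look like 'Run.5'
            if match is None:
                continue
            sequence_name, bone_index = match.group(1), int(match.group(2))
            result.setdefault(sequence_name, {})[bone_index] = _FLAGS_BY_VALUE.get(value, 0)
    return result
```

A flag value of 0 (unknown keyword) leaves both tracks intact, matching the `case _` fallback in `_get_bone_flags_from_value`.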