Models Inconsistency #566 (Open)
Labels: code readability, documentation (Improvements or additions to documentation), enhancement (New feature or request), refactoring (Code Refactoring)
Description
Tiatoolbox provides several pre-trained models that are helpful for data processing. However, the models differ in how they handle input and output, which makes them confusing to use (especially when customizing them):
- Input normalization: some models apply it inside the `forward` method (e.g. `CNNModel`), while sometimes `forward` returns a raw layer output and the transformation is applied in `infer_batch` (e.g. `UNetModel`).
- Activation function: `HoVerNet` applies it in `forward`, `UNetModel` in `_transform`, `MicroNet` in `preproc`, and vanilla models rely on the user to do so.
- Preprocessing: some models use the `preproc_func`/`_preproc` functions, while `UNetModel` uses its own `_transform`, unrelated to the standard methods. Yet its behavior could be implemented in `_preproc`.

What to do
Refactoring the code will significantly improve readability:
- Standardize `ModelABC`: one method for normalization, the activation function as an attribute, etc.
- Clarify the contract of the `ModelABC` methods in their documentation: does `infer_batch` rely on `postproc_func`? Can `infer_batch` be used for training? How?
- Align the existing models with the `ModelABC` structure.
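To make the proposal concrete, here is a minimal, dependency-free sketch of what a standardized base class could look like. All names below (`normalize`, `activation`, `SimpleModel`) are illustrative assumptions for this issue, not the actual tiatoolbox API, and the real implementation would build on `torch.nn.Module`:

```python
from abc import ABC, abstractmethod


class ModelABC(ABC):
    """Hypothetical sketch: one shared normalization method, activation as an attribute."""

    # Activation is an attribute with an identity default; subclasses override it.
    activation = staticmethod(lambda x: x)

    def normalize(self, batch):
        """The single, documented place for input normalization (here: scale to [0, 1])."""
        return [value / 255.0 for value in batch]

    @abstractmethod
    def forward(self, batch):
        """Return the activated model output; raw layer outputs stay internal."""

    def infer_batch(self, batch):
        """Inference entry point: normalize, then forward. Not intended for training."""
        return self.forward(self.normalize(batch))


class SimpleModel(ModelABC):
    # Activation declared as an attribute, as the issue proposes (ReLU-like here).
    activation = staticmethod(lambda x: max(0.0, x))

    def forward(self, batch):
        # Toy "layer": shift each value, then apply the declared activation.
        return [self.activation(value - 0.5) for value in batch]


model = SimpleModel()
print(model.infer_batch([255, 0]))  # → [0.5, 0.0]
```

With such a contract, every model would normalize and activate in the same place, and the `infer_batch` documentation could state explicitly that it performs no training-specific logic.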